Posts Tagged ‘data’

Have you ever wanted to change the world through digital technologies? Over 100 participants had this in mind when they descended upon Imperial College London last weekend to take part in the UK’s first Urban Prototyping (UP London) Hackathon.

Multi-disciplinary teams of developers, programmers, technicians and designers competed for a chance to win over £100,000 worth of awards, including up to £80,000 in follow-on funding. Teams were challenged to create a technology-based prototype that would result in real-world changes to the environment, the local economy or the local community.

But what is a Hackathon?

Simply put, a Hackathon allows teams of hackers to 'hack' large data sets (such as weather, transport or traffic data) over a short period of time, in this case one weekend. It's the job of these teams to unravel and translate this data into a usable application that engages citizens.

A good example of this came at the Urban Prototyping Singapore Hackathon in 2012, where one team of hackers designed an algorithm that interpreted live car park traffic data. The team created 'SurePark', a mobile application allowing users to reserve parking spots in the city centre (the same way you would book seats in a cinema). The application's predictive modelling allows users to book the next available slot and can also predict pending rush-hour traffic in specific lots within a 24-hour period.
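
For readers curious what that kind of predictive booking might look like under the hood, here is a minimal sketch in Python. It is purely illustrative: the class, data layout and numbers are invented for this post and are not SurePark's actual implementation.

```python
# Illustrative sketch only: not SurePark's actual code or data model.
from collections import defaultdict
from statistics import mean

class ParkingForecaster:
    """Predicts hourly occupancy for each car park from historical counts."""

    def __init__(self, capacity):
        self.capacity = capacity              # e.g. {"lot_a": 120, "lot_b": 80}
        self.history = defaultdict(list)      # (lot, hour) -> observed occupancy counts
        self.reserved = defaultdict(int)      # (lot, hour) -> confirmed bookings

    def record(self, lot, hour, occupied):
        """Feed in a live occupancy reading (hour of day, 0-23)."""
        self.history[(lot, hour)].append(occupied)

    def predict(self, lot, hour):
        """Naive forecast: average occupancy seen at this hour of day."""
        readings = self.history.get((lot, hour))
        return mean(readings) if readings else 0

    def reserve(self, lot, hour):
        """Book a space if the forecast says one should be free, cinema-style."""
        expected_free = self.capacity[lot] - self.predict(lot, hour) - self.reserved[(lot, hour)]
        if expected_free >= 1:
            self.reserved[(lot, hour)] += 1
            return True
        return False

forecaster = ParkingForecaster({"city_centre": 100})
for occupied in (60, 72, 68):                 # three Mondays of 9am readings
    forecaster.record("city_centre", 9, occupied)
print(forecaster.reserve("city_centre", 9))   # True: roughly 33 spaces expected free
```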

Imperial's Digital City Exchange research programme played a vital role in the UP London Hackathon, primarily by deploying WikiSensing, the sensor data management platform developed by the Discovery Sciences Group and DCE.

This was the first time WikiSensing had been demonstrated outside the College, and it gave Hackathon teams a platform through which to query and retrieve the massive datasets being hacked. Orestis Tsinalis, Digital City Exchange Research Assistant from Imperial College London's Department of Computing, said: "It was a great experience opening up WikiSensing to the world at the UP London festival. We had valuable interactions with the participants in the Hackathon and the Crackathon events, and got a good grasp about the kinds of applications that can be built on top of our platform."
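
To give a flavour of what "building on top of the platform" means in practice, here is a rough sketch of a client querying a sensor-data service over HTTP. The endpoint, parameters and response fields are assumptions made for illustration; they are not WikiSensing's documented API.

```python
# Hypothetical client sketch: the endpoint, parameters and response shape are
# assumptions for illustration, not WikiSensing's documented API.
import json
from urllib.request import urlopen
from urllib.parse import urlencode

BASE_URL = "https://wikisensing.example.org/api/sensors"   # placeholder URL

def fetch_readings(sensor_id, since, limit=100):
    """Retrieve recent readings for one sensor as a list of dicts."""
    query = urlencode({"since": since, "limit": limit})
    with urlopen(f"{BASE_URL}/{sensor_id}/readings?{query}") as resp:
        return json.load(resp)

def average_value(readings, field="temperature"):
    """Simple aggregation a hack-day app might run on top of the platform."""
    values = [r[field] for r in readings if field in r]
    return sum(values) / len(values) if values else None

# readings = fetch_readings("air-quality-007", since="2013-04-20T00:00:00Z")
# print(average_value(readings, field="no2"))
```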

With prizes including follow-on funding as well as an all-expenses-paid trip to Shanghai to present their concepts at the Smart City Forum of CIOs at Mobile Expo Asia in June, competition was fierce, with teams diligently working well into the early hours.

Above all, it's important for teams to set realistic goals. From experience, a large number of competitors tend to get bogged down in finalising their prototypes. In most cases teams will be lucky to produce something that actually works, rather than a finished product ready to be marketed to the masses. The point is to aim high but ensure that prototype v0.1 can be developed within a 48-hour window.

After a quick battery recharge teams reconvened Sunday morning to begin shaping their ideas into a working prototype. Ideas developed and pitched at the Hackathon ranged from crowd-sourced crisis maps to apps for easing landlord-tenant discussions. David Birch, Digital City Exchange Research Associate said: “There was a surprising breadth of proposals from concept pitches to the live demos. Challenges addressed ranged from lifesaving fire fighting sensors to helping tenants know the right thing to do when a pipe bursts in their flat, each aiming to improve the resilience of modern life to unexpected events using technology.”

After 48 hours of hacking, gaming and designing, and with over 800 cups of tea and coffee consumed, the judging panel were presented with prototypes ranging from "Project Glass", a voice-based payment mechanism using face recognition and geolocation, to "Suppa Power", a smart power-monitoring tool that learns power habits and empowers consumers to act on their smart meter data.

The judges, who included representatives from Digital Shoreditch, CIKTN, Tech City, TSB, RCUK and the GSMA, were clearly impressed with the quality of the prototypes and, with such a broad range of designs, found it difficult to pick a winner. Judge Kam Star from Digital Shoreditch said: "the true success of these events are measured by the outcomes, we are delighted that so many amazing projects and ideas surfaced at the event and will be followed up."

One team lucky enough to secure £6,000 of follow-on funding designed a sensor, small enough to fit inside a firefighter's helmet, that monitors and warns firefighters of temperature surges. Sharp changes in temperature, such as when a fire finds a fresh supply of oxygen, can be deadly, and an early warning system could prove crucial to saving lives.

Ross Atkin from the winning team said “Given that firefighters already had audio alarms and that you don’t want to be confusing them visually, we needed a signal that would get through even if they were really stressed,” – “The positioning of it [the alarm] on the back of the neck was because we needed somewhere where the device could be exposed to ambient temperatures, but we had a reasonable route to a relatively sensitive part of the body.”
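
The team's design itself isn't published here, but the underlying idea, alarming on the rate of temperature change rather than on an absolute threshold, can be sketched in a few lines. The sampling rate and thresholds below are invented for illustration only.

```python
# Illustrative only: thresholds and sampling rate are invented, not the team's design.
from collections import deque

class SurgeAlarm:
    """Raises an alert when temperature rises faster than a safe rate."""

    def __init__(self, window_s=5, max_rise_per_s=8.0, sample_hz=1):
        self.samples = deque(maxlen=window_s * sample_hz)
        self.max_rise_per_s = max_rise_per_s
        self.sample_hz = sample_hz

    def update(self, temperature_c):
        """Feed one reading; return True if a dangerous surge is detected."""
        self.samples.append(temperature_c)
        if len(self.samples) < 2:
            return False
        elapsed = (len(self.samples) - 1) / self.sample_hz
        rate = (self.samples[-1] - self.samples[0]) / elapsed
        return rate > self.max_rise_per_s

alarm = SurgeAlarm()
for reading in (60, 62, 64, 95, 140):     # flashover-like jump in the later readings
    if alarm.update(reading):
        print("SURGE: trigger alert on the neck-mounted pad")
```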

The Hackathon was followed by UP London's Crackathon, held on Monday 22 April 2013. The Crackathon explored how secure, and how resilient to attack, urban digital technologies are. The main aim was to understand these issues and identify the research challenges needed to generate trust in, and long-term sustainability of, ICT within urban areas.

Led by Dr Zeynep Gurguc, Digital City Exchange Research Associate, the Crackathon challenged attendees to defend databases against malicious hacks. The three hacks were set as timed, prepared and full-event challenges, and tested participants' ability to create defences that were 100% impenetrable.
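
The specific attacks set at the Crackathon aren't described in detail here, but one canonical database defence that teams typically reach for first is the parameterised query, which neutralises SQL injection. A minimal, generic sketch:

```python
# Generic defence sketch; the Crackathon's actual challenge setup is not described here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'guest')")

def find_user_unsafe(name):
    # Vulnerable: attacker input is spliced straight into the SQL string.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name):
    # Parameterised: the driver treats the input as data, never as SQL.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

malicious = "x' OR '1'='1"
print(find_user_unsafe(malicious))   # returns every row: the injection succeeds
print(find_user_safe(malicious))     # returns nothing: the payload is inert
```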

The Crackathon identified a family of cyber attacks far more difficult to detect than their counterparts, identifying where extra funding and resources should be focused.

To conclude, both the Hackathon and Crackathon were hugely successful in terms of attendance, creativity, feedback and on-the-day buzz. With over 100 participants, Hackathons are an ideal way of networking and engaging with the local tech community. While the organisation of these events can be demanding, the real challenge comes with securing the 'right' kind of data. Teams have much more space to be innovative when large amounts of new and diverse data are available.

The Hackathon was part of the Urban Prototyping London festival, which was funded through the Digital Economy Networks. Led by Dr Catherine Mulligan, UP London runs between 8 April and 26 June 2013 and expects to host over 300 developers, architects, designers, artists and technology specialists investigating the role of digital technologies in creating smart, sustainable cities.

 

 

 
 

Working together on smart cities

July 5, 2012
by Richard Foulsham

Ovum-DCE Smart Cities Europe 2012

The Lancaster, London 19-20 June 2012

You can find the Chirpstory for the event here.

In many ways the event revealed the broader problems with discussions around smart cities. There is the aspirational vision – cleaner, less congested, less polluted and more prosperous cities – contrasted with the complex reality of current "smart" ICT projects, often mired in difficulties around business models, administrative jurisdiction, privacy and security, and any number of other complex multi-stakeholder problems that crop up when you try to integrate the physical and digital worlds; problems which go far beyond the scope of a simple technological fix.

The day started with an introduction by Larry Hirst of the Digital City Exchange and Imperial College and Neelie Kroes of the European Commission, and a laying-out of the issues by David Gann, the principal investigator of the Digital City Exchange. The vision of the Digital City Exchange is to create the equivalent of a telephone exchange for a city's data. This platform will then be accessed by citizens, businesses and city administrators to assist decision making, create products and services and inform city management. The key point about this exchange is that it seeks to be an exchange for all types of sectoral data: energy, transport, health, waste, environmental and any other area where you can think to place a sensor. This goes far beyond the sectoral approach we see in many projects.

The imperative for such projects was underlined by Manel Sanromá, CIO of the City of Barcelona, who pointed out that the human race is becoming steadily more urban, a process that has been going on for millennia. As a result, the quality of life available in cities is going to determine how we live in the future, because although you "can't guarantee that France, the United Kingdom and the United States will be around in a thousand years, you can be virtually certain that Paris, London and New York will be".

As such, cities themselves, through mayoralties and other municipal offices that currently seem to be undergoing a renaissance, are going to be the source of the impetus for moving to a smarter urban future.

Interesting themes that emerged from subsequent sessions included:

The human element: the unpredictability of human nature and the risk of making any broad predictions about how the "human agent" will react when embedded in a smart city. A point raised both in the Transport session by panellists Sue Flack, Anders Roth and Jeremy Green, and by Nilay Shah in his "View from the Top" session as he tried to imagine what a smart city would look like using the tools of process engineering.

What is a smart city?: This is likely to differ from city to city, but what are the essential elements and what is the essential infrastructure needed before you can even think about calling yourself one?

Governance: we may use phrases such as "managing the smart city", but the level of decision making is often unclear. A hierarchy and decentralisation are frequently suggested, but we still don't know who to go to for a particular kind of decision.

Who will pay?: Something of an old warhorse in debates about almost any topic, but one that is particularly uncertain when business models are as contested as they are in the digital environment. A point made strongly by Allan Mayo of the UK’s Department for Business, Innovation and Skills who said the assumption that it was all going to be paid for by advertising was naïve.

The smart cities agenda suffers from a certain amount of tension between bottom-up and top-down approaches. The former is responsive but limited in scope by barriers between sectors; the latter is slow to develop and ill-defined, but necessary if the full potential of the agenda is to be realised. Hopefully the Digital City Exchange will go some way towards filling this lacuna.


 

 

 

Friday 20 April 2012, Senate House, University of London

By Koen van Dam

CASA, the Centre for Advanced Spatial Analysis at UCL, chose "Smart Cities: bridging physical and digital" as the theme of their conference, held at Senate House in Central London on 20 April 2012. Smart cities and digital… of course DCE had to be there.

Prof Michael Batty, chair of the management board of CASA, opened the conference by going back 40 years to highlight the many advances of the digital age, most notably the rise of the internet, and how they have changed life in cities. Batty went on to explain that cities can be considered as networks of connected computers, and that smart cities present planners with new challenges because they address the short-term operation of the places we live in rather than long-term strategies. The question now is what the next 40 years will hold; the presentations at this conference might offer a glimpse of that future. The real challenge, according to Batty, is that after the transition from "real" to "digital", we now have to move back from "digital" to "real" and see the effects of digital services on daily life.

The first speaker of the conference was Prof Carlo Ratti, Director of MIT's SENSEable City Lab. He provided an image which would turn out to be a metaphor central to the whole day: the idea that an F1 racing team can no longer win races just by having the best car and the best driver; real-time processing of data on everything happening in the car and on the track is now absolutely essential. Bringing this back to cities: they have become control systems (also strengthening Batty's claim about short-term operation being more central). Prof Ratti spent most of his talk introducing wonderful examples of data analysis, digital design and real applications, ranging from analysing telecom signals during the football World Cup, measuring drought by counting green pixels in digital photos, and interactive walls made of water, to early prototypes of the Copenhagen Wheel, a way to store power in a bicycle while also providing the rider with real-time data on, for example, air quality. Finally, Ratti made very clear that in the past key publications were written by one author from one discipline, but that today the most influential articles are in fact written by many authors coming from various scientific backgrounds.

The other speakers of the day were all researchers at CASA. Jon Reades and Joan Serras gave a great overview of the analysis of transport data (e.g. Oyster journeys on London's transport network) with many beautiful maps. James Cheshire and Martin Austwick looked at cycle hire schemes, showing how much we can learn and infer just by looking at the status of docking stations around the city, without having to know exactly which trips were made. One of the key advantages of using OpenStreetMap (instead of, for example, Google Maps) is that you get a lot more information about the usability of roads for, say, cyclists, and the team took this into account in their routing algorithms.
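
As a rough illustration of what "taking road usability into account" can mean, here is a toy cycle-routing example: a Dijkstra search where each edge's length is scaled by a penalty reflecting how cycle-friendly its OSM highway tag is. The graph, tags and penalty factors are invented and are not CASA's actual model.

```python
# Toy example: the road graph, tags and penalty factors are invented, not CASA's model.
import heapq

# Each edge: (neighbour, length_m, osm_highway_tag)
GRAPH = {
    "A": [("B", 400, "primary"), ("C", 550, "cycleway")],
    "B": [("D", 300, "primary")],
    "C": [("D", 350, "cycleway")],
    "D": [],
}

# Multipliers expressing how unpleasant each road type is for a cyclist.
CYCLE_PENALTY = {"cycleway": 1.0, "residential": 1.2, "primary": 2.0}

def cycling_route(start, goal):
    """Dijkstra over length * penalty, i.e. 'perceived' cycling cost."""
    queue = [(0.0, start, [start])]
    best = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if cost >= best.get(node, float("inf")):
            continue
        best[node] = cost
        for nbr, length, tag in GRAPH[node]:
            weight = length * CYCLE_PENALTY.get(tag, 1.5)
            heapq.heappush(queue, (cost + weight, nbr, path + [nbr]))
    return float("inf"), []

print(cycling_route("A", "D"))  # prefers the longer but cycleway-tagged route via C
```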

Next up was a slightly controversial but highly interesting talk by Prof Sir Alan Wilson, who developed a model of the riots in London in August 2011. Using epidemic models (cf. Epstein), his team tried to replicate the attractiveness of certain sites to rioters and looters, taking, for example, the number of police officers present into account to determine the chance of arrest. One member of the audience quite rightfully pointed out that in some riots a high police presence is actually the cause of escalation rather than a way to keep people under control. The highly political impact of such research would also make it hard to develop objective simulation models. Still, this was one of the few examples at the conference of using data to inspire predictive models and using them for decision support.
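
Wilson's published model is not reproduced here, but a cartoon version of the idea, sites that attract rioters in proportion to their size and proximity, with a deterrent effect that grows with the police-to-rioter ratio, might look like this. All parameters and functional forms below are invented for illustration.

```python
# Cartoon model for illustration only; parameters and functional forms are invented
# and do not reproduce Wilson's published riot model.
import math

def site_attractiveness(floorspace, distance_km, alpha=1.0, beta=0.5):
    """Gravity-style pull of a site: bigger targets attract, distance deters."""
    return (floorspace ** alpha) * math.exp(-beta * distance_km)

def arrest_probability(police, rioters, k=0.3):
    """Chance of arrest rises with the ratio of police to rioters."""
    if rioters == 0:
        return 0.0
    return 1.0 - math.exp(-k * police / rioters)

def expected_rioters(base_crowd, floorspace, distance_km, police):
    """Rioters drawn to a site, discounted by the deterrent effect of arrest risk."""
    pull = site_attractiveness(floorspace, distance_km)
    deterrence = 1.0 - arrest_probability(police, base_crowd)
    return base_crowd * pull * deterrence / (1.0 + pull)

for police in (0, 20, 80):
    print(police, round(expected_rioters(200, floorspace=50, distance_km=2.0, police=police), 1))
```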

George MacKerron showed us Mappiness, a way to study how people feel in certain locations, based on an iPhone app which asks people to state their own happiness after being prompted at random times. Analysis of this data can show in which environments people feel most happy.

Richard Milton discussed data stores and real-time data, raising the all-important question of data analysis: how to find something you didn't know was there. By organising data by spatial elements, it becomes possible, for example, to overlay population density maps with energy infrastructures for gas and electricity, showing striking matches. Maptube, a tool developed at CASA for mapping data sets and building mash-ups, was introduced, after which Steven Gray took over the microphone to present another tool offered by CASA, the GEMMA mapping engine. The idea is that these tools could become a "Big Data Toolkit" for the visualisation and analysis of data. As a perfect illustration of the kind of events we might be able to infer from data, a lady on the tube wondering "it's not usually this busy, what happened?" made clear that we need predictions as well as data analysis to get a better overview of such systems.
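
The mechanics of such an overlay are simple once two datasets share the same spatial grid. The toy example below, with entirely made-up numbers, overlays a population layer with an energy-capacity layer to flag stressed cells and measure how closely the two track each other.

```python
# Toy grid overlay; the numbers are made up purely to show the mechanics.
import numpy as np

# 4x4 grid cells covering the same area: population per cell and electricity
# substation capacity per cell (arbitrary units).
population = np.array([[ 5, 12, 30,  8],
                       [ 9, 45, 60, 20],
                       [ 4, 38, 55, 15],
                       [ 2,  6, 10,  3]])
capacity   = np.array([[10, 10, 25, 10],
                       [10, 40, 50, 20],
                       [ 5, 35, 50, 15],
                       [ 5,  5, 10,  5]])

# Cells where demand (proxied by population) outstrips local capacity.
stress = population / np.maximum(capacity, 1)
hotspots = np.argwhere(stress > 1.1)
print("stressed cells (row, col):", hotspots.tolist())

# How closely the two layers track each other across the grid.
print("correlation:", np.corrcoef(population.ravel(), capacity.ravel())[0, 1])
```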

Andy Hudson-Smith, CASA director, showed the Tales of Things project, explained as the “Internet of Second-hand Things”, paraphrasing the concept of the Internet of Things. Oxfam used this approach in their charity shops, allowing people to tell stories about the objects they were donating.

The final speaker at the conference was mapping guru Oliver O'Brien. He showed CityDashboard, which displays a number of real-time views of a city, enabling the user to discover whether something is "wrong" (e.g. by linking travel disturbances, weather and popular news items) and helping them make the right decisions on, for example, departure time or routing. Perhaps a bit disappointingly, the dashboard does not contain any "intelligence" or predictions, but the presentation gave a valuable overview of the data used as well as the standards in which it is made available, and the API would enable others to build on top of the dashboard by accessing the same information. Furthermore, Oliver highlighted that in order to compare different cities, more standards are required.

During lunch and tea breaks, participants were able to see the brilliant visualisations, apps and games developed by the CASA researchers up close and personal. Many of the stands were interactive: we flapped our arms as we flew over London, watched aeroplanes circle over the city, placed police units in riot hot spots, looked at agent-based models of people moving about, and studied how pedestrians changed their behaviour as we adjusted the urban footprint of their virtual world using wooden blocks. It was also a chance to stare at the recently released map of every bus trip in London and several other impressive visualisations at the border of art and science.

The conference ended with a panel discussion, in which several of the speakers took part to discuss issues such as what real insights we can get from visualisation and which sectors they predicted would be the next big application area after transport (clearly the most studied subject). In answering these questions, the panel members discussed the gap between sensing and activating, and put forward health and justice (Alan Wilson), education (Ratti) and retail (Batty) as future domains to study.

On a final note, two statements of great importance to our Digital City Exchange programme were made during the panel discussion. Batty stressed that while we have a lot of data, we still know very little about the processes underlying decisions and actions of people. Hudson-Smith then said that what is needed next is an integrated model bringing together things that happen in different sectors. We are working on it!

 

 

 
 

The future of IT

March 6, 2012
by Richard Foulsham

Wednesday 8th March, Imperial College Business School

By Richard Foulsham

Lem Lasher, as Group President of Global Business Solutions and Chief Innovation Officer of CSC, is in a unique position to assess the influences upon, and potential future direction of, the digital economy, having within his remit both the analytical capabilities of a leading-edge consultancy and a business involved in the day-to-day deployment of IT around the world.

During this lecture he shared with us some of the “points of view” that his organisation has developed around “next practice”, likely developments that clever players in the IT market can use to gain an advantage, and identified some of the areas that he thinks will grow and become important as the digital revolution gets underway.

This revolution in technology has been brought about by the growth of the internet. Mr Lasher predicts that we are barely into the foothills of this revolution, with another 20 years of change ahead of us. This development will consist of a relatively predictable advance in the technology, but Mr Lasher sees the real source of disruption as being the development of business models that will take place in this new, connected environment.

The drivers of the current market, as Mr Lasher sees it, are globalisation and everything that overworked word entails – increased competition, the rise of China and India – but more interestingly he also identified a consistently difficult regulatory environment as something that is affecting companies' ability to differentiate their products. One interesting thing to ponder is whether the technological revolution and the process of globalisation are really parallel or dynamically entwined. It is hard to imagine the economic growth of India without the business space that has been opened up by technology. Equally, the globalised markets that are the bane of many democratic politicians' existence have grown up around the opportunities offered by a digital marketplace.

Organisations have responded to these challenges by becoming more complex and adopting a greater variety of forms than has been the case previously. Mr Lasher identifies a process that could be described as a "democratisation of technology" which has affected the way technology advances: technologies are available at low enough cost and can be operated by non-experts – the iPad and the child are given as an example, and anyone who's watched toddlers playing Angry Birds will find it hard to disagree – meaning that public bodies and private corporations cannot dictate to the market, but rather have to respond to it in a way that keeps revenue intact and minimises risks from regulation.

These effects are not consistent across sectors, and Mr Lasher went on to describe a matrix for predicting the degree of business model disruption likely to result from these changes in a particular industry. He identifies two major predictive factors: whether the organisation deals in physical product or data, and the degree of regulation in the industry. This model has remarkable predictive power, which two examples suffice to illustrate. The music industry – no physical product and little regulation (or at least little ability to enforce what regulation exists) – has gone through an exceptionally torrid time recently, whilst banking, dealing largely in data but doing so in a highly regulated environment, has itself suffered little in the way of disruption whilst, ironically, wreaking havoc on the rest of the economy.

Mr Lasher ended his presentation by describing a number of specific areas that he sees as likely to become increasingly important in the future. Some are predictable, others more of a surprise.  There was also a list of things that may prove to be the downside of the bright, shiny digital future that glistens enticingly at us from the cover of a thousand company brochures.  Whatever happens, it’s not going to be boring.

You can access a recording of Mr Lasher's presentation here.

 

 

 
 

Day 2 – Digital Engagement 2011 #de2011

November 28, 2011
by Claire Thorne

Thursday 17th November 2011, St. James’ Park, Newcastle

By Claire Thorne

Just in case you were exhausted from Day 1, or weren't quite paying attention at 9 am, Prof Don Marinelli was on hand. Delivering his keynote, entitled 'A Curriculum for the 21st Century: Storytelling, Architecture, Technology & Experience', with all the gusto and drama of (a State-side) Brian Blessed, Don spoke and we all listened. He presented the innovative and multidisciplinary Master of Entertainment Technology – focusing on Storytelling, Architecture, Technology and Experience – which he co-founded at Carnegie Mellon University (watch co-founder Randy Pausch's 'last lecture'). The course abandons all traditional, formal teaching methods, valuing 'edu-tainment' and choosing to view "education as business", and boasts Star Wars' C-3PO amongst its Faculty. In practice, this means a questionable non-curriculum of zeppelin rides and white-water rafting, students owning all IP, and Don enforcing a somewhat brutal 'no scholarship' rule.

Don's examples of MET outputs included MyStoryMaker (software designed to encourage children into Carnegie Library to write, rather than borrow, books) and synthetic interviews for bringing late scientific legends 'back to life'. Don's vision of the future, "making Computer Science a performing art", includes progress in the areas of augmented reality, 4D immersive experiences and casual gaming.

[Dates for your diary: The 2012 Digital Economy All Hands, hosted by dot.rural, the Aberdeen research Hub, will take place at the Aberdeen Exhibition and Conference Centre on October 23-25 2012.]

Dr Dominic Price's (Horizon Digital Economy Research Hub) contribution to the Crowd-sourcing session, entitled 'A Framework for Crowd-Sourcing Personal Data', introduced the Datasphere application as a 'personal container'. The Datasphere offers a way for individuals to track and manage access to their personal information, maintaining privacy levels by granting selective access in response to third-party 'queries'.
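
The Datasphere's actual architecture isn't detailed in the talk summary above, but conceptually a personal container answers third-party queries only for fields the owner has explicitly granted. A sketch of that idea, with invented class and method names:

```python
# Conceptual sketch only; class and method names are invented, not the Datasphere API.
class PersonalContainer:
    """Holds a person's data and answers third-party queries per-field, by consent."""

    def __init__(self, data):
        self.data = data                      # e.g. {"location": "...", "heart_rate": ...}
        self.grants = {}                      # requester -> set of permitted fields

    def grant(self, requester, fields):
        self.grants.setdefault(requester, set()).update(fields)

    def query(self, requester, fields):
        """Return only the fields this requester was explicitly allowed to see."""
        allowed = self.grants.get(requester, set())
        return {f: self.data[f] for f in fields if f in allowed and f in self.data}

me = PersonalContainer({"location": "Newcastle", "age": 34, "heart_rate": 71})
me.grant("fitness_app", {"heart_rate"})
print(me.query("fitness_app", ["heart_rate", "location"]))  # {'heart_rate': 71}
print(me.query("ad_network", ["location"]))                 # {}: no grant, no data
```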

In the Open Data and Security session, Dr Andrew Garbett presented Lincoln University's work on 'Using social media to drive public engagement with open data'. Referring to the HM Government 2011 'Making Open Government Data Real: a public consultation' publication and the importance Government places on engagement with data for new revenue streams, Andrew emphasised "the need for public services to interface with this [crime, NHS, travel and transport] raw data". The London Live Tube Map, the London Bike Share Map and Mash My Gov were just a few example applications Andrew mentioned where the service is good but not quite tailored to the user. Andrew's work on FearSquare – where UK crime statistics (based on location habits) and social media are combined into a user-personalised local crime app – echoed many of the applications showcased at DE All Hands 2010 (e.g. VoiceYourView) and at the recent Silicon Valley Comes to the UK appathon. FearSquare raised some concerns from the audience, namely the developer's responsibility not to perpetuate negative connotations of Open Data, reinforced by nomenclature like FearSquare.

The afternoon session on 'Support Services for Assurance and Reassurance' spanned the topics of privacy, energy and access. When presenting on 'Privacy Preserving Personalisation via Dataware', Dr James Goulding declared "in the Digital Economy, data is currency" before featuring… quilting! James then went on to categorise the current marketplace as an oligopoly with just two to three major players dominating each service sector, leaving little or no motivation for innovation. James' future work will be based on combining Dataware (a Chrome application which builds a model of you based on your interest areas) with Horizon's Geostore. In the same session, Ian Dent presented Horizon's work on 'Creating Personalised Energy Plans' and the DESIMAX project, demonstrating strong links with Low Carbon London. As Ian and his colleagues jostle with 20-year-old data sets, he appealed for access to data and suggested opportunities for collaboration.

Unfortunately the quick-fire session, like the workshops, offered few exciting updates on (repackaged) work featured at last year's DE All Hands meeting. Meanwhile, there were just a couple of rare glimpses of Social Science research and of any realised or projected impact in the DE space, i.e. the work and context that promise to put the 'society' in the Digital Economy.

So, what's the verdict? Two (and a half) days later and I'm left wondering: Where's the 'global' in all of this? Where's the 'economy'? Cue Digital City Exchange (paper [PDF], poster (low resolution) [jpg]).