Experts predict a 4,300% increase in annual data generation by 2020. From data on our shopping habits and travel patterns to readings from energy meters in our homes and records of the rubbish we throw out, this information could help society become more sustainable.
But evidence suggests that modern cities also place a high value on privacy and digital security. Some argue that this unprecedented flow of information undermines urban anonymity, and the 2015 Economist Safe Cities Index incorporated a digital security metric alongside traditional measures of safety such as personal security and health. In developing smart cities as the new paradigm, authorities and businesses will need to negotiate this delicate balance. Do urban populations have a choice about how their information is sourced and aggregated, and should they? What degree of privacy invasion will be tolerated?
“Privacy has become a tool of the trade. We pay for personalised services with our data,” says Jarmo Eskelinen, a Finnish data privacy specialist and founder of the Forum Virium Helsinki innovation lab.
“We are increasingly using digital platforms to manage different aspects of our lives, from payments to fitness. They are great: cost-effective, easy to use, available everywhere. All those services need our data to work. They also capitalise on our data. As users, we don’t have a proper view of where our information is and how it is being used. The landscape is getting increasingly complex, and we lack both the skills and the tools for proper data management.”
Eskelinen, who in April took up the post of chief innovation and technology officer at the London-based Future Cities Catapult, is also vice chair of the Open and Agile Smart Cities Network. This initiative’s 31 member cities – including urban hubs in Finland, Italy, Spain and Brazil – have signed up to promote open, smart cities built around the needs of communities. Forum Virium helped Helsinki pioneer the use of open data via its MyData concept, which is designed to give individuals more control over how their data is managed.
“Privacy has been seen mostly as a legal matter. In the digital age, we should also approach it as a technical and usability challenge,” explains Eskelinen. “MyData gives individuals the right to access the data collected about them by developing open standards and tools for the management of personal data.”
In the human-centric MyData model, privacy comes first, with Eskelinen noting: “Individuals should be empowered actors, not passive targets.” He also calls for the nurturing of transparent business environments to make it easier for companies to comply with data protection regulations and to allow people to change service providers without proprietary data lock-ins.
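To make the idea of individual control concrete, here is a minimal, hypothetical sketch of how a consent record might be represented and checked before a service reads personal data. It is not MyData’s actual specification; the class, field names and example services are illustrative assumptions only.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical consent record, loosely inspired by the human-centric model
# described above; the fields and logic are illustrative, not MyData's spec.
@dataclass
class ConsentRecord:
    person_id: str      # the individual granting access
    service_id: str     # the service requesting data
    categories: set     # data categories covered, e.g. {"mobility", "energy"}
    purpose: str        # the stated purpose the data may be used for
    expires: datetime   # consent is time-limited
    revoked: bool = False  # the individual can withdraw consent at any time

    def permits(self, service_id: str, category: str, purpose: str) -> bool:
        """Return True only if this exact service, data category and purpose
        are covered by an unexpired, unrevoked consent."""
        return (
            not self.revoked
            and self.service_id == service_id
            and category in self.categories
            and purpose == self.purpose
            and datetime.now(timezone.utc) < self.expires
        )

# Example: an energy dashboard may read meter data for billing, nothing else.
consent = ConsentRecord(
    person_id="resident-42",
    service_id="energy-dashboard",
    categories={"energy"},
    purpose="billing",
    expires=datetime(2030, 1, 1, tzinfo=timezone.utc),
)
print(consent.permits("energy-dashboard", "energy", "billing"))    # True
print(consent.permits("energy-dashboard", "energy", "marketing"))  # False
```

In a scheme along these lines, switching providers would mean granting an equivalent consent to a new service rather than surrendering data to a proprietary silo, which is the lock-in problem Eskelinen describes.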
Mark Deakin, a professor of the built environment at Edinburgh Napier University and author of From Intelligent to Smart Cities, believes further development of digital rights may be forced when privacy concerns swell to critical mass.
“While the degree of tolerance is not known, as soon as individual experiences are no longer the exception but the rule, the ‘rage against the machine’ will spill over to become more than the sum of the parts. Subsequently, the significance this registers within society shall become a matter of real public concern, calling not only for a greater level of data protection but also for stronger trust agreements between provider and user.”
So do people have a choice about how their information is sourced and aggregated? “In principle people do have a choice, in reality not much,” says Eskelinen. “The terms of use of digital services are mostly on or off. If you don’t let the service access your data, it’s almost useless. ‘I’ve read the terms and conditions’ is the most common lie we tell in the digital domain.”
With the many and tangible benefits that smart technology brings, are people really that fussed about privacy?
“There is a growing distrust towards companies and the government about keeping our data safe,” notes Eskelinen. “Some of the problematic areas are digital surveillance, poor standards and structures for healthcare data, and racist or sexual harassment, which is a threat especially to young people, who often post stuff online carelessly.”
Fabiano Vallesi, strategy researcher at wealth management firm Julius Baer, agrees it’s difficult for consumers not to be wary. “From healthcare providers to banks, retailers or utility providers, every sector has been hacked,” he says. “Consumers online have experienced cyber-threats in the form of malware and identity theft. They are increasingly facing the challenge of determining which companies to trust in holding their personal information amidst this escalation in data breaches.”
In particularly sensitive areas, such as health or insurance, concerns about privacy being breached could even be slowing down the development of potentially beneficial smart services, Eskelinen adds.
In navigating a middle ground between the benefits of big data and the accompanying loss of privacy and autonomy, more sophisticated encryption could prove key, says Deakin. Such technology should remove or encrypt all traces of personal information, he says, and anonymise whatever data is used. Perfected, he suggests, it could ensure data is used only for the application in question and not recycled for any other – potentially murky – purpose.
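One way to read this technically is as pseudonymisation with purpose-specific keys: direct identifiers are replaced with a keyed hash that stays stable within one application but cannot be used to link records across applications. The sketch below is a hedged illustration of that general idea, not Deakin’s design; the key names and record fields are assumptions for the example.

```python
import hmac
import hashlib

def pseudonymise(record: dict, purpose_key: bytes, id_field: str = "person_id") -> dict:
    """Drop the direct identifier and replace it with a keyed hash that is
    stable within one purpose but useless for linking data across purposes."""
    cleaned = dict(record)
    raw_id = cleaned.pop(id_field)
    cleaned["pseudonym"] = hmac.new(
        purpose_key, raw_id.encode(), hashlib.sha256
    ).hexdigest()
    return cleaned

# A hypothetical smart-meter reading tied to a resident.
reading = {"person_id": "resident-42", "kwh": 12.7, "hour": "2016-05-01T18:00"}

billing = pseudonymise(reading, purpose_key=b"key-held-by-billing-app")
research = pseudonymise(reading, purpose_key=b"key-held-by-research-app")

# The same resident gets unrelated pseudonyms under different purposes,
# so the two datasets cannot be joined on identity.
print(billing["pseudonym"] == research["pseudonym"])  # False
```

Under this kind of scheme, data prepared for one application carries no handle that another application could use to re-identify or cross-match individuals, which is the purpose-binding Deakin envisages.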