In part one of this short series on smart cities and privacy, I looked at the driving forces behind the development of the smart city: environmental change, technological advances, and the growing need for sustainability as cities become home to over half the planet's population. I also explored the technologies making the smart city a reality – the technologies behind smart water, smart buildings, and smart transport. All of these innovations require data in some form, and much of it is personal data about ourselves. These data may be directly relatable to an individual, or they may be part of an indirect but aggregated data set that can be used to identify an individual. In addition, the types of data needed to allow the smart city to ‘breathe’ can also build up a very intricate and detailed picture, not just of our identity, but of our life habits. Data such as what time we wake up, what time we go to bed, and what we do in between are part and parcel of the smart city's thinking mechanism for creating a more individually tailored, healthy, and sustainable way of living.
In this second part of the series, I will look at the implications of this data collection and how it impacts our personal privacy. The big question:
…is our personal privacy the price we have to pay for more sustainable and comfortable city living, and will it be a price worth paying?
Privacy – Do We Care?
The notion of information privacy is often confused with security, so let’s pin down a definition of what it actually entails before I begin.
When we talk about data privacy, we are talking about a fundamental human right: the right to control how information about ourselves is collected and used. Every society on the planet has its own cultural view of how privacy is determined and what it means for an individual.
The U.S. Constitution, for example, has no mention of privacy, per se, but the Fourth Amendment states that:
…the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures…
The United Nations Universal Declaration of Human Rights sets out in Article 12 that:
No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation.
In Europe, the General Data Protection Regulation (GDPR), which comes into force on 25 May 2018, is built upon the notion that privacy is a basic right and states at the outset that:
The processing of personal data should be designed to serve mankind.
But what about us, as individuals – where does privacy figure in our lives? Do we really care if details of our car journey to work are picked up and stored on a cloud server, if in return we get a better driving experience and can find a parking space?
A number of research papers have looked into individuals’ attitudes to personal privacy; here are a few:
Altruism and privacy: A paper from Koç University in Turkey explored attitudes towards privacy across age ranges on social media platforms such as Facebook, as an indicator of ‘privacy-related behavior’. Privacy management, in the context of this paper, was about “people’s control over the circulation of personal information, comprising utilization of strategies (also called privacy rules) to control individual and/or group boundaries”. The study concurred with other studies in finding that older social media users had a greater tendency to protect their privacy than younger users. An interesting finding concerned attitudes towards the privacy of others, i.e. privacy other than one’s own: older adults seemed to have greater consideration for the privacy of others than younger people. One explanation given is that younger users had a greater number of peripheral contacts in their networks than older users, closeness to a contact being a deciding factor in how altruistic the feeling of privacy was. This leaves us with the thought that building a more reciprocally altruistic privacy culture might help to police privacy exploits.
Culture and privacy: Privacy attitudes may have a cultural overlay that should be taken into consideration when designing a smart city. Cultural attitudes cut across every aspect of a given society and should not be ignored when applying technology in any societal context. In a study of privacy attitudes and behavior within Germany, carried out by the University of Hohenheim, 80% of survey respondents felt they had the right to control what personal information was made public. The report also found that 66% of German respondents were highly concerned about the disclosure of personally identifiable information (PII) on the Internet, and most were concerned about data collection procedures connected to Internet usage.
In a similar report on British attitudes towards privacy, the results concurred with the German report in that 79% of Brits had concerns about online privacy. They disagreed somewhat with the Koç University findings, however, in that younger people showed more concern about privacy than their older counterparts – although, in the survey, younger people were also more prepared to sacrifice their privacy for a perceived benefit.
A Carnegie Mellon University cross-cultural survey of privacy attitudes on social networking sites, covering China, the USA, and India, found significant cultural differences. In the U.S. there was much greater concern over privacy, with the Chinese and Indian respondents showing progressively less concern. The differing results were put down, in part, to cultural differences in how individualistic each society was. However, other factors, such as awareness raised by reported incidents, were also deemed to play a role. Again, reading into the results, having a perceived benefit, such as being able to meet new people, was seen as a reason to be more relaxed about personal privacy.
Education and privacy: Privacy attitudes have changed since the issues raised by Edward Snowden’s revelations of U.S. government surveillance. A number of high-profile cases post-Snowden, such as that of the Austrian privacy campaigner Max Schrems, have highlighted privacy issues. Schrems brought a case against Facebook questioning the safety of its data transfers and its privacy violations. These cases, coupled with a wide variety of data breaches, have helped to raise public awareness of digital privacy.
This raising of public awareness is a trickle that may become a stream as we move closer to hyperconnected city living. The words ‘data privacy’ no longer fall only from the lips of the tech community; even children are aware of the fluid nature of online personal data. The education process has already started, but the full implications of uncontrolled data exposure may not yet be fully understood.
How Cybercrime Has Changed the Face of Privacy and Weaponized Personal Data
Privacy-related crimes are already impacting individuals across the world. Let’s look at some numbers. The ‘wall of shame’, as it is known, is the U.S. Department of Health and Human Services Office for Civil Rights (OCR) list of healthcare data breaches that fall under the HIPAA rules for notification of exposed personal data. For the year to date, it records just under 4.5 million exposed personal data records.
The Breach Level Index maintained by security firm Gemalto, shows that in 2016, 44 personal data records were stolen every second of the day. Whenever we navigate to a news site or switch on the TV news, there is inevitably some mention of a data breach involving the personal data of individuals, often involving tens or even hundreds of millions of records exposed.
The situation is out of control, with security firms scrambling to build more intelligent security tools to help stop the onslaught. It has come about because of the need to share personal data with a myriad of organizations in order to register and transact online. Social media has encouraged a personal ‘open door policy’ of sharing. Online shopping has made our lives easier, and eGovernment has driven us to share our personal data online even when interacting with government.
The smart city will only work by increasing data generation across a large matrix that represents everything about us. Data-sharing models will be used even more to build a beneficial environment in which to live and work. This, in turn, will amplify the chances of personal data exposure and, worse, of personal data misuse and abuse. Exposure of data that can identify an individual is one thing, but the weaponizing of data is another, more sinister aspect that the smart city can bring about. In the vignettes below, we will explore this other side of data exposure.
The young person: Cyberstalking reaches new levels when Man-in-the-Middle becomes Stalker-in-the-Middle. The smart city feeds off aggregated data collected about a person’s whereabouts and other identifying data, such as age and sex. This could be used to facilitate both online and offline stalking.
The teacher: An angry pupil hacks their teacher’s IoT device and uses it to spy on them. IoT devices are becoming common home appliances and will be an integral part of smart city living. Devices like the Amazon Echo Spot are built to connect devices across the home and beyond. Often voice-activated and fitted with cameras, these devices, if hacked, have the potential to become highly intrusive.
The employee: Smart discrimination in the workplace. Wearables that collect information about our lifestyle and health are becoming commonplace and will have a role in smart city health. Some companies already use wearables to keep track of employees’ movements, health, and even personal lives. Without the right regulatory controls, it could be difficult for an employee to refuse the collection and sharing of such data with an employer. Surreptitious discrimination could follow.
Cheating partners: Tracking of movements could be used as blackmail bait. Congestion and the resulting pollution are the scourge of modern city living. Smart cities need to manage these using a myriad of technologies, such as those that track cars – eventually including fully autonomous vehicles. The benefit is feedback data that can be used to plan our days and improve the driving experience. Tracking of smart cars, however, has a more intrusive side: exposure of the data could allow a blackmailer to target people who may be cheating on their partners by spotting vehicles driving into known red-light areas.
The data collector: Sometimes it isn’t the exposure of individual data sets that is worrying; it is when disparate data comes together, as a whole, to reveal information that can really affect a person’s life. Certain types of attack can effectively identify an individual even if the data is only partially known or de-identified. This was shown to be feasible by Harvard University’s Data Privacy Lab, whose research showed how easy it was to re-identify individuals using a combination of news items about hospitalizations and publicly available datasets. A technique known as a ‘statistical disclosure attack’ can aggregate data across disparate mix systems (systems designed to anonymize communications), disclose patterns, and ultimately re-identify an individual. Researchers in the UK have looked at how large clinical data warehouses, of the type likely to be employed in a smart city, are at risk of data exposure through this technique – their report also looks at control methods for protecting the privacy of patients.
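The mechanics of such a linkage attack are easy to sketch. The following is a minimal, illustrative example – all names, records, and field choices are invented for this sketch, not taken from the research above – showing how a “de-identified” medical data set can be joined to a public register on shared quasi-identifiers:

```python
# Sketch of a linkage (re-identification) attack: a "de-identified" data set
# still contains quasi-identifiers (ZIP code, date of birth, sex) that can be
# matched against a public register. All data below is invented.

deidentified_records = [
    {"zip": "02138", "dob": "1954-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1960-01-02", "sex": "M", "diagnosis": "asthma"},
]

public_register = [
    {"name": "J. Doe", "zip": "02138", "dob": "1954-07-31", "sex": "F"},
    {"name": "A. Smith", "zip": "02139", "dob": "1985-05-05", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def reidentify(medical, register):
    """Return (name, diagnosis) pairs where the quasi-identifiers match uniquely."""
    matches = []
    for record in medical:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [p for p in register
                      if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match defeats the "anonymization"
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(reidentify(deidentified_records, public_register))
```

The defense, broadly, is to ensure no combination of quasi-identifiers maps to a unique individual – the intuition behind k-anonymity and the control methods the UK report discusses.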
Predictive modeling of behavior over long periods of time is used by marketers to target ads at customers. This can have consequences beyond a typical privacy breach, as in the case of Target Corp. The company used such modeling to send promotions for pregnancy products to one man’s teenage daughter. The man rang Target, annoyed that they were harassing his daughter with items he clearly thought were inappropriate; he was unfortunately to find out that his daughter was, in fact, pregnant. What is particularly disturbing, apart from a father discovering his daughter’s unexpected pregnancy this way, is that a faceless organization could predict a life event and subsequent behavior from undisclosed data.
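The underlying idea can be sketched in a few lines. This is a hypothetical toy model – the signal names and weights are invented, not Target’s actual method – but it shows how purchase patterns that individually reveal nothing can be combined into a score that predicts a life event:

```python
# Toy sketch of behavioral scoring: each purchase category carries an invented
# weight representing how strongly it correlates with a life event. A shopper's
# basket is scored by summing the weights of observed signals.

SIGNAL_WEIGHTS = {
    "unscented_lotion": 0.30,
    "mineral_supplements": 0.25,
    "cotton_balls": 0.15,
    "large_handbag": 0.10,
}

def event_score(purchases):
    """Sum the weights of any signal items present in the purchase history."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)

basket = ["unscented_lotion", "mineral_supplements", "bread"]
if event_score(basket) > 0.5:  # invented decision threshold
    print("flag shopper for targeted promotion")
```

A real system would learn such weights from millions of purchase histories, which is exactly why the prediction can outpace what the individual has chosen to disclose.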
Technology’s Role in Privacy Violations Within a Smart City
Data is the lifeblood of the smart city, but technology is its backbone. Certain technologies have particular implications for our privacy within the smart city, and the examples below show the implications of important enabling technologies and how they can be addressed:
Drones: Drone technology offers the smart city a flexible data-collection platform that can be applied to a number of use cases, from traffic control to agriculture. These ‘eyes in the sky’ can be thought of as a mobile, swarm-like computer system built to collect data on everyday smart city operations. For example, a drone system from Airwave is used to watch over worksites to assess work progress; the company also offers drones to assess residential properties for insurance claims. A paper by the Cloud Security Alliance, which looked at the implications of drones for privacy in the smart city, identified a number of key areas that need to be addressed to ensure that privacy protection does not become an afterthought when keeping drone manufacturing and management costs down. The alliance has set out eight security goals to protect the privacy and security of personal data when using drones, including securing drones from hijacking and keeping citizen imagery and personal information private (see also targeted marketing below).
Biometrics: Facial recognition in the city is already here – we even have it on certain smartphones. At the UK’s Notting Hill Carnival, the Met Police used facial recognition in an attempt to stamp out trouble before it happened. The Ministry of Internal Affairs of Georgia is using facial recognition surveillance as part of its safe city initiative. In the smart city, biometric systems like facial recognition can also provide marketing analytics to shops. NEC, which provides the technology for the Georgian system, has a solution that uses facial recognition to analyze shopper demographics and can spot how long a given person looks at a particular sign. The resulting data can then be used to tailor shopping experiences, target marketing at individuals, and optimize shop layout. Facial recognition may help to drive features within a smart city, but we need to consider the privacy implications of this most intimate of identifiers. The Electronic Frontier Foundation believes that the FBI already has at least 52 million photos in its Next Generation Identification (NGI) facial recognition database; other sources say it is amassing considerably more. One of the privacy issues around facial recognition is the protection of these data. The NGI, for example, has been found severely wanting in carrying out privacy impact assessments and in the accuracy of its data. If facial recognition is being used to enforce the law within a smart city, you’d want to make sure that your face wasn’t mistaken for that of a criminal. Privacy advocates have other concerns about facial recognition, including that we, as individuals, should expect a:
“reasonable expectation of privacy” which should include a “reasonable expectation of anonymity from government use of computer algorithms and databases to capture law abiding citizens’ faces and identify them without their knowledge or consent.”
Targeted marketing and behavioral tracking: Marketing is a hard business to be in, and you have to use every trick in the book to compete. A survey from Smart Insights found that big data and AI were the second and third most impactful areas in marketing for the coming year. The smart city will be able to take marketing to new levels based on the data it collects from facial recognition, your life history, your behavior patterns, and your use of IoT devices. Dutch train company NS is already doing this using smart billboards. The billboards have camera software that can determine a viewer’s sex and age and push relevant ads at that person. This may be a breach of data rights; NS states that no identifiable data is stored and so no privacy is breached, but the act of determining these aspects of your person is not offered as an ‘opt out’.
Artificial intelligence and surveillance: AI is not a new technology; explorations of its uses go back decades, with examples such as the use of neural networks in chemistry research in the 1980s. The smart city is based on data, and AI is an important tool for analyzing trends and patterns in complex data sets – the two are an essential pairing for the smart city. The City of Montreal is using AI, satellites, drones, and LiDAR (a sensing method based on light detection and ranging) to reduce traffic congestion. But to do so, it has to mount the LiDAR units on city vehicles and know every move of each vehicle. Link the vehicle to a person, and you have a very clever tracking device.
One sensor to rule them all: One of the biggest issues in the privacy of personal information is how it is collected. We, as individuals, lead complex lives made up of many types of data: some directly identifiable and static, like your date of birth; some identifiable but dynamic, like your address; and some transactional – did you shop for groceries on a given day and time? Bring all this information together and you have a very detailed view of a person. The smart city will ultimately benefit from data aggregation: better accuracy will drive better outcomes. Marketers have known this for years, and the smart city that wants to create better living and working environments, building sustainable futures, will need aggregated data to take the guesswork out of planning. Aggregation of data is seen as both an opportunity and a challenge by Carnegie Mellon, which is working on a single sensor that aggregates data across the home to make living more efficient. The sensor indirectly monitors all appliances (smart and non-smart) within a given space. You can see that this method is an attractive alternative to disparate units that do not communicate seamlessly. The smart city can benefit from a centralized sensor system, at least across related areas; however, such a sensor will aggregate the data output across an individual’s living space – it will know your every move.
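Why aggregation is so revealing can be shown with a trivial sketch. The event streams below are invented, and the fusion is just a chronological merge, but it illustrates the point: each device’s log is fairly innocuous on its own, while the merged timeline reads like a diary of the occupant’s day.

```python
# Sketch of sensor fusion in a hypothetical home hub: separate per-device
# event streams are merged into one chronological activity log. All events
# are invented for illustration.

from itertools import chain

power_events = [("07:02", "kettle on"), ("23:15", "lights off")]
door_events  = [("08:10", "front door opened"), ("18:45", "front door opened")]
media_events = [("19:00", "TV on")]

def daily_timeline(*streams):
    """Merge per-device (time, event) streams into one sorted timeline."""
    return sorted(chain(*streams))

for time, event in daily_timeline(power_events, door_events, media_events):
    print(time, event)
```

From three disjoint logs, the merged view reveals wake-up time, the working day away from home, and the evening routine – exactly the kind of intimate profile a single aggregating sensor would hold.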
Smart health: Being a healthy individual is an intrinsic part of a healthy smart city. Global health costs show a general upward trend according to World Bank figures. As our city populations grow, sustaining, and even improving, healthcare will require smart thinking. City living also brings health challenges, from pollution and communicable diseases to autoimmunity issues and other problems such as obesity and longer life spans. Systems like OpenAPS, which supports the development of apps that utilize data from type 1 diabetes patients, and wearables that monitor your heart to help predict and prevent health-related issues, have implications for privacy (as well as security). Health applications using data generated by our bodies also include the remote measurement of heartbeats, performed by detecting the slightest color changes in your face to assess vascular health from a distance. The technique even has the potential to spot whether you are attracted to a person – using the same cues we already pick up with our own senses.
Health data is particularly sensitive. It has the potential, if exposed, to be used for blackmail or even discrimination.
Smart Marketing in the Smart City – A Privacy Nightmare
Certain aspects of smart city living and privacy are already starting to hit home. One of these is marketing. We are all now acutely aware of using social platforms like Facebook and having targeted adverts pushed onto our screens. Targeted marketing is the bugbear of just about everyone, and in the smart city this may become even more acute. In the film adaptation of Philip K. Dick’s sci-fi story Minority Report, there is a scene in which the main character walks through a shopping mall, trying to be inconspicuous; every time he looks at a billboard, a ‘relevant’ ad pops up and calls out his name. That vision of a dystopian future is here and now. Targeted marketing is still fairly clumsy: the ads we see pop up when we visit a website, based on systems like Google’s advertising ID, are often out of date, pushing products we may already have bought or discounted. However, this is all changing with data aggregation, machine learning, and AI – the ads of the future will truly know who we are.
As technology becomes smarter, the vulnerabilities become more sophisticated. Marketing is about individuals more than it is about groups: the marketer wants to personalize and tailor your shopping experience, and to do this they need to know you within a given context. AI applied to large datasets to predict trends and patterns gives them that context, as well as personal information about you, enabling the personalization that modern competitive marketing demands. Marketing may well be a primary driver of data collection at a very personal level and, as such, deserves special scrutiny.
Smart Cities, Smart Governance and the Privacy Officer
We have discussed the ways that data and technology converge in the smart city; governance of these data is an essential control mechanism for their privacy.
Personal data can be used for both good and bad. It can be used to discriminate, to carry out criminal activities, to blackmail, and to stalk. Used unwisely, it can prevent social mobility and increase poverty within society. Human beings are renowned for taking things out of context, and corporate bodies especially so. Uncontrolled data access could result in insurance being unavailable to all but the wealthiest, healthiest, and most privileged in society. It could even result in the loss of civil liberties.
Smart city dystopia does not have to be our fate if we add a layer of smart governance to control privacy. Smart data governance is about involving all of the stakeholders in decisions about data. A paper on the subject talks about the importance of ‘smart collaboration’ in creating an environment of achievable smart governance; these building blocks can form the basis of an environment that promotes respect for privacy. The paper explores the subject through the eyes of three Brazilian smart cities, looking at areas such as power balances when utilizing personal data, transparency, and decision making. It does not address privacy as such, but the use of data governance for smart city privacy is attracting attention: Gartner suggests creating a “smart city data governance plan”, again based on an ethos of collaboration and transparency, to ensure that privacy considerations are acted upon.
The Privacy Pillar of the Smart City
In the 1990s, Ann Cavoukian, then Information and Privacy Commissioner of Ontario, Canada, described the ethos of ‘Privacy by Design’ (PbD). In her treatise on the seven foundational principles of PbD she set out how to build systems with privacy baked into their core. One of these principles is ‘respect’ for privacy from a user-centric position. These principles can, and should, be applied to the design and building of our smart cities. In doing so, we will at least endeavor to create privacy-respectful smart cities. Having the forethought to place privacy as a pillar of smart city design will go some way towards ensuring that personal data cannot be used against its owner. How we achieve this technically is a challenge, but one that governments and the vendors of technologies intrinsic to smart city creation must take on.
At the beginning of the first part of this series I quoted the lyrics penned by Joni Mitchell: “Don’t it always seem to go, that you don’t know what you’ve got til it’s gone”. Losing our privacy as the price of more sustainable cities does not have to be a given. If we plan out privacy considerations at the outset, we can have respectful policies, and technology that enables those policies, giving us a private life and a good one too.
Personal privacy, once lost, may be difficult to win back. The loss of liberty may seem a long way off in liberal Western cultures, but history shows swings in political leanings and in trust in governments. The scientific discipline of cliodynamics, which applies a mix of evolutionary science, economics, and history to human society and behavior, has shown trends in societal cooperation that touch every layer, from political leaders down. The specter of an authoritarian government replacing our liberal Western expectations is not just an Orwellian dream but a real possibility; if we innocently sign away our data privacy rights out of trust in the status quo, there are no guarantees that our personal data will not be used against us or our family members.
The smart city train has already left the station. In a world where the population is increasing and resources are finite, we need to find smarter ways of living together. Smart technologies hold the key to making this happen, but we need to proceed with caution and build a layer of respectful trust and privacy into our smart places.
Marin Ivezic is a Cybersecurity & Privacy Partner in PwC Canada focused on risks of emerging technologies.
Luka Ivezic is an independent consultant and author exploring the geopolitical and socioeconomic implications of emerging technologies such as 5G, Artificial Intelligence (AI) and the Internet of Things (IoT). To better observe policy discussions and societal attitudes towards early adoption of emerging technologies, Luka has spent the last five years living between the US, UK, Denmark, Singapore, Japan, and Canada. This has given him a unique perspective on how emerging technologies shape different societies, and how different cultures determine technological development.