It is vital to establish, and communicate widely, clear standards for “anonymising” personal data collected by Smart City actors (i.e. municipalities and their licensees). These standards need to come as an adjunct to the existing European GDPR provisions.

     27 Sept. 2019

by Thiago Goncalves, Product Marketing Manager, New Mobility, HERE

When anonymised data is anything but

Whenever our data is ‘anonymised’, it has been manipulated in such a way that it can no longer be linked back to us, right? Wrong. The experts from HERE share some practical advice on safeguarding privacy.

In the digital era, we do not simply blend into the crowd, especially when our location data is part of the equation. This is because our mobility habits are unique to us. Our route to the office, from door to door, is different from anyone else’s, and our traces can be identified from only a few location data points.
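How few points are enough? The following is a minimal sketch with entirely invented data (random cell-tower IDs for 10,000 hypothetical users): an adversary who observes just four (hour, cell) points of one person’s day can usually single out that person’s full trace from everyone else’s.

```python
import random

# Hypothetical dataset: each user's day is a trace of (hour, cell_id) points.
random.seed(0)
traces = {
    user: [(hour, random.randrange(200)) for hour in range(24)]
    for user in range(10_000)
}

def matching_users(observed):
    """Return users whose trace contains every observed (hour, cell) point."""
    return [u for u, t in traces.items() if all(p in t for p in observed)]

# An adversary learns just four points of user 42's day...
observed = random.sample(traces[42], 4)
candidates = matching_users(observed)
print(len(candidates))  # with 200 possible cells, usually only user 42 remains
```

Real mobility data is far less uniform than this toy model, which generally makes traces even more distinctive, not less.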

Technologies like anonymisation offer service providers a wealth of ways to protect their users’ privacy while retaining enough data to serve some utility. However, the challenge is finding that balance. As regulations tighten and people become more privacy conscious, organisations that handle data must get smarter about how they do so.

The truth behind location data anonymity

As you move through the world, your devices are not limited to generating single, unconnected data points. Rather, they can create a linked set of data points that are more than the sum of their parts. Travelling from place to place produces a whole sequence of locations and timestamps that come together to resemble a path on a map.

That sequence, which we call a trajectory, can be particularly revealing. This is why location data privacy is so tricky. A company that is tracking you can remove your personal data from the data points and trajectories it later makes public. However, anyone can potentially combine those published trajectories with their own insights or other publicly available data, and use that combination to identify an individual.
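The linkage is mechanically trivial. Here is a minimal sketch with made-up records, loosely modelled on the taxi incident described below: the published dataset has identifiers stripped but trip details intact, and a single piece of public auxiliary knowledge (say, a timestamped, geotagged photo of someone entering a cab) is enough to join back to the “anonymous” record.

```python
# Hypothetical published dataset: personal identifiers stripped, but trip
# details (pickup time and street, dropoff area, fare) left intact.
published_trips = [
    {"pickup": ("2013-07-08 23:15", "Greenwich St"), "dropoff": "Midtown", "fare": 14.50},
    {"pickup": ("2013-07-08 23:15", "W 44th St"),    "dropoff": "SoHo",    "fare": 9.00},
    {"pickup": ("2013-07-09 01:40", "Greenwich St"), "dropoff": "JFK",     "fare": 52.00},
]

# Public auxiliary knowledge: a named person photographed at a known
# time and place getting into a taxi.
sighting = {"who": "known person", "when": "2013-07-08 23:15", "where": "Greenwich St"}

# The linkage attack is just a join on the shared attributes.
matches = [t for t in published_trips
           if t["pickup"] == (sighting["when"], sighting["where"])]
print(matches)  # the 'anonymous' trip, now attached to a named person
```

No identifier in the published data was needed; the quasi-identifiers (time and place) did all the work.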

This is not theoretical. You only need to look at what happened when the New York City Taxi and Limousine Commission released a supposedly anonymised dataset of taxi rides. Or when a student was able to locate military bases in the Middle East through anonymised data from a fitness application.

What is apparent is that removing your personal information from location data does not automatically make you anonymous. It is little wonder then that privacy advocates worry that similar events will occur again.

Protecting citizens’ privacy

Many privacy incidents are, of course, unintentional. Any entity that provides consumer data to outside parties, including HERE, can inadvertently provide information that enables the identification of the people from whom it originates.

Cities, transport agencies and road operators may rest easier if data is confined internally to improve their own services or guide decision-making. However, many will also share data with third-party organisations or open it up to the public, and that calls for more careful consideration of the privacy risks involved.

In practice, this means building privacy considerations into new services from the beginning and limiting data collection to only what is needed. It also calls on cities to evaluate new data anonymisation and other privacy-enhancing methods. Novel technologies, such as differential privacy and federated machine learning, offer powerful new ways for cities to enhance their citizens’ privacy while preserving much of the utility of their data.
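As a flavour of what differential privacy looks like in practice, here is a minimal sketch (with an invented sensor count) of the standard Laplace mechanism for a counting query: noise calibrated to the query’s sensitivity is added before release, so no published number depends too strongly on any one individual. The function name and the example figures are illustrative, not from the eBook.

```python
import random

def dp_count(true_count, epsilon):
    """Laplace mechanism for a counting query (sensitivity 1).

    The difference of two Exponential(epsilon) draws is distributed
    as Laplace(0, 1/epsilon), the noise scale the mechanism requires.
    """
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# e.g. releasing how many vehicles a roadside sensor counted,
# with a privacy budget of epsilon = 0.5
print(round(dp_count(1340, epsilon=0.5)))  # a value near, but not exactly, 1340
```

Smaller epsilon means more noise and stronger privacy; the art lies in choosing a budget that keeps the released statistics useful.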

There is no one-size-fits-all when it comes to privacy, which is why organisations are likely to employ a mix of methods to meet their and their users’ needs.

This is an edited extract from: When anonymised data is anything but: protecting citizens’ privacy in the age of urban mobility, an eBook published by HERE Technologies
