What location data reveals about you

Location data allows rich insights into private details about our lives: where we live and work, how we spend our free time and what is important to us, says Nina Wiedemann. This poses a considerable security risk – even if we have nothing to hide.  
A person's whereabouts reveal a lot about their life: location data is therefore particularly worth protecting. (Image: Diki prayoga / AdobeStock)

Anyone who uses apps on a smartphone and surfs the internet generates a wealth of seemingly irrelevant data, which we usually share with third parties either unknowingly or carelessly. Every click and every activity, no matter how trivial, is registered, enriched and ruthlessly monetised.

This also applies to sensitive information that can compromise our privacy and harbour a high risk of misuse – such as our location data.

This is well known: data protection advocates have long warned of the risks – and the majority of users today implicitly know that they are paying for supposedly free online services with their data. Paradoxically, most of them hardly seem to care.

Although awareness of data protection has generally increased in recent years, the notorious "nothing to hide" argument is still widespread: if you behave correctly and have nothing to hide, you have nothing to fear. The logic behind this is deceptive: as if the data of law-abiding users were not interesting enough to be misused.

The argument fatally misjudges the security risk and, in my view, indicates a lack of understanding of the information at stake.

Where we go is who we are

Location data is particularly sensitive. This is because it allows detailed insights into users' personal preferences and habits: Whether you exercise regularly, how often you go to the doctor, whether you frequent bars and clubs or where you spend the night. Location data is therefore considered digital gold.

The miners of the digital gold rush are the data brokers. Their business model consists of creating movement profiles from raw data and combining them with public context data such as pubs, shops or government institutions. If it is possible to categorise the places visited based on these points of interest, conclusions can be drawn about activity patterns and personal interests.
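To make this concrete, here is a minimal sketch in Python of how such a categorisation could work in principle. The coordinates, points of interest and nearest-match rule are invented for illustration and are not any broker's actual pipeline.

    # Minimal sketch: label raw location fixes with the category of the
    # nearest point of interest (POI). Coordinates and POIs are invented.
    from math import radians, sin, cos, asin, sqrt

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two WGS84 points in metres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371000 * asin(sqrt(a))

    # Publicly available context data: (latitude, longitude, category)
    pois = [
        (47.3769, 8.5417, "bar"),
        (47.3781, 8.5400, "doctor"),
        (47.3750, 8.5465, "gym"),
    ]

    # Raw location fixes from a hypothetical user
    fixes = [(47.3770, 8.5419), (47.3751, 8.5463), (47.3770, 8.5418)]

    profile = {}
    for lat, lon in fixes:
        # Assign each fix to the closest POI and count the visited categories
        _, category = min(
            (haversine_m(lat, lon, p_lat, p_lon), cat) for p_lat, p_lon, cat in pois
        )
        profile[category] = profile.get(category, 0) + 1

    print(profile)  # e.g. {'bar': 2, 'gym': 1} -> a crude activity profile

Real pipelines presumably use far richer signals, such as dwell time and time of day, but the basic step of joining raw fixes against public context data is the same.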

Such user profiles can be sold lucratively. Be it, as originally intended, for personalised advertising, or for abusive purposes – for example, to influence our political views, assess our creditworthiness or estimate insurance risks.

How big is the risk really?

Raw location data, such as GPS tracks, are often imprecise and do not directly reveal whether you are in a restaurant or merely at the bus stop in front of it. Some app providers have started to deliberately obfuscate coordinates. At the same time, machine learning methods give attackers powerful new tools for interpreting large amounts of data.
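What such obfuscation could look like, as a toy sketch (not any provider's actual scheme): the reported position is shifted by a random offset drawn within a chosen radius.

    # Toy sketch of coordinate masking: shift a position by a random offset
    # drawn uniformly within a given radius (not any app's actual scheme).
    import math
    import random

    def mask_coordinate(lat, lon, radius_m):
        """Return the coordinate displaced by a random offset of up to radius_m metres."""
        bearing = random.uniform(0, 2 * math.pi)
        # sqrt() spreads the displaced points uniformly over the disc
        distance = radius_m * math.sqrt(random.random())
        dlat = (distance * math.cos(bearing)) / 111_320  # metres per degree of latitude
        dlon = (distance * math.sin(bearing)) / (111_320 * math.cos(math.radians(lat)))
        return lat + dlat, lon + dlon

    print(mask_coordinate(47.3769, 8.5417, radius_m=100))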

We therefore wanted to know: how high is the actual risk that an attacker in possession of nothing but my location data can create a meaningful behavioural profile using AI methods, even if that data is masked?

Analysing attack scenarios

I analysed this question in a research paper1 using data from the social network Foursquare, where users "check in" to locations. We trained a machine learning model to recognise the category of a visited location (bar, doctor or sport).
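The setup can be sketched roughly as follows; this simplified example uses synthetic check-ins and off-the-shelf tools rather than the actual features and pipeline of the study.

    # Simplified sketch: predict a place category from (noisy) coordinates
    # and visit time. Synthetic data, not the Foursquare study's pipeline.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Invent one cluster of check-ins per category: (lat, lon, typical hour of day)
    centres = {"bar": (47.377, 8.541, 23), "doctor": (47.381, 8.530, 10), "sport": (47.370, 8.550, 18)}

    X, y = [], []
    for cat, (lat, lon, hour) in centres.items():
        for _ in range(300):
            X.append([
                lat + rng.normal(scale=0.001),   # roughly 100 m of location noise
                lon + rng.normal(scale=0.001),
                (hour + rng.normal(scale=1.5)) % 24,
            ])
            y.append(cat)

    X_train, X_test, y_train, y_test = train_test_split(
        np.array(X), np.array(y), test_size=0.3, random_state=0
    )
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print("accuracy:", model.score(X_test, y_test))  # far above the 1/3 expected from guessing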

Our results clearly show that it is frighteningly easy to compromise privacy in this way. Even with obfuscated coordinates, the privacy losses are significant: data with inaccuracies of up to 100 metres still allows behaviour patterns to be predicted ten times more accurately than mere guesswork. Only when the location errors exceed 1000 metres do the coordinates lose their usefulness.

«Even those who seemingly have nothing to hide risk that carelessly shared location data not only reveals private habits, but can also be misused for social engineering attacks.»      Nina Wiedemann

The good news is that the accuracy of the prediction decreases exponentially with the degree of masking, by around 10 percent for every additional 8 metres. This shows that although simple masking is not perfect, it still offers worthwhile protection. Some apps already offer the option of masking coordinates. Political progress is also being made: a data broker in the USA was recently banned from selling location data without the explicit consent of users.2
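As a back-of-the-envelope illustration of that decay, assuming the quoted rate of roughly 10 percent per 8 metres applies uniformly (the actual curve in the paper may differ):

    # Back-of-the-envelope reading of the reported trend: prediction accuracy
    # shrinking by roughly 10 percent for every additional 8 m of masking.
    def relative_accuracy(masking_m, drop_per_step=0.10, step_m=8.0):
        """Accuracy relative to unmasked data under the stated exponential decay."""
        return (1 - drop_per_step) ** (masking_m / step_m)

    for radius in (0, 50, 100, 500, 1000):
        print(f"{radius:>5} m masking -> {relative_accuracy(radius):.1%} of the original accuracy")

Under this reading, around 100 metres of masking still leaves roughly a quarter of the original accuracy, which fits the observation above that such data remains useful to an attacker, while beyond 1000 metres essentially nothing is left.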

Nevertheless, we should not rely on regulation and goodwill. In my opinion, the key is to educate people about the risks. Responsible smartphone users therefore cannot avoid taking care of their location data, i.e. preventing location sharing wherever possible or sharing it only selectively.3, 4

Even those who seemingly have nothing to hide run the risk that carelessly shared location data not only reveals private habits, but can also be misused for social engineering attacks. From this perspective, all of us have something to hide.