When Trading Track Records Means Less Privacy www.eff.org/deeplinks/2018/01/when-trading-track-records-means-less-privacy
This EFF analysis of the recent Strava incident (where a global heatmap of GPS workout tracks revealed a number of potentially secret military sites worldwide) is a clear explanation of why “anonymisation” is not a silver bullet when it comes to dealing with personal data:
Though the revealed information itself was anonymized—meaning map viewers could not easily determine identities of Strava customers with the map alone—when read collectively, the information resulted in a serious breach of privacy.
The blog post broadens the discussion beyond military users, showing how Strava’s “social features” have exposed personal information before: as technology develops and algorithmic de-anonymisation becomes ever easier, the belief that removing personal identifiers from a piece of data makes it “anonymous” is often mistaken.
Often, our understanding of “anonymous” is simply wrong: invasive database cross-referencing can reveal all sorts of private information, defeating any effort at meaningful online anonymity.
This is why, for example, when “preparing for the GDPR”, it is so important to understand that anonymising data does not automatically mean it is no longer personal data: more often than not, it needs to be treated with similar care to data that carries individual identifiers. This applies even more strongly to “pseudonymised” data, an important consideration that is too easily forgotten when “just tracking our users based on a randomly assigned ID”.
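The cross-referencing attack described above can be sketched in a few lines. Everything here is invented for illustration (the field names, the pseudonymous IDs, the auxiliary “directory”); the point is only that a dataset stripped of names can still be re-identified by joining its remaining quasi-identifiers against some other, public dataset:

```python
# A minimal linkage-attack sketch (all data hypothetical):
# "anonymised" records keep quasi-identifiers (home area, birth year),
# which an attacker can match against auxiliary public data.

# Pseudonymised workout logs: names replaced with random IDs.
workouts = [
    {"user_id": "a91f3", "area": "90210", "birth_year": 1985, "route": "base-perimeter"},
    {"user_id": "c07bd", "area": "10001", "birth_year": 1992, "route": "city-park"},
]

# Auxiliary public dataset (think voter roll or social-media profiles).
directory = [
    {"name": "Alice Example", "area": "90210", "birth_year": 1985},
    {"name": "Bob Example", "area": "10001", "birth_year": 1992},
]

def link(workouts, directory):
    """Re-identify pseudonymous users whose quasi-identifiers
    match exactly one entry in the auxiliary dataset."""
    matches = {}
    for w in workouts:
        candidates = [
            d["name"] for d in directory
            if d["area"] == w["area"] and d["birth_year"] == w["birth_year"]
        ]
        if len(candidates) == 1:  # a unique match means re-identification
            matches[w["user_id"]] = candidates[0]
    return matches

print(link(workouts, directory))
# → {'a91f3': 'Alice Example', 'c07bd': 'Bob Example'}
```

Note that the randomly assigned `user_id` did nothing to prevent the attack: as long as the record carries enough correlatable attributes, the pseudonym is just one join away from a name.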