Bookmark: "Dark Patterns in Personal Data Collection: Definition, Taxonomy and Lawfulness"

Sebastian Greger


In “Dark Patterns in Personal Data Collection: Definition, Taxonomy and Lawfulness”, Luiza Jarovsky dissects deceptive design patterns from a legal perspective, pointing out that current privacy legislation does not properly address these intentionally misleading patterns, which aim to trick users into doing things they don’t really want to do. She defines them specifically as nudges that are both “manipulative and malicious” – a very practical definition of the so-called “dark patterns”.

The centerpiece of the paper – and, next to the thorough walk-through of cognitive biases exploited in the privacy context in chapter IV, the part most interesting for evaluation purposes by the design practitioner – is the taxonomy Jarovsky establishes in chapter V:

A) Pressure […] pressuring the data subject to share more (or more in-depth) than intended personal data to continue using a product or service.
B) Hinder […] delaying, hiding, or making it difficult for the data subject to adopt privacy protective actions.
C) Mislead […] using language, forms, and interface elements to mislead the data subject whilst taking privacy related actions.
D) Misrepresent […] misrepresenting facts to induce data subjects to share more (or more in-depth) personal data than intended.

The intention behind this categorization, a distillation from a broad body of pre-existing work in the field, becomes visible as the paper proceeds to discuss how deceptive patterns are incompatible with the GDPR’s requirements for consent, and puts them into relation with various requirements that implicitly – though, and this is the point of the paper, not explicitly – outlaw such practices: lawfulness of processing, privacy by design, and the fairness principle. The latter is highlighted in particular in the summary:

In a nutshell, the GDPR is silent about the exploitation of cognitive biases, manipulative interface designs and negative interferences in the decision-making process. […] To curb DP, fairness is a central concept, as it reflects the need to balance the asymmetries between controllers and data subjects. The GDPR refers to fairness multiple times, yet, has no definition thereof, either specificity or enforceability for the concept. The way to advance data protection law is by unpacking the idea of fairness, so that it can encompass the right of fair decision making and fair interface design in privacy to data subjects.

I could not agree more! It is mind-boggling how legal departments and interface designers have, since 2018, channeled their creative capacity into inventing ever more means of deception while – at least until a court rules otherwise – staying “compliant” with the GDPR. A more explicit definition of what constitutes unfair deception, in the way Jarovsky suggests (and the draft for a Deceptive Experiences To Online Users Reduction (DETOUR) Act in the US, as quoted in the article, seems like a good template to start with), could go a long way towards enforcing fairness rather than encouraging bending the law until it breaks.

On a side note: personally (though, as a non-lawyer, I may miss some of the semantic details of this paper’s argumentative structure), I am not fully convinced by the author’s argument that “consent” is the only context in which deceptive patterns affect the lawfulness of data processing:

Among them, consent is the only option that could be affected by DP, as it comprises situations in which the decision-making capacity of the data subject serves as the legitimising factor to data collection

I concur that consent flows are by far the most prominent breeding ground for “manipulative and malicious” nudges, but I just as commonly find pressuring, hindering, misleading and misrepresenting design patterns in the context of transparency – in particular regarding the legal construct of “legitimate interest”. The GDPR demands a high (and, as I have analyzed before, in practice almost unachievable) degree of transparency in data processing statements, and these, too, are commonly riddled with deception and obfuscation, by lawyers and designers alike.

The practice-based taxonomy by UXP2 complements this legal taxonomy neatly, and is illustrated with real-world examples:

The dark side of UX Design
With its byline “Practitioner-identified examples of stakeholder values superseding user values”, this practice-based […]
