Talk: "Designing away the cookie disclaimer"

Sebastian Greger

This is the transcript of my lightning talk from the beyond tellerrand Berlin pre-conference warm-up on 6 November 2017. It was a condensed version of my longer, work-in-progress and upcoming talk on privacy as a core pillar of ethical UX design. If you are interested in the final talk, or know of a conference or event that might be interested, I’d be thrilled to hear from you.

NB. This talk is from the early stages of my ongoing work on “privacy design”; more recent posts can be found on the blog.

First of all: Thank you Marc and Malte for the invitation to present here today. This is my second beyond tellerrand and I’m excited to be here.

drawing: a mobile phone on which the content is barely visible under a cookie consent banner

I want to talk about the single most annoying UI element on the web today – for users and designers alike: the “cookie banner”.

same drawing as before, but with the "Clippy" character known from old MS Office products

No matter where you go on the web: like this old friend, it’s always there for you:

“Hello, you just arrived at our carefully crafted website, but before we let you read what you are here for, we’d like to let you know that we’re tracking you. Oh, and just in case you were wondering: yes, we obviously already started tracking you, this pop-up is mainly here because we heard we need to put it in”.

Like Clippy, you can always click it away, but next time it’s back. No mercy. This is the initial user experience moment on today’s websites…

drawing: a pop-up with text 'we care sooo much about you' and an 'ok' button

The working title of this talk was “Designing away the cookie disclaimer”. But I figured that would imply I have the definitive answer to this challenge. Which I don’t. So I added a question mark at the end. Feels better now.


I am aware that the upcoming changes to privacy regulations are a rather controversial subject. While some lobbyists already condemn any tightening of EU-wide privacy regulations as doomsday for the web as we know it and for the European economy as a whole, others argue that only the strictest possible privacy law can protect human rights on the internet.

If you ask me, this is first and foremost about empowering everybody – and non-technology audiences in particular – to understand and take control of what happens with their personal data. And yes, that may hurt a little, given how accustomed a whole industry has become to the convenience of violating those rights.


Also, I am not a legal expert. Nor am I a clairvoyant, able to anticipate what direction the final ePrivacy negotiations in Brussels will take. Hence, I am neither here to give legal advice, nor to predict the future. I work as a freelance user researcher, strategist and interaction designer, so my focus today is on ethical design choices and the UX of privacy.

I call it the “user privacy experience”.


So – how did we end up in this situation where every single website is informing you that you are being tracked? Well, the roots are not as recent as they may appear: it all goes back to the EU’s so-called “Cookie Directive” (2009/136/EC) from 2009, which requires consent for collecting data – yet every member state interprets it slightly differently in its national law.

drawing: a sheet of paper with the title "2009/136/EC" and hashtag "#consent"

Some member states assume their citizens to be so tech-savvy that an “implicit opt-in” – not activating the Do Not Track flag in your browser – is sufficient; others strictly require explicit information before tracking a user.

Germany is an interesting case in this context, as the government decided that existing law was already in compliance with the EU directive, despite federal data protection authorities disagreeing. So the current situation is one of legal uncertainty – with no official guidance as to what constitutes a truly compliant mechanism of “consent”.

What is most upsetting about these current “cookie banners” is that they often do not serve any purpose for the user – who is, however, the only stakeholder constantly exposed to them. Few implementations truly give the user an easy way to avoid being tracked; most commonly, the notification element simply states that continuing to use the site implies consent to tracking – often after anywhere between one and two dozen cookies have already been set.

drawing of the "we care sooo much about you" pop-up with a shrugging emoticon below

This way, many of the current solutions provide a bad experience for pretty much everybody: privacy-conscious users are annoyed by being presented with a choice that is not really a choice, and those not knowing the risks of tracking are simply annoyed by having to click away yet another one of those pop-ups.

And they do not even necessarily stand on a solid legal basis! Cookie banners are a symptom, not the cause, of bad UX.


As of May 2018, a new ePrivacy Regulation is likely to come into effect alongside the GDPR. Its details are currently still under negotiation, but what we can expect with regard to user interaction in the UI is four-fold:

  • consent needs to be “free”; in other words, a user cannot be punished for not giving consent

  • all consent obtained needs to be “specific”; this means selective permission for each tracker or category of trackers

  • a requirement for that consent to be “informed”; this means no more of that “we store cookies to enhance your user experience” nonsense you can read everywhere today, but detailed explanations of what data is collected by whom, and for what purpose

  • a requirement for unambiguity; no pre-ticked checkboxes, no “implicitly assumed consent” – explicit means the user has to consciously do something to permit tracking

four hand-written words: "free, specific, informed, explicit"

The “privacy-by-design” and “privacy-by-default” requirements of the GDPR make the default case that of utmost privacy. “Do Not Track” is likely to become the default in any browser in the EU, unless the user changes it. And sites will have to respect that setting, under the threat of pretty steep fines.

a hand-drawn checkbox with "do not track" selected

This basically challenges the very foundations of many current practices. If you have to explain every tracking application to the user in all its juicy details, you can probably guess why recent research anticipates consent rates of 20% and less.

In a nutshell: informed, explicit consent for most forms of tracking will likely be inevitable, hence consent dialogues in the UI won’t go away. The question is: how can we still provide a great user experience, and a great privacy experience at the same time? And maybe get rid of that pointless default pop-up in the process?

hand-drawn diagram, with a "consent" circle within a bigger "UX" circle

In order to find smarter solutions, the first step is an audit of all potentially tracking elements of a website, concept, or business model:

  • what data is being collected, including by third parties?

  • does that require consent?

  • if yes, do we really need it; does the gain justify the burden of asking the user?

three hand-written bullet points: what? consent? importance?

Not unlike the idea of “progressive enhancement”, you could then aim for the lowest common denominator: a site that serves your goals while remaining fully accessible to a user with the strictest Do Not Track settings enabled. That puts you on the safe side – and should make any consent pop-ups obsolete.

Then consider which actions you need consent for, and whether it could be obtained in a classier way than with that dreaded cookie banner on arrival.

hand-drawn pyramid graph consisting of two levels: at the base "100% DNT-compatible", and on top: "+extras"

Let’s look at the two most difficult cases first.

First up, there are ads. If the business model of your site relies on third-party ads with behavioural tracking, you are pretty much out of luck when it comes to getting rid of the legal pop-up. Quite the opposite: you will likely have to be prepared to extend it to be much more specific.

There are alternatives though: once we accept that the idea of creating opaque profiles of unsuspecting web users to make money is unethical to start with, we can clear our minds to work out monetisation and advertising models that respect human beings’ privacy. Despite all the prophecies of doom: there will be ways for ethical advertising to thrive, even under the strictest ePrivacy regulation.


The other main application for tracking is web analytics. This too is tricky, but does not require an entire industry to question its foundations. Instead, we have to give up two presumptions that are surprisingly rarely challenged:

  • that Google Analytics is the only reasonable analytics platform, and

  • that every website has to collect as much data as possible.

Google has made it so easy to include their analytics beacon in a website that today more than half the web is being monitored by them – from multinational corporations to the small crochet blog of your aunt Lily. It’s free, it’s copy-paste, it shows how many users you have and where they come from. Yet “return visits” or “demographics” are already beyond what many websites really need to track. And where the needs in a commercial context are justifiably more complex, Google Analytics is simply taken for granted, few questions asked.

On the other hand, the philosophy of a minimum actionable dataset (a term coined by Chris Stacy; the German “Datensparsamkeit” is a related concept) suggests you should never collect data that you don’t have a clear purpose for. This saves you a lot of headaches: fewer management hassles, less contingency planning – and, ideally, no need for consent.


There is a broad range of other analytics tools out there. The great benefit of solutions like Piwik, for example, is that they can be run as a first-party service, eliminating third-party tracking headaches. Once you are in control of the software, it is easy to set up analytics that anonymize IP addresses by default, do not store cookies unless needed, and so on.

Another powerful way to obtain analytics is the analysis of server logs – perfectly applicable for many use cases, despite its poor reputation.
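To illustrate how far plain server logs can go, here is a minimal Python sketch – the log lines and paths are made up for the example – that counts page views from an access log in the common Apache/nginx format: no cookies, no JavaScript, no third parties involved.

```python
import re
from collections import Counter

# Regex for the common/combined log format used by Apache and nginx.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

def page_views(log_lines):
    """Count successful GET requests per path from raw access-log lines."""
    views = Counter()
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("method") == "GET" and m.group("status").startswith("2"):
            views[m.group("path")] += 1
    return views

sample = [
    '203.0.113.7 - - [06/Nov/2017:10:00:00 +0100] "GET /talks/ HTTP/1.1" 200 5120',
    '203.0.113.8 - - [06/Nov/2017:10:00:02 +0100] "GET /talks/ HTTP/1.1" 200 5120',
    '203.0.113.9 - - [06/Nov/2017:10:00:05 +0100] "GET /missing HTTP/1.1" 404 312',
]
print(page_views(sample).most_common(1))  # → [('/talks/', 2)]
```

Basic questions like “which pages are read most?” need nothing more than this – and the IP column can be discarded or truncated before the logs are even stored.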

But let’s be clear – third-party ads and analytics are the biggest challenges if we want to get rid of privacy consent modals, aka “cookie banners”. And only the final ePrivacy Regulation will allow us to assess where the limit lies of what can be done without consent.

With those two out of the way, there are other aspects that are easier to tackle.

Loading files and media from third-party sources is something we can get creative with. If the providers of these resources apply tracking, you need consent for exposing your user to them. Depending on the use case, you can work around this either by not using them at all or by asking for consent only when the user interacts with such a feature.

The EFF’s latest guidance on Do Not Track proves valuable here – the idea being that by default no resources shall be loaded from third parties unless they commit to obeying DNT. This applies to embedding media content as well as to loading script and font files from CDNs: if the provider has not made an explicit commitment to respect DNT, it’s off-limits. This is an interesting approach because browsers’ Do Not Track settings may play a significant role in future consent mechanisms under the ePrivacy rules.
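The server-side decision this implies could be sketched roughly as follows – a hedged illustration in Python, with an invented host allow-list; in reality, a provider’s commitment would be established by checking for a published DNT policy, not hard-coded.

```python
# Hypothetical allow-list of third-party hosts that have published a
# commitment to honour Do Not Track (e.g. by posting the EFF's DNT policy).
DNT_COMMITTED_HOSTS = {"fonts.example-cdn.net"}

def may_load_third_party(host, dnt_header):
    """Allow a third-party resource only if the visitor has not sent
    'DNT: 1', or the provider has committed to respecting DNT."""
    if dnt_header == "1":
        return host in DNT_COMMITTED_HOSTS
    return True

print(may_load_third_party("tracker.example.com", "1"))    # → False
print(may_load_third_party("fonts.example-cdn.net", "1"))  # → True
print(may_load_third_party("tracker.example.com", None))   # → True
```

The point of the sketch is the default: when in doubt about a provider’s stance on DNT, the resource simply is not loaded.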

hand-drawn visualisation, with a stopwatch showing 1.2 milliseconds on the left and a curtain with label "privacy" on the right

Sounds tough at first, but doesn’t this make total sense? Why would you expose your users to the mental load of a consent pop-up just to ensure they are OK with being tracked by a third party, when you could just as well work around it? Do the few milliseconds saved by loading a font file from a CDN really weigh more, in terms of “user experience”, than the dreadful experience of being stopped by a complicated legal consent form – the uncertainty of having to consent to some abstract technical contract, yet another overwhelming set of “terms and conditions” they won’t read?

And for those bits and bytes that cannot be loaded from elsewhere, ask for consent in the context where it applies.

two hand-drawn mobile phone screens: on the left the conventional cookie banner, on the right a consent dialogue embedded in the context

When embedding a video from a streaming service, only show a still image instead of the iframe, and explain the implications to the user in context. No global consent in the footer of your front page needed.

The same goes for social media sharing buttons: implement solutions that only show placeholders until the user actively loads the original, acknowledging the risk of being tracked. Again: no global consent in the footer of your front page needed.
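The placeholder pattern behind these two examples can be sketched server-side like this – a minimal Python illustration with invented URLs; a real implementation would swap in the iframe client-side once the user clicks the placeholder.

```python
import html

def render_video_embed(video_url, title, consented=False):
    """Render a consent placeholder by default; only once the user has
    explicitly opted in, emit the real third-party iframe."""
    url = html.escape(video_url, quote=True)
    safe_title = html.escape(title, quote=True)
    if consented:
        return '<iframe src="{}" title="{}"></iframe>'.format(url, safe_title)
    # Placeholder: a button that explains the tracking implication in context.
    return ('<button data-embed="{}">Load “{}” – note: the video provider '
            'may track you.</button>').format(url, safe_title)

print(render_video_embed("https://video.example/v/42", "My talk"))
```

The consent question is asked exactly where it applies – at the embed itself – instead of in a global banner on arrival.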

This list could go on and on, but my time is almost up – and these are just examples to trigger some ideas. My point is that evaluating what we are asking consent for can help mould these elements into a more consistent user experience than the default disclaimer that simply says “we use cookies and you’ll have to be fine with it”.

To conclude: As we may wonder how to improve user experience by “designing away” the cookie banner, I believe the most important lesson is to understand that this means changing the way we think about the building blocks of our websites.


When every individual’s human rights as an internet user are enforced by even stricter EU-wide regulations in the near future, we have to embrace the thought that this is about adopting an entirely new mindset. A lot of current practices on the web are intrinsically based on disrespect for people’s most personal data. By reflecting on how to avoid the need for consent in the first place, we can become better designers of more humane technology – and improve the experience of using it.

a hand-drawn graph, where the terms "privacy experience" and "user experience" merge into "user privacy experience"

I want the “privacy experience” of every citizen to be an elementary part of what is commonly understood as “user experience”. So, ultimately, this is a call for “user privacy experience” to become another elementary pillar of UX:

  • ask yourself where you are putting the user’s privacy at risk

  • ask yourself whether the same goal can be reached with a smaller – or no – privacy impact

  • ask yourself what is the optimal way to ask for informed, specific and explicit consent (if necessary at all)

and last but not least:

  • ask yourself how the user experiences the way your site is dealing with their privacy

Don’t ask yourself “does what we are doing require a cookie banner?”, but instead do something that respects the spirit of these privacy rules in the first place.

Just making cookie disclaimer pop-ups more complex to comply with a new law is the worst possible reaction in terms of user experience design. Why not use requirements such as “informed consent” as yet another design driver, giving the user the feeling that they are in control of their privacy again?

Photo by Joschi Kuphal.

I’m developing these and related thoughts on my blog, so if you are interested in these topics, feel free to follow me online – or even better: chat me up one of these days here at the conference! I’m looking forward to working together to make the web a more humane place again.

Thank you.