This year’s World Usability Day featured “Design for Good or Evil”, and I got the opportunity to attend the Hamburg edition of this always-inspiring global community event. With a topic so close to my heart, it was great to attend a range of sessions surrounding design ethics and meet interesting people.
Since 2005, the second Thursday in November has been World Usability Day (WUD); established by the User Experience Professionals Association, it promotes usability and user-centered design. Organized by local chapters (the German UPA in Germany; WUD Hamburg as a collaboration led by Eparo and the Hamburg University of Applied Sciences HAW), the day sees over 200 events worldwide, always under one common topic.
Under the 2018 theme, “Design for Good or Evil”, Rolf Schulte Strathaus and his team had crafted an appealing agenda. As always with dual-track events, I had choices to make, and I ultimately stuck to the talks in the main auditorium that followed a red thread of ethical design. That said, due to urgent work commitments I unfortunately had to miss two talks I would have loved to hear: “Fast Fashion by Circular Design” by Manuela Risch (video, sketchnote) and a praised talk on product design with neuro-psychological principles by Philipp Spreer (video).
An intro to ethical design
Though not formally designated as such, the first talk by Henning Fritzenwalder (video; sketchnotes) ended up being something like a thematic opening keynote for the day - maybe best illustrated by this quote:
UX means to know and protect the limitations and vulnerabilities of the human being
Outlining a general approach to questions of ethics in a design context, the presentation set the tone for the day in many ways: maybe most importantly, the speaker illustrated (through audience participation) that “right” and “wrong” are not binary - that the values every designer follows can be different and that ethics are all about discourse. Why does everybody agree that stealing is bad, but when asked about occasionally copying a piece of music that front starts to crumble?
Technology development is so fast that people cannot keep up with it.
Some technological solutions do not meet people’s needs.
Designers need means to detect problematic solutions and replace them.
Henning highlighted how technology can shape social norms - from the widespread assumption that “what is possible must be legal”, which one can witness daily on YouTube, to Mark Zuckerberg’s infamous “privacy is no longer a social norm”, or the transformation of a self-assessment technology like Fitbit into a surveillance tool for health insurers.
The talk even made the business case for design ethics: unethical design is risky for companies, bringing legal exposure, loss of trust, and a limited ability to innovate.
Henning presented a three-level approach to putting this into practice: “moral sensibility”, “moral creativity” (the ability to conceptually rethink technology in a human-friendly way), and - the most interesting one from my point of view - “moral representation” (the reframing of “usability” as something beyond the experience of individuals).
Don’t let technology do something you wouldn’t do yourself
Use KPIs that don’t just maximize for engagement and online time - seek deep metrics
Look at who gains and who loses what
Respect your user, their values and goals (be transparent with yours)
Building on Harry Brignull’s continuum from honest interfaces to dark patterns, Jonas first defined some terminology, implicitly extending the understanding of ethics as a non-binary concept that had already surfaced in the first talk. While truly honest design may - in its purest state - set aside even the designer’s own interests in order to serve the user, there is an undefined line between honest design and persuasive design where things become questionable. Needless to say, just beyond persuasive design begins the territory of evil, manipulative, dark design.
The presentation picked up the idea that the (potentially manipulative) power of the designer lies in exploiting human weaknesses - codified, for example, as the seven deadly sins in the Bible. Jonas presented these based on Chris Nodder’s “Evil by Design”, and illustrated them with real-world examples from the web. In an attempt to summarize the talk’s numerous and often surprising examples:
Using social proof to trigger people’s desire for pride (e.g. trust logos and testimonials)
Exploiting humans’ well-known tendency toward sloth by making the desired behaviour easy to achieve (e.g. leveraging the F-pattern of human reading)
Triggering behaviour through gifts and risk reduction to cater to lust (e.g. free return policies, special offers)
Defusing anger through humour or by offering ways to let off steam (e.g. friendly error messages)
Exploiting envy by designing for status (e.g. “Pro” statuses that barely mean more than a badge in the UI)
Catering to greed by providing means to acquire ever more (e.g. UIs that motivate users to reach the “next level”)
Triggering gluttony by making it easy to keep asking for more (e.g. pull-to-refresh UI patterns)
Doing design ethically (and staying away from evil design), Jonas summarized, means finding the sweet spot between honesty and persuasion. As the numerous examples had shown, that is not always a straightforward task - it is easy to cross the line, particularly when design decisions are driven not only by human needs but also by business interests.
The talk ended with suggestions for staying on the good side:
Obey established norms, such as web standards or accessibility guidelines, as these already codify honesty and the absence of misleading patterns.
Validate designs from an ethical perspective, i.e. user testing should also cover the ethical experience; in the absence of testing opportunities, even an internal heuristic such as “would I dare to let my parents use this?” can be valuable.
I believe that this combination of following good practice (to be honest, most designers have a rather good inner compass for when they are exploiting the human psyche - though not everybody understands that this is unethical) and conducting constant “ethical reviews” is a major step towards more ethical conduct in design.
Is it ok to trick users?
The day’s podium discussion, facilitated by Matthias Müller-Prove (who assembled a wide range of background material on his website), revolved around the question whether it is “ethically/morally ok to trick users”. Summarizing the entire panel (video; there is another sketchnote by Ania Groß) would be beyond the scope of this write-up, but a few points stuck with me:
An interesting exchange unfolded after the moderator highlighted how designers may use metaphors to create an illusion for users. His example of the recycling bin on a PC desktop brought to mind a related argument by philosopher Rainer Mühlhoff at the Privacy Week Berlin a few weeks earlier (video, summary): by simplifying technology in the interest of UX, there is a risk of disenfranchising users from taking full control. Rainer Sax, whose arguments I particularly enjoyed throughout the panel and who had already established that fooling users is not acceptable, qualified, however, that using a metaphor does not per se equal lying - the problems start when users are misled on purpose. Linking back to the earlier two talks, I believe this was a brilliant example of “finding the line”: even a specific means (such as a metaphor) is not good or evil in itself, but requires careful evaluation.
Rainer made another (related) point later: once a designer becomes aware of a questionable ethical consequence of their design, not acting to correct that turns it into a conscious design decision at the designer’s responsibility. I couldn’t help but immediately think of Cade Diehm’s work on “weaponized design” and the related question of how to prevent undesired outcomes of design.
Stefan Nitzsche extended the metaphor question by stating that, essentially, all information is manipulation; even “thank you” or “please” can be used manipulatively. He highlighted that the most important aspect is that users understand why something takes place. This connects well to an earlier statement by Joachim Tillessen, who pointed out that, rather than focussing on debates over obvious “superdark” patterns, it is more important to ensure transparency, completeness and the weighing of interests in design. Transparency has long been a focus of my own work on privacy design, and this was just one of several instances in the day’s debates where transparency surfaced as a key contributor to ethical design (to me at least - I may indeed be slightly biased).
Sandra Gärtner, who had already described some “dark patterns” she sees in market research (intrusive survey pop-ups, intentionally maldesigned questionnaires that yield desired results, the linking of user click paths with questionnaire responses), made a passionate statement on the limitations of analytics. She explained how assigning too much meaning to analytics data ignores the fact that, for example, users who delete cookies falsify the data. She also sees an ethical issue in the blind collection of any data available, no matter its usefulness (as is common practice in web analytics), and furthermore posed the question whether focusing on KPIs such as click rates or time spent on site does not intrinsically encourage unethical goals (such as stealing users’ time instead of serving them as quickly as possible). I found myself nodding to almost every statement she made, not least when she later expressed her wish that testing/research should start much earlier in the process.
The closing statements by the panelists are worth noting, as they were intended as prompts to help designers find their ethical position:
Rainer Sax: Let us try to be good people (for me, the spirit of this statement connects directly to Alan Cooper’s recent work on ancestor thinking)
Sandra Gärtner: One should always apply one’s own ethical principles in one’s work (NB: this was also the first principle in Henning Fritzenwalder’s talk)
Joachim Tillessen: The question of “what is the true interest of the user” challenges us as designers to take position
Stefan Nitzsche: There is a conflict between two views of humankind: is it the designer’s responsibility to protect people from themselves or is it up to every adult themselves to assess their decisions? (I shall add that this is a much-debated question not only in design, but also in the privacy field.)
Privacy as UX?
Having been invited to present on “Privacy as UX” (an almost complete rewrite of my much more web-specific talk at the Webkongress Erlangen in September) and assigned the closing slot at 4:30pm, I felt the urge to use that opportunity to tie together some of the debates from the day - hence, I sprinkled some of the above-mentioned observations into my prepared talk.
In my presentation (video, slides and references), I focused on establishing how people experience the use of their data emotionally, how understanding data processing represents a cognitive load, and how users need the right tools to express their preferences. As a means to approach these highly complex questions, I suggested three strategies:
Data consciousness: deliberately not limited to pure “data minimalism” (which, as I illustrated, remains a key strategy nonetheless), I see a need for every data point used in a technology to undergo scrutiny of its value versus the “cost” it incurs for the user (e.g. in terms of emotional/cognitive load, or the effort of exercising control).
Transparency: connecting this part to the earlier presentations of the day, I used my ongoing explorations of the GDPR’s transparency requirements to outline how important transparency is for ensuring ethical conduct, but also how it brings enormous designerly challenges at the same time.
Processes: since gluing privacy concerns onto a finished product will at best yield poor results (and often such efforts fail outright), privacy and data protection should be understood not just as questions of legal compliance or security engineering, but as a core aspect of every design process. They should be part of all UX/CX, usability, user research/testing, co-design and accessibility efforts.
My summarized conclusion was that thinking of privacy as part of the user experience manifests itself in three ways:
as a possible solution to contemporary design challenges,
as the responsibility of designers for not creating unethical technologies,
I greatly enjoyed the discussions after the talk, which brought both encouraging feedback and constructive suggestions, and I hope I was able to share some of the sense of closure I felt myself by channelling an inspiring day’s input into the rather specific (and, from what I overheard, to some even repellent) topic of “privacy”.
Summary: An inspiring, enjoyable day
As I hopped on a bikeshare bicycle to let the wind blow through my freshly filled brain on the way to Hamburg’s central station, I looked back on a thought-provoking and pleasant day. I met a bunch of friendly and inspiring people, and following the presentations and discussions helped me once again add new thoughts to my ongoing work on the practice of ethical design.
Thanks and praise go to the brilliant speakers I had the joy of listening to, and to the dedicated team of organisers, who cannot be praised enough for running such a smooth event with 700 participants (no registration required, so lots of wildcards in that number) with maybe a dozen staff. Bonus points for contracting a caterer that provides opportunities to young people who have struggled with the educational system!
- All paraphrased by me, as the talk was in German ↩
- I have seen these before, but have not been able to locate a canonical source (e.g. a ThoughtWorks whitepaper or the like) so far. ↩
- This is one of the key points of the Agile Manifesto ↩
- A deep understanding of users, for example, is almost a precondition for fulfilling current legal compliance requirements, which puts designers in a much more crucial role in many processes. ↩
- The “GDPR hangover” from May still seems to be doing damage among the design community. I’ll stick to my mission to change that. ↩