Bookmark: "Designing accessible web with privacy - when web browsing reveals information"

Sebastian Greger

Bookmarked:

In “Designing accessible web with privacy - when web browsing reveals information”, Lukasz Olejnik (in yet another spot-on analysis of privacy threats on the web) presents how current efforts to improve accessibility features in web browsers may, at the same time, make users of assistive technology more susceptible to profiling:

“Most importantly, it will simplify information access for disadvantaged people. However, the design rationale behind this piece of technology is both challenging and unique. The choices made now will define for a long time how people with disabilities will be using the web. These choices and decisions will impact the handicap disclosure process when visiting sites. I admittedly focus on the privacy aspect, but I stress that there is an important digital ethics case here.”

This is a major concern. While, as Olejnik points out, the Accessibility Object Model (AOM) and its related APIs are extremely important to ensure equal access to the web, the architecture of the web makes it difficult to shield these features from misuse.
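
To make the risk tangible, here is a minimal sketch (in TypeScript) of how a page might abuse such a capability. The event name is an assumption on my part, loosely modelled on the “accessible action” input events discussed in early AOM drafts, not a shipped API - the point is simply how little it takes to turn an accessibility feature into a profiling signal:

```typescript
// Hypothetical sketch: "accessibleclick" follows early AOM draft
// discussions and is not a standardised or shipped event name.
let atDetected = false;

document.addEventListener("accessibleclick", () => {
  if (!atDetected) {
    atDetected = true;
    // A single beacon suffices to add "uses assistive technology"
    // to this visitor's profile.
    navigator.sendBeacon("/profile", JSON.stringify({ usesAT: true }));
  }
});
```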

For the further development of the browser APIs, neither of the outlined scenarios feels right: either the features are exposed openly, in which case they can be used to identify individuals using assistive technology by default, or we run into the common “consent” UX of asking for users’ permission without them being able to fully understand the far-reaching implications.
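
The second scenario would presumably take the shape of the sketch below: the site asks, the browser prompts the user, and only then is anything exposed. The permission name is an assumption on my part (Chromium at one point experimented with a similarly named permission behind a flag), not part of any standard:

```typescript
// Hypothetical sketch of the "ask first" scenario; the permission
// name is an assumption, not part of any standard.
async function accessibilityEventsAllowed(): Promise<boolean> {
  try {
    const status = await navigator.permissions.query({
      name: "accessibility-events" as PermissionName,
    });
    return status.state === "granted";
  } catch {
    // The browser does not know this permission: fail closed.
    return false;
  }
}
```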

Plus, I want to add: in the case of the latter scenario, where browsers would ask before exposing the use of assistive technology to a website, we might end up with a two-layered opt-in. After the browser has done its part, the website itself may (depending on the use case) also have to ask for permission to process that information - constituting “data concerning health”, it may easily fall under the special categories of personal data for which explicit consent is often the only viable legal basis (Art. 9 GDPR). Two consent pop-ups, aimed at users who may already rely on complex solutions to access the web - this sounds like a designer’s nightmare.
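
On the website side, that second consent layer might, very roughly, look like the following sketch - under the assumption that the site treats any assistive-technology signal as Art. 9 data and refuses to process it without recorded explicit consent (all names here are illustrative):

```typescript
// Illustrative only: gate all processing of AT information on explicit,
// purpose-bound consent, as Art. 9 GDPR would demand for health data.
const explicitConsents = new Set<string>(); // purposes the user agreed to

function recordConsent(purpose: string): void {
  explicitConsents.add(purpose);
}

function processAssistiveTechSignal(usesAT: boolean, purpose: string): void {
  if (!explicitConsents.has(purpose)) {
    return; // no explicit consent: discard the data entirely
  }
  if (usesAT) {
    // ...adapt the experience for the consented purpose only...
  }
}
```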

The blog post does not propose a solution (though it suggests that one potential solution space could lie in exploring possible mappings of the accessibility layer before exposing the interactions). It acknowledges and comments on an ongoing debate around the AOM, and stresses that the described dilemma is an example of an easily overlooked trade-off in technology:

“The security and privacy trade-off is perhaps the one known best. But there is more to impact assessments, so often it’s actually security vs privacy vs accessibility. When rotten compromises need to be, or are, made, this does not always lead to the best designs. The case described in this post is especially challenging. It will touch many users. It concerns sensitive information. And yet, good choices need to be made.”

So, while the trade-off between security and privacy is commonly discussed (to name just one example: logging IP addresses in order to counter DoS attacks), there appears to be another one: efforts to design for privacy and efforts to design for accessibility may clash as well. It is yet another complex aspect to take into account as we create ever more complex technology.
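
For the security-vs-privacy example just mentioned, a common compromise (my own illustration, not taken from Olejnik's post) is to coarsen the address before logging it - enough signal to rate-limit abusive traffic, less precision about the individual:

```typescript
// Zero the host octet before logging: 203.0.113.42 -> 203.0.113.0
function truncateIPv4(ip: string): string {
  const parts = ip.split(".");
  if (parts.length !== 4) throw new Error("not an IPv4 address");
  return [...parts.slice(0, 3), "0"].join(".");
}

console.log(truncateIPv4("203.0.113.42")); // "203.0.113.0"
```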
