So many interesting reads and podcasts this week that it's hard to keep this list at a reasonable length. Here are some of the thoughts I found worth sharing these past seven days:
Rian van der Merwe's critical take on "growth hacking" as unethical practice:
Instead of evaluating features and ideas based on the concept of “growth at any cost”, let’s ask the question “growth at what cost?” to guide our decisions more. Let’s find ways to grow that respect users and their intentions, not subvert them.
Tatiana Mac, on herself as a canary in the coal mine of white supremacy in tech:
As product creators, it is our responsibility to protect the safety of our users by stopping those that intend to or already cause them harm. Better yet, we ought to think of this before we build the platforms to prevent this in the first place.
In tech, I feel I am a canary in a coal mine. I have sung my song to warn the miners of the toxicity. My sensitivity to it is heightened, because of my existence.
But the miners look at me and tell me that my lived experience is false. It does not align with their narrative as humans. They don’t understand why I sing.
Jeffrey Zeldman on Twitter:
Disabled people aren’t the edge case. Tech people are the edge case. We represent a small portion of the population and we’re in charge of designing and building. Too often we design for people exactly like us.
Matthias Ott highlights that accessibility needs to be baked into projects intentionally, otherwise it will be ignored:
If an inaccessible site gets built, then obviously nobody thought of accessibility as being important enough to raise their voice, at no step of the process. But then again, nobody builds inaccessible websites on purpose. And that’s exactly why it is so important to never stop advocating for accessibility. Every day. Persistently and passionately. Because it still can work if we keep making noise.
A research paper (paywalled) by Nora A Draper and Joseph Turow introduces the concept of "Digital Resignation":
One of the puzzles in the face of pessimistic views of social dynamics is to figure out ways to disrupt the rhetorical strategies, tactics, and technical tools that industries use to atomize groups and convince the public about the inevitability of data use while they remain reassuring regarding the consequences of these practices. [...] Failure to recognize digital resignation, therefore, allows to go uncontested arguments that people are willingly and knowingly consenting to take part in technical systems that harvest their personal information.
On the CBC's Spark podcast, Lauren Cagle elaborates on the ethics of what she calls "strangershots", the practice of posting photographs of strangers online:
Surveillance isn't just something that happens in a sort of top-down creepy Big Brother kind of way. It's also something we do in everyday life.
Every time that we engage in this behaviour, whether we're the ones taking the strangershot and posting it online, or whether we're the person commenting, liking, sharing or retweeting, what we're doing is we're classifying new norms around what we should do — without taking a moment to step back and question if this behavior is leading to the kind of world that we want to live in.