Since 2005, World Usability Day has promoted the idea that services and products should be accessible, inclusive, and simple to use. This year, on November 8, participating communities and events will focus on the theme "UX Design for Good or Evil."
That may sound lofty, even melodramatic. Can user experience design really be evil?
The website Dark Patterns illustrates that, yes, UX can be dishonest and malicious. As site creator Harry Brignull explains, dark patterns are design features that trick users into doing something they might not want to do, to the benefit of the business behind the tech.
When a service makes it difficult to unsubscribe from its mailing list, or gets you to click on something that looks like a button but is really an ad, that’s a dark pattern. See many, many examples in the website’s Hall of Shame.
A common thread across these examples is that users are misled or automatically opted in to features they haven’t clearly agreed to. Consensual Software is an open source project that advocates for explicit consent and privacy as the default.
Consensual software, the website explains, “respects users’ privacy and does not trick or coerce users into giving away permissions or data.” It’s built on the idea that tech should respect boundaries and be honest and ethical.
Danielle Leong is one of the creators of Consensual Software and the engineering manager for GitHub’s Community & Safety team. She recently tweeted:
My team asks, “How can this feature be used to hurt someone?” It’s incredibly important that we don’t just build features for the sake of disruption. We must build responsibly, making sure to think through abuse vectors and how real humans are impacted by tech.
Proactively avoiding harm is an important part of building software for good, not evil. You may create something with good intentions, but not realize how features could be used in negative and abusive ways.
For example, a social network that automatically includes users’ location in posts could put someone in danger from an ex-partner or stalker. And facial recognition technology could be used by law enforcement in ways that invade privacy and reinforce discrimination.
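To make "privacy as the default" concrete, here is a minimal sketch of the location example above. All names here are hypothetical, not taken from any particular product: the idea is simply that location is attached to a post only when the user has explicitly opted in, rather than being shared automatically.

```typescript
// Hypothetical sketch: location is attached to a post only when the user
// has explicitly opted in; the safe default is to share nothing.
interface UserSettings {
  shareLocation: boolean; // false until the user clearly consents
}

interface Post {
  text: string;
  location?: { lat: number; lon: number };
}

function createPost(
  text: string,
  settings: UserSettings,
  currentLocation?: { lat: number; lon: number }
): Post {
  const post: Post = { text };
  // Only include location when the user has opted in AND a location exists.
  if (settings.shareLocation && currentLocation) {
    post.location = currentLocation;
  }
  return post;
}

// New accounts start with the most private settings.
const defaultSettings: UserSettings = { shareLocation: false };
```

The design choice that matters here is the default: a user who never touches their settings never has their location exposed, which closes off the abuse vector instead of relying on users to discover and disable a risky feature.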
“Identifying Abuse Vectors” is a tutorial created by Terian Koscik to help technologists understand how software features can be used maliciously. The tutorial invites readers to identify ways their product could be exploited for evil, and walks them through strategies to prevent abuse.
Wondering how you can be more intentional about designing for good, not evil? Find an event near you via the World Usability Day website! If you’re in the Twin Cities, you can attend free World Usability Day events at the University of Minnesota and Target.