We were told not to make users think. We succeeded.
Unintended consequences are practically a given when you create something new. You find a need, develop a tool that meets it, test it and push it live while hoping for the best. If a problem arises with a product, an unintended consequence you didn’t consider, you are prepared to take it down and fix it. However, when technology provokes societal change with unintended consequences, it’s not as easy to fix. Possible, but not as easy.
We are in the throes of a growing demonisation of social media, particularly Facebook (I can’t pretend that I don’t hope for its declining influence every day). But I doubt any of the GeoCities generation who originated social media genuinely thought it would come to this. We were excited about the democratisation of opinion and information. We were enthralled by the ability to create networks of people we knew from the past while forging new ones with people we hoped would be a part of our futures; people from around the world with whom we shared interests and values. If we had known that algorithms would be purposefully designed to reward hatred and exploit vulnerability, would we have taken the same path? Maybe we wouldn’t have believed the warning.
Humans thinking like machines
At the beginning of 2021 I was listening to a Ways to Change the World podcast interview with Timothy Snyder, an author and historian whose work focuses on tyranny. He made some interesting points about how humans are now being pushed into thinking like machines, which doesn’t sit well with the way we are supposed to relate to one another. In essence, we are communicating via interfaces that force unnatural and destructive behaviours that somehow miss their mark. If this is true, then we, the digital architects of online experiences, all have the potential to be a malevolent force. It’s easy to blame Mark Zuckerberg and other tech billionaires, because who could feel empathy for the highly privileged people who profit so enormously and purposefully from their (often detrimental) control over us? But Zuckerberg didn’t start the fire, and we have been the ones fuelling it.
The role of UX
So, the spirit of self-reflection brought me to the role of UX. In 2000, Steve Krug wrote Don’t Make Me Think: A Common Sense Approach to Web Usability. Although Jakob Nielsen is often seen as the king of usability, surely Steve Krug represents the spirit of it. Making experiences as intuitive and efficient as possible, never forcing users to work hard, became what all designers strove to do. As a consequence, users have lost the patience to navigate complex experiences. None of us want to make things harder than they should be, but an open question I ask myself is: how has this simplification of online interactions affected people’s ability to appreciate the complexity of real interactions with other humans? It has certainly made people less patient when queuing offline, but what about their patience to learn about something complex, like a different perspective?
The missing tools of understanding
When we talk about the commercialisation of hatred and ‘cancel culture’, the empathetic tools of context and intent are clearly not being used. These are the ‘offline’ tools that humans have always used to understand each other: analysing whether someone intends harm, and the role context plays in the motive and meaning of their words at the time and place they are said. If someone tweeted something flippant and ignorant in 2012, does it really mean their benevolent actions in 2021 don’t count? Sometimes yes, but more often no.
What about the intersectionality of people? Demographics are a demonstrably inhuman way to judge one another. Using gender, age, ethnicity, nationality, socio-economic group and so on to determine how people should behave or be spoken to doesn’t work in an equitable community. In real life, humans are a combination of demographic and personality intersections that even the smartest data scientists and martech platforms struggle to respond to properly with their technology. People who are truly interested in understanding others take on the complexity of that intersectionality, are interested in intent, and are prepared to take context into account when judging words and actions.
Valuing complexity and learning
So, how do we make our experiences more capable of allowing for human complexity, instead of actively simplifying the audience we are designing for and pushing them into binary thinking and a dislike of the unfamiliar? It starts with accepting that simplified information is not knowledge, and knowledge is not necessarily wisdom… and we could all do with being a little wiser right now. While simplifying online experiences is still crucial, what if, instead of exclusively not making people think when they navigate their online world, we also made it easier for them to think by accommodating their complexity?
I’m betting that even that slight change in intent might just move us all a little further in the right direction.