We were told not to make users think. We succeeded.

Tracy Brown
4 min read · Nov 6, 2023

We appear to be in the throes of an epidemic of online and offline hate, and when we follow the breadcrumbs, they repeatedly lead us to social media and malevolent algorithms. The warnings are becoming more difficult to ignore. In November 2023, UN Human Rights Chief Volker Türk felt the need to issue a stark warning about the sharp rise in hatred globally, currently provoked by the conflict in Gaza but massively fuelled by online hatred and the echo chambers of misinformation and wilful ignorance.

I can attest that the GeoCities generation who originated social media never genuinely imagined it would come to this. We were excited about the democratisation of opinion and information. We were enthralled by the ability to rebuild networks of people we knew from the past while forging new ones with people we hoped would be part of our futures; people from around the world who shared our interests and values. If we had known that algorithms would be purposefully designed to reward hatred and exploit vulnerability, would we have taken the same path? Maybe we wouldn’t have believed the warning, particularly as there are always unintended consequences when you create something new, and trying to pre-empt them all is surely the path to madness. We can test new products and services, but we can’t meaningfully test the multiple potential societal outcomes of every new idea. At some point, you just have to learn by doing.

So, WTF happened?

At the beginning of 2021 I was listening to a Ways to Change the World podcast interview with Timothy Snyder, an author and historian who focuses on tyranny. He made some interesting points about how humans are now being pushed into thinking like machines, which doesn’t sit well with the way we are supposed to relate to one another. In essence, we are communicating via interfaces that force unnatural and destructive behaviours that somehow miss their mark.

I’ve certainly noticed that the empathetic filters of context and intent are not being used as much as they once were. These are the ‘offline’ tools humans have always used to understand each other: weighing whether someone intends harm, and the role context plays in the motive and meaning of their words at the time and place they are said. If someone tweeted something flippant and ignorant in 2012, does it really mean their benevolent actions in 2023 don’t count? Sometimes yes, but more often no.

What about the intersectionality of people? Demographics are a demonstrably inhuman way to judge one another. Using gender, age, ethnicity, nationality, socio-economic group and so on to determine how people should behave or be spoken to doesn’t work in an equitable community. In real life, humans are a combination of demographic and personality intersections, layered with a lifetime of experiences, that even the smartest data scientists and martech platforms struggle to respond to technologically. People who are truly interested in understanding others take on the complexity of that intersectionality, are interested in intent, and are prepared to take context into account when judging words and actions.

The role of UX

If what Timothy Snyder believes is true, we — the digital architects of online experiences — also have the potential to be a malevolent force. It’s easy to blame algorithms and AI and tech billionaires, because who wouldn’t dislike entities that can profit so enormously and purposefully from their (often detrimental) control over us? But in the spirit of self-reflection, we must think about the role of UX. In 2000, Steve Krug wrote Don’t Make Me Think: A Common Sense Approach to Web Usability. Although Jakob Nielsen is often seen as the king of usability, surely Steve Krug represents the spirit of it. Sparing users hard work and making experiences as intuitive and efficient as possible became what all designers strove to do. As a consequence, no user has the patience to navigate complex experiences anymore.

None of us want to make things harder than they should be, but a question I have been asking myself is this: how has this simplification of online interactions impacted people’s ability to navigate the complexity of real life? It has certainly made people less patient when queuing offline, but what about the patience to learn about something complex, like the many real-life scenarios in which different perspectives can be equally true? How has it led people to believe that they can have the answer to a geopolitical tragedy founded in millennia of historical complexity by simply viewing a couple of reels?

WTF now?

The path ahead involves the painful withdrawal from our addiction to over-simplification. That doesn’t mean making products and services harder to use; just kinder, more balanced and less exploitative. It means that — as designers — we have to recognise that our sole objective cannot be to stop people from thinking, or to inspire behaviours that are a detriment to their cognition, or we will surely continue to reap what we sow. We have been creating voracious and unthinking online appetites, but at what cost to them; at what cost to all of us?

The solution lies in spending even more time honing our critical and systems-thinking skills if we are to create beneficial experiences that positively connect diverse users with complex needs, feelings and thoughts. We are going to have to think about cause and effect beyond hits and clicks and likes. In a world in which algorithms are slowly devouring our humanity, we cannot simply watch it happen when we have the skills to stop it.


Tracy Brown

Experience strategist and author, using insights about human behaviour to fix broken experiences for customers and employees.