Dark UX That'll Make You Rethink Your Internet Habits

Dark UX patterns are an invasive species. They’ve been introduced into our internet experience quietly, with no fanfare. Their goal is to sneak through websites unnoticed, tricking you into agreeing to things you didn’t want to agree to or buying things you didn’t want to buy.

If you’ve ever noticed a dark pattern, you’ve probably met it with mild annoyance. An acknowledgment of its nefarious deeds. A note-to-self, promptly forgotten, to rant about it to a friend. And then, likely, you did what it wanted you to do anyway, because, sometimes, it’s easier.

There’s no shame in it, though. We’ve all been there. Ever since the GDPR rules were introduced, we’ve all been faced with a complicated set of cookie options we couldn’t be bothered to deal with. Click the brightest button and move on. I’ve got an article about Zeppelins to read.

So why does it matter? It doesn’t have any immediate effects on our internetting experience. We don’t face any extra charges, most of the time, and who cares if the company I’ll never visit again has access to my browsing data?

Let’s start with figuring out what dark patterns are.

Say you’re on a website, and the usual cookies banner pops up. It gives you two options: ‘accept’ and ‘show purposes’. This leaves you with the immediate assumption that the only choice you have is to accept.


That’s not the case, is it? GDPR law means every website has to give you the option of controlling what data you give away, so you dig a little deeper. 
The next choices you’re given are as follows: ‘confirm my choices’ or ‘allow all’. How confusing is that? Below is a quick replica of the example I’m talking about, with all the necessary identifiers removed.

As you can see, the use of greyed-out boxes immediately gives you a sense of what you can (and, more importantly, can’t) click on. This is a classic example of a dark UX pattern. Two, in fact: one called ‘trick questions’, for its goal of tricking you into giving answers you didn’t want to give, and the other called ‘privacy Zuckering’, lovingly named after a particular social media giant who made it into the news for eerily similar tactics.

And, the thing is, it works. When we’re in a rush, or falling down a research rabbit hole, we don’t tend to pay attention to what we’re clicking on. Even if we do notice it, most of the time we don’t even get angry with the increasingly complex design. It’s annoying in the moment, but when faced with it on most websites you visit, you become blind to the tactics. 

Like I said, I’ve got an article on Zeppelins to read.

There’s an important distinction to be drawn, though. Dark UX patterns are intentional. They’re a Trojan horse planted in your online city of Troy. They’re a sinister force hiding in the shadows, whispering in your ear. They’re design choices made to manipulate your time on the internet.

As design agency Nexer (formerly Sigma) says, “dark patterns are not at all poor design by mere negligence. They are intended to persuade and dissuade customers in ways that benefit the brand rather than the user.”

Other Dark Pattern Tactics

The Roach Motel

Have you ever signed up for a free trial of a popular video streaming service, realised you’re not particularly interested in its limited selection and confusing UX (not dark, but frustrating nevertheless), only to find yourself trapped in an infinite loop of clicking link after link? It just seems like the ‘deactivate’ button is never where it needs to be.

Users of Amazon Prime in 2018, for instance, reported having to follow an incredibly unintuitive link path, only to end their odyssey with a chat box asking them to talk to a member of staff before their deactivation would be put through. 

As the title says, this tactic is called ‘roach motel’; a situation easy to get into, but hard to get out of. 


Confirmshaming

We’ve all seen it. Your mum’s neighbour’s cat’s birthday is on the weekend, and you’re expected to get a gift. Searching Google for ‘cat gifts’ gets you to a website you’ve never used. You click, happy the internet caters to every need. Then, a pop-up.

The goal here is, so transparently, to guilt us into giving our information away. And, like all the other tactics here, it works. If a frowning face instead of a speed camera is enough to make drivers slow down, a well-placed, shaming statement is more than enough to get us to do something as easy as typing out our email address. Some of us even have it set to auto-fill.

Graphic Design 

There’s this popular mobile game. To play, you connect dots of the same colour until you have no dots left. When you sign up, the button is green. When you press play, the button is green. When you move on to the next level, you guessed it, the button is green. That’s standard design: consistency, branding.

Then something interesting happens. You aren’t able to complete the level, and you get a pop-up. This time, the green button asks you to spend coins. With so much subconscious training that the green button takes you to the next step, it’s possible you click without even registering what it’s asking you to do.

You’re probably thinking, does it matter? It might not, on a small scale. You’re an adult, and once you’ve realised that spending coins means spending real money, you can simply not do it. Right? 

Well, maybe. A lot of us have enough impulse control to move on once real money gets involved, but a lot of us don’t. Mobile games, social media apps, most places on the internet are developed with one singular goal: make money. 

Make money by keeping the user on it as long as possible. Keep the user on it as long as possible by making the user addicted to your service.


Addiction on the Internet

In the 1950s, a scientist named B.F. Skinner conducted an experiment. He put pigeons in a cage rigged up to deliver food when a lever was pulled. The experiment was set up to test the conditions under which the animal could develop an addiction.

Skinner found that when the food was delivered on every pull, or was delivered with large gaps in between, the animal would get bored and stop pulling the lever. In other words, not get addicted. However, when the pattern was random and in short bursts, the animal kept going. 

This pattern, called variable reinforcement, is the basis on which most social media platforms are built. Social media titan TikTok (and now also Instagram Reels and YouTube Shorts) uses this unpredictability to keep us swiping. It’s the same tactic used by slot machines in casinos.
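As a rough sketch (not Skinner’s actual apparatus; the gap size, action count and seed here are arbitrary), a variable-ratio reward schedule takes only a few lines of Python:

```python
import random

def variable_ratio_rewards(mean_gap, n_actions, seed=42):
    """Return the action numbers on which a reward is delivered.

    Gaps between rewards are drawn uniformly from 1..(2*mean_gap - 1),
    so they average mean_gap but are individually unpredictable; that
    unpredictability is the point of a variable schedule.
    """
    rng = random.Random(seed)
    rewards, action = [], 0
    while True:
        action += rng.randint(1, 2 * mean_gap - 1)
        if action > n_actions:
            return rewards
        rewards.append(action)

# Both schedules pay out once every 5 actions on average, but only
# the fixed one is predictable.
fixed = list(range(5, 101, 5))          # reward on every 5th action
variable = variable_ratio_rewards(5, 100)
```

Seen side by side, the fixed list is something you could learn and ignore; the variable one never tells you when the next payout lands, which is the property slot machines and feeds lean on.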

These, alongside other addiction-forming tactics like infinite feeds and autoplay, are some of the most harmful UX strategies used on the internet. Part of how they work is by removing friction: Hick’s Law, for example, says the more options available to the user, the longer it will take for them to make a choice.
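Hick’s Law is usually written as T = b * log2(n + 1) for n equally likely options. A small illustration in Python, with an arbitrary constant b standing in for a measured one:

```python
import math

def hick_decision_time(n_options, b=0.2):
    """Hick's Law: T = b * log2(n + 1).

    Decision time grows with the logarithm of the number of equally
    likely choices. b is an empirical constant (seconds per bit);
    0.2 is an illustrative value, not a measured one.
    """
    return b * math.log2(n_options + 1)

# One obvious action (a swipe) is about as frictionless as it gets.
for n in (1, 3, 7, 15):
    print(f"{n:2d} options -> {hick_decision_time(n):.2f} s")
# prints 0.20, 0.40, 0.60 and 0.80 s respectively
```

Note that doubling the options doesn’t double the decision time; but an interface with a single obvious action keeps the decision cost near zero, which is exactly where these apps want you.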

The dot-blasting game from above removed all other options. Spend coins, or don’t play. TikTok’s UI is great at doing this. There are only two feeds: ‘explore’ and ‘for you’. Both play one video at a time, have limited interaction options, and you move on by swiping. It even prompts you, on your first usage of the app, to ‘swipe for more’.

Google’s former design ethicist, Tristan Harris, said, “[a social media app isn’t] designed to help us. It’s designed to keep us hooked.” 

The sinister implication of all this is simple. Despite increasing laws, website and app creators are clever. They’re using graphic design basics and principles of psychology to manipulate us into doing what they want us to do. And they always have been. 

The Takeaway

So where do we go with this? 

Beyond legislation being put in place to monitor (and hopefully stop) this sort of thing, the onus is a personal one. If it bothers you that companies are manipulating your habits, there are ways to push back.

Take that extra 30 seconds to decline the cookies. Read before you click. Set a timer for app usage to stop yourself falling down the rabbit hole. 

Holding companies accountable for their choices, and informing others about them, is the best step forward. UX specialist Harry Brignull has been running his website Deceptive Design for years. Its aim is to catalogue dark UX patterns in one convenient, easy-to-use place.

If you come across any bad designs, make a note. Send them to Harry. Tell your friends. 

After all, nothing gets done if we don’t talk about it.