Is it a Dark Pattern or is it just me?

It's Friday. It's been a long week at the end of a seemingly endless project sprint, and you are weary from demos and catching production bugs to hotfix on the fly. Scrolling through your emails, you see a new message pop up in your inbox. "Thank you for your purchase" says Company X, and you feel your heart sink impossibly far down.

"Didn't I cancel that trial?" you ponder (with additional expletives) as you access their site.

It turns out perhaps you had started cancelling but been distracted, or upsold to, or been unable to locate the final "confirm" button, or not even been aware there was a final "confirm" button. Perhaps you had put a reminder to cancel in your diary but the hubbub of life had intervened and you hadn't checked or caught it in time. Or perhaps you weren't even aware that it was going to automatically renew because you hadn't read or fully engaged with the reams of small print accompanying the sign-up process.

Joke's on you, eh? The proud owner of an annual subscription to that app you only opened twice and then forgot about.

Don't beat yourself up too much - we've all been there. More often than not you have fallen victim to what is called a UI "Dark Pattern", and the cognitive manipulation at play here would be remarkable if it weren't so downright ugly. But what can we do about it?

What is a "Dark Pattern"?

Normally we think of bad design as consisting of laziness, mistakes, or school-boy errors. We refer to these sorts of design patterns as Antipatterns. However, there’s another kind of bad design pattern, one that’s been crafted with great attention to detail, and a solid understanding of human psychology, to trick users into doing things they wouldn’t otherwise have done. This is the dark side of design, and since these kind of design patterns don’t have a name, I’m proposing we start calling them Dark Patterns.

Source: Harry Brignull, 2010. Dark Patterns: dirty tricks designers use to make people do stuff | 90 Percent Of Everything

Dark Patterns tap into a number of our (often subconscious) cognitive hooks and range from the straightforward but annoying "clickbait" (see the title of this article 😉) right through to sneaking items into your basket because you didn't check the right boxes in the neverending small print.

What sort of patterns do we fall victim to?

Here are just a few:

  • Bait and switch
    The UI indicates one thing, but you're actually about to unknowingly trigger a completely different process. Thought you were declining cookies? Nope, you're actually now in a process to approve or reject every tracker on the site and they're going to judge you heavily for using an adblocker to boot.
  • Confirmshaming
    Want to feel faintly guilty for declining a company's newsletter? Forced to click: "No, I don't want to be awesome."
  • Roach motel
    Easy to start a subscription, but not so easy to get out... Turns out you have to ring them to cancel. On a premium number. On the third Tuesday of every month between 10:41 and 10:42.
  • Forced disclosure
    Why wouldn't we need your postal address to download this PDF? Or your telephone number to order a digital product?
  • Road block
    You've clicked on an article link - but first you need to accept all of our cookies. And how about our newsletter? No? Then maybe you should turn your adblocker off. And are you sure you couldn't be tempted by our most recent promotional offer? Okay, here's the article. P.S. 80% of it is behind a paywall.

Source: Dark Patterns, (2020). Types of Dark Pattern

As an internet user, you'll find them endlessly frustrating whether or not you are aware of them, and they constantly erode your focus and trust. Unfortunately, they are also exceedingly effective, which makes them rather profitable, and the internet offers increasingly creative ways to play on our cognitive needs and biases.

However, the impact of using Dark Patterns can range beyond the merely "annoying" and be quite damaging depending upon the situation. The damage they do ranges from unexpected financial losses, to compromising the integrity of your personal data[1], to exacerbating poor mental health through the careful application of stress and anxiety[2].

At their heart, Dark Patterns prioritise business needs at the expense of the wellbeing of the user. But what sort of company could be that ruthless, and what responsibility should we, as developers, shoulder...?

What are the costs of this for a company?

There have been times in my career where the implementation of business requirements has left behind a bad taste. In the early days of being a junior developer (long before the advent of GDPR), a client requested that data be captured from forms regardless of whether they were submitted or not, so that contact details could be used as sales leads. Discussion ensued around the feature request and where it fell on the legality scale, and ultimately we pushed back to the client.
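To make concrete what that request actually entailed: a minimal sketch of how such "capture before submit" harvesting typically works. All names here (the function, the `/api/leads` endpoint) are illustrative assumptions, not from any real codebase.

```javascript
// Hypothetical sketch of the "capture form data before submission" pattern
// described above. Function and endpoint names are illustrative only.

// Keep only the fields the visitor has actually filled in so far.
function partialFormPayload(fields) {
  return Object.fromEntries(
    Object.entries(fields).filter(([, value]) => value && value.trim() !== "")
  );
}

// In a browser this would be wired up roughly like so: every keystroke
// updates a draft, and the draft is shipped off when the visitor leaves,
// whether or not they ever pressed "Submit".
//
//   const draft = {};
//   form.addEventListener("input", (e) => { draft[e.target.name] = e.target.value; });
//   window.addEventListener("pagehide", () => {
//     navigator.sendBeacon("/api/leads", JSON.stringify(partialFormPayload(draft)));
//   });
```

Nothing in the browser prevents this: any value typed into a field is readable by the page's scripts long before a submit event fires, which is exactly why the legality and ethics of the request were worth questioning.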

In that case, it was the right outcome; but I always felt that the decision came from a place of fear rather than from strong ethical boundaries - and this culture of playing fast and loose with user data and manipulating user intention pervades the digital industry as we know it today. You need only look to the likes of Cambridge Analytica to be aware that, as a user, your data is more valuable than your experience.

Furthermore, even when a massive company such as Facebook does improve the user experience of its products, this is not always for the benefit of the user. Aral Balkan uses a fitting analogy:

[experience improvements are] the massages we give Kobe beef: they're not for the benefit of the cow but to make the cow a better product. In this analogy, you are the cow.

Source: Aral Balkan, Founder of Small Technology Foundation

Unfortunately, the web has become a low-trust environment for many users after years of being manipulated and taken advantage of, and this low level of trust shapes the way that we use the internet - we expect gotchas. But that certainly doesn't mean we should have to.

In recent years, there have been systematic steps in the right direction. The advent of GDPR[3] has been transformational in forcing agencies, developers and clients to actually ask questions like: where will we store this data? How will it be accessed? Do we actually need it? This is not a silver bullet, but it is at least shining a light into dusty data corners that were ripe for exploitation.

Furthermore, in 2019 the Competition and Markets Authority cracked down on hotel booking sites, calling on them to discard certain sketchy patterns that had permeated the industry for years, such as:

  • manipulating search results based on the amount of commission a hotel pays the website
  • giving inaccurate impressions of the availability or popularity of a hotel in order to create a sense of urgency in the customer
  • promoting deals regardless of whether they are actually available at that time
  • hiding compulsory charges such as taxes, booking or resort fees from the headline price to make it look more competitive

Source: Competition and Markets Authority, (2019). Hotel booking sites to make major changes after CMA probe.

One of the emerging qualities of the internet is that it creates a level playing field between individuals and entities, and this has shifted the dynamic between corporations and customers. Gone are the days of the silent disgruntled customer; in their place, the internet has empowered consumers to band together and pool their power; given them a voice to share honest (and sometimes damning) reviews of products and services; and surfaced more choice for every option than any sane person could ever possibly consider. From toasters to train journeys, there is very little the internet cannot first source and then furnish you with other people's opinions on.

This means that the stakes for companies who engage with Dark Patterns in a bid to manipulate their customers get increasingly high. Aside from the financial costs of handling complaints, returns and refunds and, in the worst cases, legal challenges, they also have to contend with the slippery concept of reputation. Where once a company's reputation was predominantly under its own control and could, through the lens of marketing and media stories, be curated from the safety of a distant tower, the interconnected world means that a company's reputation is part of an ongoing online conversation, weighed not just on its products and services but also on its values, its decisions, and the ends to which it applies its corporate power. With the rise of choice and consumer power comes an appetite for brands to be accountable for their impact on humanity, not just their customers, and the cost of non-compliance is their reputation.[4]

What does this mean for you?

"But," I hear you say, "I just work here. I'm just a small cog in the chaotic hyperspace that is the internet."

Or something like that, anyway.

For every person that holds a role in creating and maintaining bits of the internet, the aforementioned "low-trust web" is a backdrop for everything we create or achieve. Our achievements are only as good as the ecosystem within which we exist. The best intentions and technical implementations in the world are not going to erase decades of damage caused by manipulative UIs or the dangers of privacy breaches or the impact of poorly-conceived, biased data algorithms.

The nature of how we work and the cultures that we work in have a direct correlation with what we output, not just in terms of technical quality but also in terms of how company values are reflected in the project lifecycle. A company that values a diverse audience and a stable product, and is open and inclusive in its structure and processes, will prioritise accessibility, testing and iterative design over quick wins and deadlines forced regardless of the cost to the individuals involved. As a result, it will produce quality applications that don't need to rely on shady manipulative tactics to bury customers: the respect shown to individuals within the project lifecycle will be extended to the end users.

In his excellent article criticising the use of tracking pixels in the email client Superhuman[5], Mike Davidson references the concept of an "ethical trajectory" to describe how a company tracks a path through ethical decision-making, and how one sketchy decision can inform a chain of increasingly dark patterns. He postulates that when any company employee is called upon to make a decision on an ethical matter, the most important thing they look for is precedent: how does this match up with other company decisions? If the decision is later held to be wrong, what defence can protect the employee; what can they point to in order to say "because of X, I decided to do Y"? If previous decisions were only tenuously ethical, the door opens for more morally ambiguous decision-making, and often this ambiguity is passed on to the end user. In the case of Superhuman, this means encouraging the tracking of email recipients, often without their knowledge or consent.
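For readers unfamiliar with how a tracking pixel works: the email includes a tiny, invisible image whose URL is unique per recipient. When the recipient's mail client fetches that image, the sender's server logs the request and learns when (and, via the IP address, roughly where) the message was opened. A minimal sketch, with a hypothetical endpoint and id, not Superhuman's actual implementation:

```javascript
// Hypothetical sketch of an email tracking pixel. The endpoint and id are
// illustrative; real implementations log each request to the unique URL,
// recording open time, IP address and user agent.

function trackingPixelHtml(baseUrl, recipientId) {
  // A 1x1 invisible image; fetching it tells the server this message was opened.
  const src = `${baseUrl}/open.gif?id=${encodeURIComponent(recipientId)}`;
  return `<img src="${src}" width="1" height="1" alt="" style="display:none">`;
}
```

Blocking remote images in your mail client defeats this, which is part of why many clients now proxy or disable remote images by default - and why a default-on tracker, invisible to the recipient, sits so far down the ethical trajectory.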

In other professions, such as engineering or surveying, there is a core, recognisable set of ethics that governs individuals and the larger companies they are a part of. Having an agreed set of industry ethics means that decisions do not have to rely on precedent or individual accountability. Web development sits at an overlap of artificial intelligence, company operations, marketing, advertising, design and information gathering; governing all of these areas simultaneously is certainly a challenge, but not an impossible one. Although there need to be deeper-seated systemic changes, there also needs to be an appetite for them at the level of the individual web developer.

The movement towards application accessibility is a push in the right direction, prioritising the most vulnerable users and normalising these design and coding practices. To be most effective, accessibility patterns need to be part of the core decisions when planning and implementing business requirements, rather than an additional workload on top of the project lifecycle.

Furthermore, the rise of open source software means that transparency is becoming easier, and with transparency comes accountability. There is a name publicly associated with a pull request or code block or technical choice - it might even be your name - so are we seeing a shift in the right direction? Can it carry over culturally to closed-source corporate projects? Or is it simply icing on a cake with a rotten core?

At the moment, the choice falls to us - the implementors - to understand what is being asked and how it might fundamentally influence the Overton Window of what is acceptable or extreme in web development ethics.


[1] Paternoster, L. (2018) Getting round GDPR with dark patterns. A case study: Techradar.

[2] Boag, P. (2020) How to Encourage Clicks Without Shady Tricks. Smashing Magazine AG.

[3] Lawful basis for processing: Consent.

[4] Falbe, T., Andersen, K., Frederiksen, M.M. (2017) White Hat UX. Smashing Magazine AG.

[5] Davidson, M. (2019) Superhuman is Spying on You. Mike Industries.

Laura Weatherhead

Laura is a full-stack developer at Spun and works on a contract basis out of Canterbury, UK. She has been working with Umbraco since 2012, and is a Candid Contributions podcast host and an Umbraco MVP. When she's not being challenged by her computer she is an occasional amateur acrobat and infrequent artist.
