Dark patterns are deceptive user experiences that exploit people’s habits and expectations when using websites and apps to trick them into doing something they did not intend to do.

Dr Harry Brignull, a UX designer, coined the term on July 28, 2010, with the registration of the domain darkpatterns.org (now deceptive.design), a site devoted to naming and shaming deceptive user interfaces.

Brignull originally identified 12 categories of dark patterns, including:

  • Friend spam happens when a service asks for your contacts under seemingly good pretenses but then sends them spam, claiming it’s from you
  • Forced continuity occurs when your free trial silently converts into a paid subscription, and cancelling is made deliberately difficult
  • Disguised ads, which blend in with other kinds of content or navigation on the page so that you click on them
  • Confirmshaming, which guilts the user into opting in to something by wording the decline option in a way that shames them into compliance
  • Bait-and-switch, where you set out to do one thing expecting a given outcome, but a different and undesirable thing happens instead
  • Hidden costs, which make something appear cheaper, only for additional charges to be revealed at the last step of the checkout process
  • Roach motels, which are designed to make it very easy to get into a certain situation, such as a subscription, but hard to get out of it
  • Privacy zuckering, named after Meta CEO Mark Zuckerberg, which tricks you into publicly sharing more information about yourself than you intended
  • Misdirection, which deliberately focuses your attention on one thing to distract you from something else
  • Price comparison prevention, where a retailer makes it hard to compare the prices of items so you cannot make an informed decision – thankfully, this one is now less common
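To make one of these patterns concrete, the “hidden costs” pattern above can be sketched in code. This is a hypothetical illustration (the item, fee names, and amounts are invented, not any real retailer’s logic): the total shown throughout the shopping flow omits fees that only appear at the final checkout step.

```typescript
interface CartItem {
  name: string;
  price: number;
}

// The total the shopper sees while browsing and throughout checkout.
function advertisedTotal(items: CartItem[]): number {
  return items.reduce((sum, item) => sum + item.price, 0);
}

// The total actually charged: fees are only added at the last step.
function finalTotal(items: CartItem[]): number {
  const serviceFee = 4.99;   // never shown on the product page
  const handlingFee = 2.50;  // revealed only after payment details are entered
  return advertisedTotal(items) + serviceFee + handlingFee;
}

const cart: CartItem[] = [{ name: "Concert ticket", price: 50 }];
console.log(advertisedTotal(cart)); // 50
console.log(finalTotal(cart));      // 57.49
```

The deception lies in the gap between the two functions: an honest flow would surface all mandatory fees in the advertised total from the start.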

Each of the above practices has since been recognized by various legislative and regulatory bodies. For example, the European Data Protection Board published dedicated guidelines on deceptive design patterns in social media platform interfaces.
