What are Dark UX Patterns?
Dark patterns are design elements intentionally crafted to trick and manipulate people into taking actions they didn't intend to.
You'll come across them almost every day.
They use psychology to sidestep natural defences like reactance (the resistance people feel when they perceive that their freedom of choice is being restricted).
In this article, we'll shine a light on dark patterns, providing examples and insights on how to steer clear of them, and what to do instead to get long-lasting results.
Why might people use Dark UX Patterns?
They work. At least in the short term. And many people focus on quick metrics to show success.
Boosting this month's turnover might look great, but what if these tactics backfire? Massive fines, bad reviews and mistrust, with newly gained customers flocking to the competition because they no longer trust you.
In the long term, these tactics can cost a lot, so we advise being aware of them and avoiding them.
Examples of Dark Patterns in UX
Misdirection | Rating: 3/5
Using visual hierarchy, styling, and design elements to lead users away from their intended choices or towards choices that benefit the business.
Users cannot select cheaper pricing plans when signing up
Must first sign up for the expensive free trial, then search for a downgrade option
Deliberately obscures more affordable options
Hidden Costs | Rating: 4/5
Concealing or obfuscating additional charges until users are emotionally invested in the purchase process.
Example: "GameStop sued for adding hidden charges during checkout"
Users selected "FREE Shipping" but were charged anyway
Class action lawsuit seeking $5M in damages
Hidden charges revealed only at final stages
Bait-and-Switch | Rating: 5/5
Advertising one thing but delivering another, often after significant user investment.
Example: "ChatGPT forces users to compromise on privacy by linking it to reduced functionality"
Free tier shows responses but requires Plus subscription to continue
Privacy opt-outs unnecessarily linked to reduced functionality
More flexible control hidden and difficult to find
Roach Motel | Rating: 5/5
Making it extremely easy to get into something but deliberately difficult to get out.
Example: "Apollo makes its hard to cancel it's subscription by redirecting it to an unresponsive chatbot"
Cancel button triggers unresponsive bot chat
No direct way to cancel subscription
Support deliberately unresponsive
Forced Continuity | Rating: 5/5
Making service cancellation deliberately complex while auto-renewal is seamless.
Example: "Shutterstock deploys recurring annual plans or termination with massive penalty"
Users unknowingly signed up for annual subscription through dark patterns in checkout
Charged $60 penalty to cancel even if service unused for months
No clear indication during signup of annual commitment
Renewal terms buried in fine print
Privacy Zuckering | Rating: 5/5
Making privacy settings intentionally complex or unclear.
Example: "Meta pushes user to 'consent' to unrestricted use of their data"
New consent wall undermines EU Court ruling
Pushes unrestricted data use through confusing interface
Complex opt-out process
Disguised Ads | Rating: 4/5
Making advertisements indistinguishable from regular content.
Example: "X rolls out new ad format that can't be reported, blocked"
Ads not connected to actual X accounts
No disclosure that they are advertisements
Cannot be blocked or reported
Sneak into Basket | Rating: 4/5
Adding items or services to shopping carts without explicit user consent.
Example: "Bundle Hunt adding hidden costs during checkout"
Additional items automatically added to cart
No clear notification of additions
Difficult to remove items
Friend Spam | Rating: 3/5
Leveraging users' social connections without clear consent.
Forces contact sharing prompt when opening app
No visible way to decline sharing contacts
Obscures the ability to say "no"
Makes it appear mandatory for app functionality
Confirm-shaming | Rating: 3/5
Using guilt or social pressure to manipulate user choices.
Example: "IndiGo manipulating emotions of users when booking flights to opt for travel insurance"
Uses emotionally manipulative language
Shames users who don't select insurance
Creates artificial fear of risk
Hidden Opt-Out | Rating: 4/5
Obscuring or complicating ways to decline optional features or sharing.
Example: "Samsung tricks users to accept all its new terms and conditions including optional ones"
Hides opt-out options in complex menus
Combines mandatory and optional terms
Unclear consequences of choices
Pre-checked Opt-Ins | Rating: 2/5
Defaulting to opted-in states for optional features or communications. (See the sketch after this example for the honest alternative.)
Example: "Air Canada forces its customers to receive promotional emails"
No way to opt out of marketing emails
Required for purchase completion
Bundled consent
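To make the contrast concrete, here is a minimal TypeScript sketch of the ethical alternative, with invented names rather than any real checkout: optional consent starts off, is stored separately from mandatory terms, and never gates the purchase.

```typescript
// Minimal sketch (illustrative names): marketing consent defaults to OFF and
// only changes through an explicit user action.

interface ConsentState {
  marketingEmails: boolean; // starts false: no pre-checked opt-in
  requiredTerms: boolean;   // kept separate: never bundled with optional consent
}

function initialConsent(): ConsentState {
  return { marketingEmails: false, requiredTerms: false };
}

// Called only from the checkbox's change handler, never on page load.
function setMarketingConsent(state: ConsentState, userChecked: boolean): ConsentState {
  return { ...state, marketingEmails: userChecked };
}

// Completing a purchase depends only on the mandatory terms,
// not on marketing consent.
function canCompletePurchase(state: ConsentState): boolean {
  return state.requiredTerms;
}
```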
New Patterns Identified for 2024
Emotional Exploitation | Rating: 5/5
Using psychological triggers and emotional manipulation to drive user behaviour.
Example: "DoorDash now warns you that your food might get cold if you don't tip"
Uses guilt to manipulate behaviour
Creates artificial anxiety
Exploits immediate emotional responses
False Urgency 2.0 | Rating: 4/5
Creating artificial time pressure and scarcity to force quick decisions.
Example: "Select Blinds fined $10M for fake slashed prices, perpetual discounts and fake countdown timers"
Fake countdown timers
False scarcity messaging
Manipulated pricing displays
Subscription Trap | Rating: 5/5
Creating complex subscription models with hidden penalties and unclear terms.
Example: "Miku Inc. tricks users into a monthly subscription"
$400 baby monitor locked previously free features behind monthly subscription
Basic functionality, like a push notification when the baby wakes, now costs $10/month
Users who bought hardware outright suddenly required to pay subscription for core features
No warning of feature removal during hardware purchase
Essential safety features moved behind paywall after purchase
Data Privacy Deception | Rating: 5/5
Misleading users about how their data is collected, used, or shared.
Example: "Google to pay $93m in settlement over deceptive location tracking"
Continued tracking after opt-out
Misleading privacy controls
Hidden data collection
Interface Interference | Rating: 4/5
Manipulating user interface elements to force specific behaviours.
Example: "Kayak automatically re-enables push notification settings after user disables it"
Settings reset without consent
Persistent notification prompts
Override user preferences
Hidden Mandatory Account Creation | Rating: 4/5
Forcing users to create accounts for basic functionality that shouldn't require one.
Example Title: "HP printers should have EPEAT ecolabels revoked, trade group demands"
Required account creation to use basic scanner functionality
Forces users to provide personal information for offline features
Makes basic hardware features contingent on online accounts
Cancellation Complexity | Rating: 5/5
Making service cancellation unnecessarily complex and time-consuming.
Example: "Go on a date with a reporter to cancel your Wall Street Journal subscription"
Required phone calls for cancellation
Multiple confirmation steps
Hidden cancellation process
Price Comparison Manipulation | Rating: 4/5
Manipulating how prices and plans are displayed to influence purchasing decisions and obscure true costs.
Example: "Asana uses per-seat pricing dark pattern to obscure total costs"
Pricing not clearly displayed upfront
Forces administrators to make individual seat decisions
Hides total cost until after significant user investment
Manipulates comparisons between pricing tiers
Makes it difficult to estimate actual organizational costs
What to do instead
People value trust, honesty and transparency more than ever. They will pay more and become more loyal to a brand they feel understands them, and isn't out to trick them.
With that in mind, here are more ethical strategies you can implement instead.
Clear and Transparent Information
Provide users with clear and transparent information about the consequences of their actions. Make sure they understand what will happen if they choose a certain option.
Progressive Disclosure
Gradually reveal information or features as the user interacts with the interface. This prevents overwhelming users with too many choices at once.
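As a rough illustration, a TypeScript/DOM sketch of progressive disclosure might look like this; the element IDs are hypothetical, and the point is simply that advanced options stay hidden until the user asks for them.

```typescript
// Illustrative sketch: the interface starts with the simple view and reveals
// advanced options only when the user explicitly requests them.

function setupProgressiveDisclosure(): void {
  const advanced = document.getElementById("advanced-options"); // hypothetical ID
  const toggle = document.getElementById("show-advanced");      // hypothetical ID
  if (!advanced || !toggle) return;

  advanced.hidden = true; // start simple: no wall of choices

  toggle.addEventListener("click", () => {
    advanced.hidden = !advanced.hidden;
    toggle.textContent = advanced.hidden
      ? "Show advanced options"
      : "Hide advanced options";
  });
}
```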
Provide Meaningful Options
Offer users choices that are genuinely useful and relevant to them. Avoid presenting options that only benefit the business.
Personalisation
Tailor the user experience to individual preferences and behaviour. This can make users feel more in control and reduce reactance.
Empowerment Through Customisation
Allow users to customise their experience. This gives them a sense of ownership and control over the interface.
Encourage Informed Decisions
Provide educational content or tooltips that help users understand the implications of their choices.
User-Centred Defaults
Set default options that align with what most users would likely choose if they understood the implications. Allow users to change these defaults easily.
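A minimal sketch of user-centred defaults, assuming an invented notification-settings shape: defaults favour the user, and explicit overrides always win and are never silently reset.

```typescript
// Illustrative settings shape; the names are assumptions, not a real API.

interface NotificationSettings {
  orderUpdates: boolean;  // genuinely useful to the user: on by default
  marketingPush: boolean; // mainly benefits the business: off by default
  weeklyDigest: boolean;
}

const USER_CENTRED_DEFAULTS: NotificationSettings = {
  orderUpdates: true,
  marketingPush: false,
  weeklyDigest: false,
};

// User overrides always take precedence, in contrast to the Interface
// Interference pattern above, where saved choices get silently reset.
function effectiveSettings(overrides: Partial<NotificationSettings>): NotificationSettings {
  return { ...USER_CENTRED_DEFAULTS, ...overrides };
}
```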
Use Social Proof Responsibly
While social proof (e.g., showing how many people have taken a certain action) can influence decisions, it should be used ethically and truthfully.
A/B Testing with Ethical Considerations
If you're conducting A/B testing, make sure the variations are ethically designed and don't exploit users' cognitive biases.
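One way to operationalise this is to screen every variant against a checklist of the dark patterns described above before it can be served. The sketch below is one assumption about how such a guardrail might look, not an established framework.

```typescript
// Hypothetical variant metadata: flags describe which manipulative tactics
// a variation relies on.

interface Variant {
  id: string;
  usesFakeCountdown: boolean;
  preChecksOptIns: boolean;
  hidesDeclineOption: boolean;
}

// A variant is only eligible if it uses none of the flagged tactics.
function isEthicalVariant(v: Variant): boolean {
  return !v.usesFakeCountdown && !v.preChecksOptIns && !v.hidesDeclineOption;
}

function chooseVariant(variants: Variant[], userId: string): Variant | undefined {
  const allowed = variants.filter(isEthicalVariant);
  if (allowed.length === 0) return undefined; // nothing ethical to test

  // Simple deterministic bucketing so a given user always sees the same variant.
  const bucket = [...userId].reduce((h, c) => (h * 31 + c.charCodeAt(0)) >>> 0, 0);
  return allowed[bucket % allowed.length];
}
```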
Feedback and Confirmation
Always provide clear feedback after a user takes an action, and give them the opportunity to confirm their choices before finalising them.
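As a final sketch, assuming hypothetical confirm and notify helpers standing in for whatever dialog and toast components you use, a cancellation flow with confirmation and clear feedback might look like this:

```typescript
// Hypothetical helpers: confirm() shows a dialog, notify() shows a toast.

async function cancelSubscription(
  confirm: (message: string) => Promise<boolean>,
  notify: (message: string) => void
): Promise<void> {
  // Neutral wording: no confirm-shaming like "No thanks, I love wasting money".
  const confirmed = await confirm(
    "Cancel your subscription? You will keep access until the end of the current billing period."
  );
  if (!confirmed) return; // declining is a first-class, one-click outcome

  // ... call the billing API to perform the cancellation here ...

  // Clear feedback: the user knows the action actually happened.
  notify("Your subscription has been cancelled. A confirmation email is on its way.");
}
```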