By Shaun Lichti
August 26, 2022
Coined in 2010 by UX designer Harry Brignull, dark patterns are user interface designs that attempt to mislead, coerce, or pressure users into taking certain actions, such as making a purchase or providing consent. According to one study, these design tactics can include “unequal burdens on the choices available,” false choices, and deceptive or hidden information.
One of the classic examples of a dark pattern is the consent pop-up with a prominent ‘Accept all’ cookies button but no clear way to reject that same tracking. Another is judgmental or negative messaging on reject buttons, such as “No, I like paying more for less.”
Dark patterns are also highly prevalent. According to a study on uninformed consent in Europe, over 50% of the cookie banners from 1,000 randomly sampled popular EU websites contained dark patterns. The patterns identified included “privacy-unfriendly defaults, hidden opt-out choices, and preselected checkboxes that enable data collection.”
Another study found that over 95% of the 200 most popular Android apps displayed one or more dark patterns.
Academic exploration aside, dark patterns have gained increasing significance in the world of privacy law and regulatory enforcement.
In January 2022, Facebook and Google received hefty fines in the EU for using dark patterns to create confusing and difficult cookie consent processes. And, as the latest round of rulemaking by the California Privacy Protection Agency (CPPA) indicates, California regulators are following suit.
Dark patterns have been discussed directly and indirectly in three recent privacy laws: the California Consumer Privacy Act (CCPA), the California Privacy Rights Act (CPRA), and the Colorado Privacy Act (CPA).
Though the CCPA doesn’t explicitly mention dark patterns, the concept did come up during the rulemaking process. In the Final Statement of Reasons (a document produced to accompany draft regulations), the California Attorney General stated that:
“it would run counter to the intent of CCPA if websites introduced choices that were unclear or, worse, employed deceptive dark patterns to undermine a consumer’s intended direction.”
Though this reference is far from an actionable requirement, it was a clear nod to CA regulators’ stance on dark patterns. Aside from this somewhat oblique statement, the concept wasn’t officially defined or explored further until the CPRA.
Amending the CCPA, the CPRA references dark patterns explicitly, stating that:
“...agreement obtained through use of dark patterns does not constitute consent.”
It goes on to define dark patterns as:
“a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice, as further defined by regulation.”
A significant expansion from the CCPA, this new language reflects a willingness to address design concepts that, though sometimes hard to define, have a clear effect on consumer choice.
More importantly, it opened the door for California regulators to explore and legally define the concept of dark patterns in detail—setting up an environment in which enforcement is a viable, even likely outcome. We’ll cover this new language in more depth below.
Alongside the CPRA, the Colorado Privacy Act also makes explicit mention of dark patterns, prohibiting their use when obtaining consent. Under the CPA, dark patterns are defined as:
“a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice.”
With the CPA set to go into effect on July 1, 2023, companies should begin reviewing their current interfaces and ensuring any new designs are dark-pattern free.
The proposed CPRA regulations and Initial Statement of Reasons (ISOR) offer the most in-depth look to date at how regulators are approaching dark patterns, identifying a number of principles interfaces must follow when used for CCPA consumer request submissions and obtaining consumer consent.
Failure to adhere to these principles would be considered use of dark patterns, invalidating any consent obtained and putting the company in violation of the CPRA.
The first of these principles, ease of understanding, is fairly straightforward. But it’s important to note that it applies to both the language used and the design itself. Across all regulations, the guidance here is clear: use plain language that tells the user exactly what their choices are and how to act on them. That means no legalese, double negatives, or any other form of semantic sleight-of-hand.
The same goes for design. An interface should be designed so that the choices and their outcomes are abundantly clear.
Symmetry in choice means that exercising a ‘more privacy’ option shouldn’t be longer or more difficult than a ‘less privacy’ option.
One of the most common examples of asymmetry in choice is the website cookie banner that offers “Accept all” and “More information” as the only two choices. Banners designed in this manner are not compliant, as the privacy-protective choice requires additional steps and clicks.
The regulation also gives “Yes” and “Ask me later” as an example of a prohibited asymmetrical choice because there is no option to decline the opt-in.
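As an illustrative sketch (not drawn from the regulation's text), the asymmetry principle described above can be modeled as a simple check: a decline path must exist, and it must not take more steps than the easiest accept path. All names here are hypothetical.

```typescript
// Illustrative sketch only: a minimal model of consent-banner choices,
// used to flag the asymmetric patterns described above.
interface ConsentChoice {
  label: string;
  grantsConsent: boolean;  // true for "Accept all"-style options
  stepsToComplete: number; // clicks/screens needed to finish this path
}

function isSymmetric(choices: ConsentChoice[]): boolean {
  const accept = choices.filter(c => c.grantsConsent);
  const decline = choices.filter(c => !c.grantsConsent);
  // A decline path must exist...
  if (decline.length === 0) return false;
  // ...and must not require more steps than the easiest accept path.
  const minAccept = Math.min(...accept.map(c => c.stepsToComplete));
  const minDecline = Math.min(...decline.map(c => c.stepsToComplete));
  return minDecline <= minAccept;
}

// "Accept all" + "More information" fails: declining takes extra steps.
const asymmetric = isSymmetric([
  { label: "Accept all", grantsConsent: true, stepsToComplete: 1 },
  { label: "More information", grantsConsent: false, stepsToComplete: 3 },
]);

// "Accept all" + "Reject all" at the same level passes.
const symmetric = isSymmetric([
  { label: "Accept all", grantsConsent: true, stepsToComplete: 1 },
  { label: "Reject all", grantsConsent: false, stepsToComplete: 1 },
]);
```

In practice the "steps" count would come from auditing the actual click path, but the rule of thumb is the same: the privacy-protective option should never be the longer road.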
Another common dark pattern (sometimes called a “roach motel”) is an interface design that makes it easy to get in but hard to get out.
Amazon Prime came under scrutiny for this particular dark pattern in the EU. A report from Norway’s Consumer Council, entitled ‘You Can Log Out, But You Can Never Leave,’ stated that:
“Consumers who want to leave the service are faced with a large number of hurdles, including complicated navigation menus, skewed wording, confusing choices, and repeated nudging.”
Following this report, a series of coordinated complaints were filed and, in July 2022, Amazon agreed to change the Prime cancellation interface for EU member states—streamlining the process down to two clicks.
But it’s not just the language or clicks that matter. The regulations also note that the ‘less privacy’ option cannot be presented more prominently, reflecting the fact that regulators are considering not just the length of a path, but how options are presented. For example, the ‘Yes’ button cannot be bright blue while the ‘No’ button is a muted gray.
Double negatives are one clear example of confusing language, e.g. placing “Yes” or “No” next to the statement “Do Not Sell or Share My Personal Information.” In that context, it’s not immediately clear that by saying “No” someone would actually be agreeing to having their personal data shared or sold.
One way coercive interactive elements show up in the wild is buttons that change based on your toggle choices. For example, if you have toggled everything to accept all tracking cookies, you’re shown a button that says “Confirm my choices.” But if you un-select certain cookies, the button changes to say “Allow all,” and the “Confirm my choices” button is now below the fold, meaning you have to scroll down to see it again.
These interactive elements are clearly manipulative: it would be easy for a user to miss the button switch and, having wanted to opt out of certain tracking, accidentally opt into everything by clicking the newly appeared “Allow all” button.
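A hedged sketch of the alternative: the confirmation control's label stays constant no matter how the toggles are set, so the user always saves exactly what they selected. The function and type names here are hypothetical, for illustration only.

```typescript
// Illustrative sketch: the save button for a preference panel should not
// change meaning based on toggle state. Names are hypothetical.
type CookiePrefs = Record<string, boolean>; // category -> allowed?

// Dark-pattern version (what the article describes): the label flips to
// "Allow all" the moment the user deselects anything.
function darkPatternLabel(prefs: CookiePrefs): string {
  const allAccepted = Object.values(prefs).every(Boolean);
  return allAccepted ? "Confirm my choices" : "Allow all";
}

// Compliant version: one stable label; the user's toggles are saved as-is.
function compliantLabel(_prefs: CookiePrefs): string {
  return "Confirm my choices";
}

// A user who has opted out of analytics and advertising:
const prefs: CookiePrefs = { analytics: false, advertising: false, functional: true };
```

The design choice is simple: the primary action reflects the user's current selections, never a more permissive alternative swapped in behind their back.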
CPRA regulations explicitly prohibit the use of guilting or shaming language. For example, when offering a discount or other financial incentive in exchange for consent, it’s considered manipulative and shaming to offer as options “Yes” or “No, I like paying full price.”
Bundling reasonably expected purposes and additional purposes into the same consent interface is also prohibited. For example, if you offer a location-based app, you can’t bundle use of your core service with consent to the sale of the consumer’s geolocation data.
Don’t include additional manual steps as part of the consent or opt-out flow. Ensure you have a seamless, fully functional consent manager in place that can facilitate opt-out flows without adding friction.
For example, if a consumer is looking to opt out of the sale of their personal data, don’t make them search or scroll through the text of a privacy policy or webpage after clicking the “Do Not Sell or Share My Personal Information” link.
Circular or broken links, nonfunctional or unmonitored email addresses, and other broken experiences are also not allowed. Again, if you’ve previously relied on manual and/or piecemeal solutions, this may be a good time to consider putting a seamless, easy-to-use solution in place.
While dark patterns were once more of an academic concept, many companies took a “wait and see” approach, in many cases continuing to employ dark patterns throughout their purchasing and consent interfaces. The draft CPRA regulations released by the CPPA make it clear this is no longer an acceptable status quo.
With clear guidelines for compliance, a new regulatory body in California, and, come January 1, 2023, no more buffer period (the CPRA rescinded the automatic 30-day cure period), companies need to take steps soon to ensure their interfaces are free of dark patterns.
The tips above are a great starting place, but implementing a functional consent manager that facilitates opt-outs for users while supporting adherence to signals like Global Privacy Control is the best way to ensure compliance now and in the future.
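For illustration, honoring the Global Privacy Control signal mentioned above can start with checking the `Sec-GPC` request header that participating browsers send (under the GPC proposal, a value of `1` expresses an opt-out preference). The helper below is a hypothetical sketch of that check, not a complete opt-out implementation.

```typescript
// Illustrative sketch: treat a "Sec-GPC: 1" request header as a valid
// opt-out-of-sale signal, per the Global Privacy Control proposal.
// The handler shape around this check is hypothetical.
function gpcOptOutRequested(headers: Record<string, string>): boolean {
  // Header names are case-insensitive; normalize before comparing.
  const value = Object.entries(headers).find(
    ([name]) => name.toLowerCase() === "sec-gpc"
  )?.[1];
  return value?.trim() === "1";
}

gpcOptOutRequested({ "Sec-GPC": "1" });      // signal present: suppress sale/share
gpcOptOutRequested({ "User-Agent": "..." }); // no signal: fall back to stated prefs
```

Browsers that support GPC also expose the preference client-side as `navigator.globalPrivacyControl`, so the same logic can run in a consent manager's front-end code.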
Transcend is the platform that lets companies put privacy on autopilot, making it easy to encode privacy across an entire tech stack.
Transcend Consent ensures nothing is tracked without user consent, with out-of-the-box compliance with popular Do Not Sell signals like Facebook's LDU, seamless compatibility with third-party widgets, and unlimited UI customization. All while keeping your site blazing fast.
Our technology helps companies achieve scalable privacy compliance and build strong relationships with their customers through comprehensive data transparency, consent, and control.