
New regulations under the California Consumer Privacy Act (CCPA) took effect in March, prohibiting businesses from using “dark patterns” on their websites that make it difficult for California consumers to opt out of the sale of their personal information.
A dark pattern is a potentially manipulative user interface design that can have the effect, intentionally or unintentionally, of altering decision making, such as decisions about making purchases, canceling subscriptions, registering consent choices, or sharing data.
Although dark patterns have existed for many years, consumers have largely shrugged them off, even when noticed, given the general take-it-or-leave-it nature of website and app usage.
Recently, however, the practice has come under increasing legal and regulatory scrutiny, and online operators and designers should take note that, more than ever, subtle manipulation can result in enforcement actions and reputational harm. New privacy laws and proposals are dictating that digital form must follow function.
Flavors of Dark Patterns
Dark patterns come in many varieties. In a recent award-winning research paper, Arunesh Mathur and his co-authors analyzed the presence of dark patterns across 11,000 shopping websites.
They found five primary types of dark patterns: (1) asymmetric, where a user interface (UI) emphasizes particular choices more than others (such as with color or font size); (2) covert, where a UI steers users to make certain purchases or choices without their active knowledge; (3) deceptive, where the UI employs misleading statements or omissions to induce false beliefs; (4) hides information, where the UI obscures information or delays its presentation; and (5) restrictive, where the UI restricts the number of choices available to the user.
There are other classifications of dark patterns as well. The site darkpatterns.org catalogs and provides examples of patterns such as “trick questions,” which fool users into providing an answer they did not intend, along with:
- “sneak into basket,” whereby items are added to a shopping cart through the use of an opt-out button or checkbox on a prior page;
- “misdirection,” which purposefully focuses attention on one thing to distract attention from another;
- “disguised ads,” which sees advertisements disguised as other kinds of content or navigation in order to get users to click on them; and
- “friend spam,” when a product asks for a consumer’s email address or social media permissions under the guise of a desired result (e.g., finding friends), but then spams their contacts in a message that claims to be from the user.
State Privacy Laws Are Starting to Step in
Consumer groups, policymakers, and regulators have recently increased their focus on dark patterns. California’s new rules prohibit the use of double negatives when presenting consumers with a choice and bar businesses from requiring consumers to provide personal information that is not necessary to implement their opt-out request.
In a press release, the California attorney general clarified that the new ban also “prohibits companies from burdening consumers with confusing language or unnecessary steps such as forcing them to click through multiple screens or listen to reasons why they shouldn’t opt-out.”
The successor to the CCPA, the California Privacy Rights Act, which takes full effect on Jan. 1, 2023, goes even further, directly defining “dark pattern” and clarifying that any “agreement obtained through the use of dark patterns does not constitute consent.” The much-watched Washington Privacy Act would do precisely the same for Washington state, if passed.
The Federal Privacy Top Cop Is Watching, Too
The Federal Trade Commission has also signaled concern with the increased use of dark patterns and their impact on consumers, and will be hosting a virtual workshop on April 29 dedicated to examining digital dark patterns.
The session will evaluate how dark patterns differ from sales tactics employed by brick-and-mortar stores; their potential to harm consumers; whether some groups are unfairly targeted or especially vulnerable; the laws, rules, and norms that govern the use of dark patterns; and, notably, whether additional rules, standards, or enforcement efforts are needed to protect consumers.
The FTC is no stranger to dark patterns, but its recent attention to this issue may signal more dark pattern enforcement actions in the near future. In a September 2020 statement, then-FTC Commissioner Rohit Chopra called out dark patterns as “online sleight of hand using visual misdirection, confusing language, hidden alternatives, or fake urgency to steer people toward or away from certain choices.”
He further contended that the agency needs “to methodically use all of our tools to shine a light on unlawful digital dark patterns, and we need to contain the spread of this popular, profitable, and problematic business practice.”
Although it remains to be seen exactly what types of enforcement actions, bans, or lawsuits will play out in the months and years ahead, it seems clear that businesses should be aware that consumers, regulators, and lawmakers are more focused than ever on website design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service.
This column does not necessarily reflect the opinion of The Bureau of National Affairs, Inc. or its owners.
Author Information
Gretchen A. Ramos is global co-chair of Greenberg Traurig LLP’s Data, Privacy & Cybersecurity Practice and is based in the San Francisco office.
Darren J. Abernethy, of counsel at Greenberg Traurig LLP, is a member of the firm’s Data, Privacy & Cybersecurity Practice and is based in the San Francisco office.