John MacKenzie
16/12/2024
Reading time: four minutes
‘Dark patterns’ are interface designs intended to obfuscate certain processes and to trick or coerce users into actions or omissions against their best interests, usually to the benefit of the pattern’s creator. Examples include buttons that trick users into unintended choices, excessively lengthy subscription cancellation processes, deliberately long-winded privacy policies, and default options tailored to harvest data from unsuspecting customers, such as through cookies.
The term was coined only in 2010, by Harry Brignull, and dark patterns remain an emerging issue in regulation. Regulating them is inherently complex, as they have implications for consumers, advertising, data, privacy and platforms.
While not all manipulative designs are dark patterns, dark patterns employ malicious choice architecture to steer consumers into additional purchases, longer time on site, or data harvesting. They create financial and personal risks for consumers, potentially causing unnecessary costs through additional payments or subscriptions, misuse of their data, and limits on their autonomy.
While the California Consumer Privacy Act (CCPA) does not directly address dark patterns, the California Privacy Rights Act, which amended the CCPA, does specifically define them: “A user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making or choice.”
Section 5(a) of the Federal Trade Commission Act prohibits unfair or deceptive practices, which can encompass dark patterns. US regulators are increasingly focusing on dark patterns in consumer protection regulation, with specific proposals to combat dark patterns in subscription cancellations.
The Digital Services Act (DSA) imposes a novel prohibition on dark patterns, regulating the manipulation or “nudging” of users in order to protect freedom of choice. While the Digital Markets Act doesn’t explicitly mention dark patterns, it does impose obligations on gatekeepers in relation to them. Further, the Unfair Commercial Practices Directive (UCPD) and the Consumer Rights Directive both have the potential to protect consumers’ interests when concluding contracts, including through transparency obligations in commerce. Article 4(11) of the General Data Protection Regulation requires consent (such as for cookies) to be “freely given, specific, informed and unambiguous”.
Designs manipulating consumers into entering contractual relationships may fall under the purview of the Consumer Rights Act 2015. Further, the Competition and Markets Authority, the UK’s competition regulator, has launched investigations into dark patterns, such as a 2022 investigation into ‘urgency’ tactics. Supplementing the UK GDPR, the Data Protection Act (DPA) 2018 could apply to dark patterns that harvest data; the DPA also provides transparency rights for consumers. Unlike the EU’s DSA, the relatively recent Online Safety Act 2023 in the UK does not clearly address dark patterns or online coercion – this should be an area for future review.
Dark patterns are very common in cookie consent. Moving towards a framework of data protection by default could help, as in the proposed EU ePrivacy Regulation. This would stop websites defaulting to non-essential cookies, though it would likely require prescriptive legislation detailing exactly how cookie settings should be implemented on websites and in browsers.
A significant obstacle to countering dark patterns is recognition: internet users are generally poor at identifying dark patterns and the risks they pose. Bongard-Blanchy et al. recommend requiring warnings at the point of data collection, detailing the risks.
A focus on creating ‘bright patterns’ – such as defaulting to the privacy-benefitting option in cookie selection – can also help to turn the tide on dark patterns.
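To make the ‘bright pattern’ idea concrete, here is a minimal sketch of what a privacy-by-default consent model might look like in code. All function and category names are hypothetical illustrations, not drawn from any real consent library: the point is simply that non-essential categories start switched off, and only change on an explicit opt-in.

```javascript
// Hypothetical 'bright pattern' consent state: non-essential cookie
// categories default to OFF, mirroring the GDPR's requirement that
// consent be freely given, specific, informed and unambiguous.
function defaultConsent() {
  return {
    essential: true,    // strictly necessary cookies need no consent
    analytics: false,   // privacy-benefitting default: off
    advertising: false, // off until the user actively opts in
  };
}

// Only an explicit, affirmative opt-in flips a category on;
// the essential category cannot be toggled at all.
function applyUserChoice(consent, category, optedIn) {
  if (category === "essential") return consent;
  return { ...consent, [category]: optedIn === true };
}
```

Under such a model, a website that never shows a banner simply collects no non-essential data, inverting the current default where inaction favours the data collector.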
Existing regulations are not useless in fighting dark patterns, but they need adjustment. For example, the UCPD can address dark patterns, but it requires more consistent application. The DSA has been regarded by some as weak in approach and scope – in particular, its dark patterns prohibition applies only to online platforms, exempting a significant portion of potentially manipulative services. Additionally, many existing regulations may need to be updated to address online-specific issues more effectively.
As touched on above, dark patterns have a complex web of effects – as such, a pluralistic regulatory approach could be beneficial, allowing the strengths of existing mechanisms to compensate for one another’s weaknesses.
As it stands, no one regulation is sufficient alone. Practical issues remain in regulating dark patterns, including the cost and the necessity of enforcement – small blogs or tiny online retailers might not be able to bear the cost of compliance, and arguably lack sufficient influence to warrant regulation. Costs to the regulator might be offset by imposing heavy fines or sanctions on offenders, but frameworks are needed to determine tiers of enforcement and compliance, as with the size-based tiers in the DSA (with the caveat that enforcement should then apply to more than online platforms).