Dark Patterns: Regulating Digital Design

Policy Brief

Executive Summary

How easy it is to order a book on an online shop’s website, how intuitive a maps or navigation service is to use in everyday life, or how laborious it is to set up a customer account for a car-sharing service: these features and ‘user flows’ have become incredibly important to customers. Today, the “user friendliness” of a digital platform or service can have a significant influence on how well a product sells or what market share it gains. As a result, not only operators of large online platforms but also companies in more traditional sectors of the economy are increasing their investments in designing websites, apps or software so that they can be used easily, intuitively and in a time-saving manner.

This approach to product design is called user-centered design (UX design) and is based on observing how people interact with digital products, developing prototypes and testing them in experiments. These methods are used not only to improve the user-friendliness of digital interfaces but also to improve performance indicators relevant to the business – whether that means raising the number of users who register as new customers, increasing the sales volume per user or encouraging as many users as possible to share personal data.

UX design, along with intensive testing and optimization of user interfaces, has become standard in today's digital product development and an important growth driver for many companies. However, this development also has a side effect: since companies and users can have conflicting interests and needs with regard to the design of digital products or services, design practices that cause problems or even harm for users are spreading.

Examples of problematic design choices include warnings and countdowns that create time pressure in online shops, settings windows designed in a way that makes it difficult for users to activate data protection settings, or website architectures that make it extremely time-consuming to delete an account. Such practices are called "dark patterns", "deceptive design" or "unethical design" and are defined as design practices which, intentionally or unintentionally, influence people to their disadvantage and potentially manipulate their behaviour or decisions.

Most dark patterns are soft forms of influence which rarely lead to significant disadvantages or damages for consumers and in many cases represent acceptable sales practices. However, there are also dark patterns that can lead to unintended costs, cause users to unknowingly deactivate privacy protections, or curb consumer rights. Because dark patterns can be highly effective in influencing user behavior and are widespread, they have become not only a problem for consumers in everyday digital life but also a challenge for policy makers:

  • Dark patterns are amplifying the erosion of privacy, as digital interfaces are often designed to encourage users to share as much personal data as possible. At the same time, specific design choices make it more difficult to protect personal data. In addition, especially in Europe, dark patterns systematically weaken the European Union’s privacy regulations because they undermine the principle of individual consent.
     
  • Dark patterns have become a challenge for consumer protection, as certain design practices mislead users or manipulate them into making certain purchasing decisions. Dark patterns have so far been studied and recognized as a widespread problem mainly in the e-commerce sector, but they can potentially occur wherever something is sold or contracts are concluded: in computer games with payment elements, on travel portals, in the customer portals of telecommunications providers or on the booking websites of airlines.
     
  • Problematic design practices also represent a challenge for social-media and platform regulation. One example is the “Netzwerkdurchsetzungsgesetz” (NetzDG) – a German law which came into force in 2017 and required large social networks, among other things, to offer a form for flagging and reporting unlawful content. In some cases, however, providers with millions of users made design decisions that made it hard for users to find and report offensive content, considerably reducing the regulatory effect of the NetzDG. Similar problems could occur in the future if governments seek to compel online operators to make algorithms more transparent to users, label online advertisements or include data portability options.
     
  • Whether dominant market players can gain a relevant advantage over competitors by means of manipulative or misleading design tactics is an open question. However, it is becoming increasingly clear that product and interface design is a relevant issue in digital market competition that requires more attention from researchers and regulators. In some cases, competition authorities have already started to react and have launched investigations into design practices deployed by large online operators.
     
  • Dark patterns also pose a challenge for the protection of minors, because children and adolescents are particularly vulnerable but still use the same user interfaces as adults.

In order to reduce the use and proliferation of problematic design practices, companies need to take greater account of consumer interests and rights when designing digital services and platforms. This will not happen by itself. Policy makers and governments need to respond, first and foremost, by making sure that severe forms of dark patterns are sanctioned. Currently, this is barely happening. Regulators and other relevant organisations should begin to launch investigations, start proceedings or initiate legal action against operators who use design practices that have a significant negative impact on a large number of people or that considerably undermine existing laws or regulations.

To start taking action against severe forms of dark patterns, no new legislation or amendments to existing laws are needed initially. Instead, data protection authorities, youth protection authorities, consumer protection agencies and even competition authorities should begin to test which existing fields of law can be applied to deceptive or manipulative design practices and used as a basis for complaints or litigation. Ideally, design practices and their negative impact on users should be put at the centre of proceedings or court cases in order to set precedents and raise awareness of harmful design techniques. Sanctioning severe uses of dark patterns would also require providing financial resources to regulatory bodies and consumer protection organizations.

Besides starting to apply and enforce existing laws, policy makers should recognize that digital product design is a new and increasingly important regulatory field in which governments should build up expertise. This is a crucial step, since other legislative processes and regulatory projects are due in the coming years in which the design of digital platforms and services should be taken into account. Expertise is also needed to enable regulators to discuss solutions with companies on an equal footing and to develop effective, practical and creative countermeasures themselves.

Published by: 
Stiftung Neue Verantwortung
May 13, 2020
Authors: 

Sebastian Rieger (srieger@stiftung-nv.de)
Caroline Sinders (csinders@gmail.com)