Dark patterns are defined by the European Data Protection Board (EDPB) as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”. These dark patterns seek to influence user behaviour on those platforms, hindering users’ ability to make conscious choices that effectively protect their personal data. Data protection authorities are responsible for sanctioning any use of these dark patterns that breaches GDPR requirements. The categories of dark patterns include overloading, skipping, stirring, hindering, fickle designs and leaving users in the dark.
Overloading
Overloading refers to situations in which users are confronted with an unreasonable number of requests, or with too much information or too many options or possibilities, encouraging them to share more data than necessary or to unintentionally allow the processing of their personal data contrary to their expectations as data subjects. Overloading techniques include continuous prompting, privacy mazes and providing too many options.
Skipping
Skipping means designing the interface or user experience in such a way that users forget, or do not think about, all or some of the data protection aspects. Examples of dark patterns that result in skipping include deceptive snugness and “look over there”.
Stirring
Stirring is a dark pattern that affects the choices users make by appealing to their emotions or using visual nudges. It includes emotional steering and pertinent information being “hidden in plain sight”.
Hindering
Hindering refers to obstructing or blocking users from becoming informed or from managing their data by making the process extremely hard or impossible to achieve. Dark patterns considered to be hindering include dead-end designs, longer-than-necessary processes and misleading information.
Fickle Interfaces
Fickle interfaces are designed in an inconsistent and unclear manner, making it hard for users to navigate the various data protection control tools and to understand the purpose of the data processing. These interfaces include those lacking hierarchy as well as those which make use of decontextualising in their design.
Interfaces that leave users in the dark
An interface is considered to leave users in the dark if it is designed in a way that hides information or data protection control tools, leaving users unsure of how their data is processed and what control they have over it with regard to the exercise of their rights. Examples of this include language discontinuity, conflicting information and ambiguous wording or information.
We recently published a short vlog on YouTube outlining the types of dark patterns, explaining how adherence to the GDPR principles can prevent your user interface design from falling into these dark patterns, and highlighting which measures deserve special attention to avoid being sanctioned.
Subscribe to Aphaia’s YouTube channel for more information on AI ethics and data protection.