The U.S. Federal Trade Commission (FTC) recently filed a complaint stating that, for years, Amazon has “knowingly duped” consumers into signing up for Prime subscriptions and then complicated their attempts to cancel.
The FTC alleges that Amazon “tricked and trapped people into recurring subscriptions without their consent” through manipulative, coercive or deceptive interface design tactics, known as “dark patterns.”
Internal documents reveal that Amazon code-named the drawn-out process of cancelling Prime as “Iliad.” This, as the complaint points out, alludes to the ancient Greek epic about the long and arduous Trojan War.
Our work reveals how dark patterns play a key role in keeping users active on social media — despite their intentions and efforts to leave.
Anyone who uses the internet has almost certainly encountered dark patterns. The term was coined by Harry Brignull, a user-experience consultant in the United Kingdom, who began compiling examples of problematic design practices in 2010. Brignull’s “Roach Motel” dark pattern specifically describes cases where online service providers make it easy to get into a situation but hard to leave.
Difficult-to-cancel subscriptions have drawn heightened attention from regulators, but they hardly represent the only situation where online service providers deliberately deter users from cancelling or leaving their services.
In our research in user-experience design, we found that social media sites also routinely make it difficult — or even impossible — for users to disable their accounts.
Uncovering common dark patterns
The Language and Information Technology Research Lab (LiT.RL) at the University of Western Ontario studies deceptive, inaccurate and misleading information practices. We collected data from 25 social media sites, drawn from a list of the 50 most popular ones in May 2020.
We then used content analysis to review the account-disabling process for each site screen-by-screen, including the options given (or hidden) from users, and the exact wording and visuals shown. We wanted to establish which strategies were used to deter users from leaving these sites and how prevalent they were. Our research is currently undergoing peer review in a journal dedicated to social media and societal issues.
In total, our study uncovered five major types of dark patterns — Complete Obstruction, Temporary Obstruction, Obfuscation, Inducements to Reconsider and Consequences — and 13 subtypes, specifically associated with disabling social media accounts.
Like the Amazon Prime cancellation process described in the FTC’s complaint, these strategies were rarely deployed in isolation: the sites in our sample used 2.4 dark patterns on average, and five sites contained five or more dark patterns to deter account disabling.
One site simply provided no option in the interface for the user to disable their account, and warned that requests for account disabling would not be considered by the site administrators (Complete Obstruction).
Nine sites obstructed the path to account disabling by burdening the user with unnecessary work, such as chatting to a company representative in real time or responding to an email to confirm their decision to leave (Temporary Obstruction).
Seven sites confused or misled the user by, for instance, hiding the button that initiates the account-disabling process in an unusual location or making the button itself small and faint (Obfuscation).
Fifteen sites relied on more transparent efforts to convince the user to reconsider, often by employing language and visuals that induced fear, guilt or doubt, such as sad faces, large red "warning!" labels, and proclamations that "it would be a shame to see you go!" (Inducements to Reconsider).
Even if the user was able to successfully disable their account, they were frequently confronted with opportunities or pressure to return (Consequences). Twelve sites continued to communicate with the user via email or offered account reactivation for a fixed period; one site made reactivation possible for the exorbitantly long period of a year.
Even worse, four sites offered account reactivation indefinitely, meaning that the account and its associated data could never be permanently deleted.
While people may wish to cancel an Amazon Prime subscription to avoid unwanted fees, motivations to quit social media are more complex. Research shows that users report a range of reasons to leave social media, including concerns over privacy, addiction and diminished well-being. Another study found that “at least 35.5% of [social media] account deletion attempts did not end in a deleted account.”
Our research can help people resist the dark patterns thwarting their attempts to quit social media in several ways.
First, drawing attention to these practices can inform users of common strategies and recommend helpful resources. The website Just Delete Me, for instance, collects direct links to account-disabling pages for numerous online services.
Second, following the increased scrutiny that unwanted subscriptions have faced from the FTC as well as the European Commission, regulators should pay greater attention to dark patterns in the context of account disabling.
Based on our research, we suggest that sites should adopt a simple two-step account disabling process where users click an easy-to-locate button and finalize their choice through a neutral confirmation screen or by entering their password.
These recommendations are in line with the "click to cancel" rule proposed by the FTC in March 2023. The proposed rule envisions a simple cancellation mechanism for subscriptions and would bar sellers from pitching offers or modifications during the cancellation process unless the consumer explicitly agrees to hear them.
Increasingly, regulators are recognizing that websites deploy subtle design tactics to keep users trapped in unwanted services. While fighting to eliminate the dark patterns that abound online is a complicated task, the FTC’s complaint against Amazon represents a clear step in the right direction.
Dominique Kelly, Doctoral Candidate, Information and Media Studies, Western University and Victoria L. Rubin, Associate Professor & Director of the Language & Information Technology Research Lab (LiT.RL), Western University
*** Expert Insight reflects the perspective and scholarly interest of Western faculty members and is not an articulation of official university policy on issues being addressed.