Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites

Dark patterns are user interface design choices that benefit an online service by coercing, steering, or deceiving users into making unintended and potentially harmful decisions. We present automated techniques that enable experts to identify dark patterns on a large set of websites. Using these techniques, we study shopping websites, which often use these dark patterns to influence users into making more purchases or disclosing more information than they would otherwise. Analyzing ∼53K product pages from ∼11K shopping websites, we discover 1,841 dark pattern instances, together representing 15 types and 7 categories. We examine the underlying influence of these dark patterns, documenting their potential harm on user decision-making. We also examine these dark patterns for deceptive practices, and find 183 websites that engage in such practices. Finally, we uncover 22 third-party entities that offer dark patterns as a turnkey solution. Based on our findings, we make recommendations for stakeholders including researchers and regulators to study, mitigate, and minimize the use of these patterns.

Dark patterns [31,47] are user interface design choices that benefit an online service by coercing, steering, or deceiving users into making decisions that, if fully informed and capable of selecting alternatives, they might not make. Such interface design is an increasingly common occurrence on digital platforms including social media [45], shopping websites [31], mobile apps [5,30], and video games [83]. At best, dark patterns annoy and frustrate users. At worst, dark patterns can mislead and deceive users, e.g., by causing financial loss [1,2], tricking users into giving up vast amounts of personal data [45], or inducing compulsive and addictive behavior in adults [71] and children [20].

While prior work [30,31,37,47] has provided a starting point for describing the types of dark patterns, there is no large-scale evidence documenting the prevalence of dark patterns, or a systematic and descriptive investigation of how the various types of dark patterns harm users. If we are to develop countermeasures against dark patterns, we first need to examine where, how often, and the technical means by which dark patterns appear, and second, we need to be able to compare and contrast how various dark patterns influence user decision-making. By doing so, we can both inform users about and protect them from such patterns, and, given that many of these patterns are unlawful, aid regulatory agencies in addressing and mitigating their use.

In this paper, we present an automated approach that enables experts to identify dark patterns at scale on the web. Our approach relies on (1) a web crawler, built on top of OpenWPM [24,39]—a web privacy measurement platform—to simulate a user browsing experience and identify user interface elements; (2) text clustering to extract recurring user interface designs from the resulting data; and (3) inspecting the resulting clusters for instances of dark patterns.
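To give intuition for the clustering step (2), the minimal sketch below groups recurring interface texts by a numeric-normalized template key, so that variants such as "Only 3 left!" and "Only 12 left!" fall into the same cluster. This is a deliberately simplified illustration, not the measurement pipeline itself; the segment data and function names are hypothetical.

```python
import re
from collections import defaultdict

def normalize(text):
    """Collapse digits to a placeholder token so that textually
    similar interface messages share a single template key."""
    return re.sub(r"\d+", "<NUM>", text.lower().strip())

def cluster_segments(segments):
    """Group (website, text) pairs by their normalized template."""
    clusters = defaultdict(list)
    for site, text in segments:
        clusters[normalize(text)].append((site, text))
    return clusters

if __name__ == "__main__":
    # Hypothetical text segments extracted from crawled product pages.
    segments = [
        ("shop-a.example", "Only 3 left in stock!"),
        ("shop-b.example", "Only 12 left in stock!"),
        ("shop-c.example", "9 people are viewing this right now"),
    ]
    for template, members in cluster_segments(segments).items():
        print(template, len(members))
```

An analyst then inspects each cluster once, rather than every page, which is what makes expert review tractable at the scale of tens of thousands of pages.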
We also develop a novel taxonomy of dark pattern characteristics so that researchers and regulators can use descriptive and comparative terminology to understand how dark patterns influence user decision-making.

While our automated approach generalizes, we focus this study on shopping websites. Dark patterns are especially common on shopping websites, used by an overwhelming majority of the American public [75], where they trick users into signing up for recurring subscriptions and making unwanted purchases, resulting in concrete financial loss. We use our web crawler to visit the ∼11K most popular shopping websites worldwide, and from the resulting analysis create a large data set of dark patterns and document their prevalence. In doing so, we discover several new instances and variations of previously documented dark patterns [31,47]. We also classify the dark patterns we encounter using our taxonomy of dark pattern characteristics.

We have five main findings:

• We discovered 1,841 instances of dark patterns on shopping websites, which together represent 15 types of dark patterns and 7 broad categories.

• These 1,841 dark patterns were present on 1,267 of the ∼11K shopping websites (∼11.2%) in our data set. Shopping websites that were more popular, according to Alexa rankings [9], were more likely to feature dark patterns. This represents a lower bound on the number of dark patterns on these websites, since our automated approach only examined text-based user interfaces on a sample of product pages per website.

• Using our taxonomy of dark pattern characteristics, we classified the dark patterns we discovered on the basis of whether they lead to an asymmetry of choice, are covert in their effect, are deceptive in nature, hide information from users, and restrict choice. We also map the dark patterns in our data set to the cognitive biases they exploit. These biases collectively describe the consumer psychology underpinnings of the dark patterns we identified.

• In total, we uncovered 234 instances of deceptive dark patterns across 183 websites. We highlight the types of dark patterns we discovered that rely on consumer deception.

• We identified 22 third-party entities that provide shopping websites with the ability to create dark patterns on their sites. Two of these entities openly advertised practices that enable deceptive messages.


We developed a taxonomy of dark pattern characteristics that allows researchers, policy-makers, and journalists to have a descriptive, comprehensive, and comparative terminology for understanding the potential harm and impact of dark patterns on user decision-making. Our taxonomy is based upon the literature on online manipulation [33,74,81] and dark patterns highlighted in previous work [31,47], and it consists of the following five dimensions, each of which poses a possible barrier to user decision-making:

• Asymmetric: Does the user interface design impose unequal weights or burdens on the available choices presented to the user in the interface? (We narrow the scope of asymmetry to refer only to explicit choices in the interface.) For instance, a website may present a prominent button to accept cookies but hide the opt-out button on another page.

• Covert: Is the effect of the user interface design choice hidden from users? A website may develop its interface design to steer users into making specific purchases without their knowledge. Often, websites achieve this by exploiting users' cognitive biases, which are deviations from rational behavior justified by some "biased" line of reasoning [50]. As a concrete example, a website may leverage the Decoy Effect [51], a cognitive bias in which an additional choice—the decoy—is introduced to make certain other choices seem more appealing. Users may fail to recognize that the decoy is present merely to influence their decision-making, so its effect remains hidden from them.

• Deceptive: Does the user interface design induce false beliefs, either through affirmative misstatements, misleading statements, or omissions? For example, a website may offer users a discount that appears to be limited-time but actually repeats when they visit the site again. Users may be aware that the website is trying to offer them a deal or sale; however, they may not realize that the influence is grounded in a false belief—in this case, because the discount recurs. This false belief affects users' decision-making, i.e., they might act differently if they knew the sale repeats.

• Hides Information: Does the user interface obscure or delay the presentation of necessary information to the user? For example, a website may not disclose, or may hide or delay the presentation of, information about charges related to a product.

• Restrictive: Does the user interface restrict the set of choices available to users? For instance, a website may only allow users to sign up for an account with existing social media accounts such as Facebook or Google, so that it can gather more information about them.
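The deceptive, recurring "limited-time" discount can be made concrete with a simple consistency check: a genuine deadline is fixed in absolute time, so if the deadline implied by a countdown shifts between visits, the urgency is likely manufactured. The sketch below is a hypothetical illustration of this idea (the function names and the assumption that countdown readings can be extracted are ours), not part of our measurement pipeline:

```python
def implied_deadlines(visit_times, seconds_remaining):
    """Convert countdown readings into the absolute deadline
    (in epoch seconds) that each reading implies."""
    return [t + r for t, r in zip(visit_times, seconds_remaining)]

def deadline_resets(visit_times, seconds_remaining, slack=60):
    """Flag a countdown whose implied deadline moves between visits
    by more than a small measurement slack."""
    deadlines = implied_deadlines(visit_times, seconds_remaining)
    return max(deadlines) - min(deadlines) > slack

# A timer showing "24:00:00 remaining" both on an initial visit and on
# a revisit two days later implies the deadline moved: the offer recurs.
print(deadline_resets([0, 172800], [86400, 86400]))  # prints True
# A timer that has counted down by exactly the elapsed time between
# visits implies a fixed deadline.
print(deadline_resets([0, 3600], [86400, 82800]))    # prints False
```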
In Section 5, we also draw an explicit connection between each dark pattern we discover and the cognitive biases it exploits. The biases we refer to in our findings are:
(1) Anchoring Effect [77]: The tendency of individuals to rely too heavily on an initial piece of information—the "anchor"—when making future decisions.
(2) Bandwagon Effect [72]: The tendency of individuals to value something more because others seem to value it.
(3) Default Effect [53]: The tendency of individuals to stick with options assigned to them by default, due to the inertia of the effort required to change the option.
(4) Framing Effect [78]: The phenomenon whereby individuals may reach different decisions from the same information depending on how it is presented, or "framed".
(5) Scarcity Bias [62]: The tendency of individuals to place a higher value on things that are scarce.
(6) Sunk Cost Fallacy [28]: The tendency of individuals to continue an action once they have invested resources (e.g., time and money) into it, even if that action would make them worse off.
We discovered a total of 22 third-party entities, embedded in 1,066 of the ∼11K shopping websites in our data set, and in 7,769 of the Alexa top million websites. We note that the prevalence figures from the Princeton Web Census Project data should be taken as a lower bound, since their crawls are limited to the home pages of websites. […] we discovered that many shopping websites only embedded them in their product—and not home—pages, presumably for functionality and performance reasons.

Many of the third parties advertised practices that appeared to be—and sometimes unambiguously were—manipulative: "[p]lay upon [customers'] fear of missing out by showing shoppers which products are creating a buzz on your website" (Fresh Relevance), "[c]reate a sense of urgency to boost conversions and speed up sales cycles with Price Alert Web Push" (Insider), "[t]ake advantage of impulse purchases or encourage visitors over shipping thresholds" (Qubit). Further, Qubit also advertised Social Proof Activity Notifications that could be tailored to users' preferences and backgrounds.
In some instances, we found that third parties openly advertised the deceptive capabilities of their products. For example, Boost dedicated a web page—titled "Fake it till you make it"—to describing how it could help create fake orders [12]. Woocommerce Notification—a Woocommerce platform plugin—also advertised that it could create fake social proof messages: "[t]he plugin will create fake orders of the selected products" [23]. Interestingly, certain third parties (Fomo, Proof, and Boost) used Social Proof Activity Messages on their own websites to promote their products.
These practices are unambiguously unlawful in the United States (under Section 5 of the Federal Trade Commission Act and similar state laws [43]), the European Union (under the Unfair Commercial Practices Directive and similar member state laws [40]), and numerous other jurisdictions.

We also find practices that are unlawful in a smaller set of jurisdictions. In the European Union, businesses are bound by an array of affirmative disclosure and independent consent requirements in the Consumer Rights Directive [41]. Websites that use the Sneaking dark patterns (Sneak into Basket, Hidden Subscription, and Hidden Costs) on European Union consumers are likely in violation of the Directive. Furthermore, user consent obtained through the Trick Questions and Visual Interference dark patterns does not constitute the freely given, informed, and active consent required by the General Data Protection Regulation (GDPR) [42]. In fact, the Norwegian Consumer Council filed a GDPR complaint against Google in 2018, arguing that Google used dark patterns to manipulate users into turning on the "Location History" feature on Android, thus enabling constant location tracking [46].

Source: Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites Draft: June 25, 2019 – dark-patterns.pdf