project[2]

lim jia sheng,
0344034.

BDCM
.Design Research & Methodology
::project[2]






project[2]: Critical Review

todo:

  • Find 5 research articles.
  • Gather notes on them.
  • Write critical review summarizing all 5 articles.

research:

The 5 articles I found were:

  1. Day, G., & Stemler, A. (2020). Are dark patterns anticompetitive? Alabama Law Review, 72(1), 1–46. HeinOnline. http://heinonline.org/HOL/Page?handle=hein.journals/bamalr72&div=5
  2. Hausner, P., & Gertz, M. (2021, May 8). Dark patterns in the interaction with cookie banners. CHI Conference on Human Factors in Computing Systems. https://dbs.ifi.uni-heidelberg.de/files/Team/phausner/publications/Hausner_Gertz_CHI2021.pdf
  3. Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M., & Narayanan, A. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–32. https://doi.org/10.1145/3359183
  4. Mathur, A., Kshirsagar, M., & Mayer, J. (2021). What makes a dark pattern... dark? CHI Conference on Human Factors in Computing Systems (CHI ’21), 1–18. Association for Computing Machinery. https://doi.org/10.1145/3411764.3445610
  5. Narayanan, A., Mathur, A., Chetty, M., & Kshirsagar, M. (2020). Dark patterns: Past, present, and future. Communications of the ACM, 63(9), 42–47. https://doi.org/10.1145/3397884

notes:

[1]: Dark Patterns in the Interaction with Cookie Banners

(Hausner & Gertz, 2021)

Main issues raised by writer:

  • The majority of cookies nowadays are used to track users for targeted ads instead of providing beneficial functionality to said users (Urban et al., 2020, as cited in Hausner & Gertz, 2021).
  • Most users do not want to be tracked this way, but are coerced into consenting to it anyway using dark patterns (Kulyk et al., 2018, as cited in Hausner & Gertz, 2021).
  • The EU introduced GDPR to counteract this issue by requiring explicit consent for cookie storage (Regulation (EU) 2016/679 of the European Parliament and of the Council, 2016, as cited in Hausner & Gertz, 2021). However, the resulting implementations of the legislation produced consent forms that vary wildly in quality & user friendliness.

Major interpretations:

  • This kind of manipulation can be described as using dark patterns.
  • Dark patterns can be interpreted as "an instance of interface interference, particularly a case of aesthetic manipulation".

Bias:

  • The author seems to argue for the consumer over the entities that are implementing dark patterns against said consumers.

How it relates to other literature/my own experience:

  • It ties in with other literature, such as "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites" (Mathur et al., 2019).
  • In my experience, traversing DOM nodes manually & running non-learning algorithms is enough to detect "clickables" of interest; determining their maliciousness shouldn't require neural networks, as their attributes (both HTML & CSS) are known & unobfuscated at runtime, enabling us to simply reconstruct subsets of pages & cluster them by the similarities of their (relatively) small number of attributes (see the sketch below).
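
To illustrate the kind of rule-based approach I mean, here is a minimal sketch (my own illustration, not any of the papers' methods; which attributes go into the fingerprint is an assumption on my part):

```typescript
// Minimal sketch: walk the DOM for likely "clickables", then group them by a
// small attribute fingerprint so visually similar controls cluster together.
// Choices in the same dialog that land in different clusters would be
// candidates for aesthetic manipulation.

type Clickable = {
  element: Element;
  fingerprint: string;
};

// Rule-based notion of a "clickable": real buttons/links/inputs, plus anything
// that advertises clickability via its role or cursor styling.
function isClickable(el: Element): boolean {
  const tag = el.tagName.toLowerCase();
  if (tag === "a" || tag === "button" || tag === "input") return true;
  if (el.getAttribute("role") === "button") return true;
  return getComputedStyle(el).cursor === "pointer";
}

// A coarse fingerprint built from a handful of attributes that tend to encode
// visual emphasis; exactly which attributes to include is an open choice.
function fingerprintOf(el: Element): string {
  const style = getComputedStyle(el);
  return [
    el.tagName.toLowerCase(),
    style.backgroundColor,
    style.color,
    style.fontWeight,
    style.display,
  ].join("|");
}

function collectClickables(root: ParentNode = document): Clickable[] {
  return Array.from(root.querySelectorAll("*"))
    .filter(isClickable)
    .map((element) => ({ element, fingerprint: fingerprintOf(element) }));
}

// Cluster by exact fingerprint equality (no ML involved).
function clusterByFingerprint(clickables: Clickable[]): Map<string, Element[]> {
  const clusters = new Map<string, Element[]>();
  for (const c of clickables) {
    const bucket = clusters.get(c.fingerprint) ?? [];
    bucket.push(c.element);
    clusters.set(c.fingerprint, bucket);
  }
  return clusters;
}

console.log(clusterByFingerprint(collectClickables()));
```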

General critique:

  • The working definition of a dark pattern they adopt from Gray et al. is very broad, marking any instance of "visual manipulation" as a dark pattern without acknowledging that such general UI patterns can be used with no malice/in line with users' interests, e.g. a close button marked red, or an emphasized title link.

  • It makes the statement that "it is clear that purely rule-based methods are not sufficient to tackle the problem of dark pattern detection" with no evidence to back it up; sophisticated rule-based systems already exist for tasks like ad/tracker filtering & website theme inversion.

  • It also doesn't elaborate on certain items, such as which "certain keywords" & "heuristics" were used, which would be needed to make the article's results reproducible.

[2]: Dark Patterns: Past, Present, and Future

(Narayanan et al., 2020)

Main issues raised by writer:

  • Companies & governments are using dark patterns to manipulate their userbases/citizens into performing/consenting to certain actions/agreements (Elliott & Waldron, 2019; Federal Trade Commission, 2019; Venkatadri et al., 2019; all as cited in Narayanan et al., 2020).

Major interpretations:

  • Dark patterns are simply a continuation of normalized hostile practices in various sectors — retail's deceptive practices, research/public policy's nudging, & design's growth hacking.
  • Nudging has been accepted widely outside of dark patterns, which poses a challenge for intervention as users might have become complacent about malicious practices.
  • Growth hacking's use of dark patterns can be direct (to grow) or indirect (to squeeze money from users after the growth spurt).
  • Data-driven UI optimization via A/B testing, which is commonplace in the current web design landscape, makes creating UIs with dark patterns easier (see the sketch below).
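
As a concrete (hypothetical) illustration of why A/B testing lowers the bar: bucketing users into variants takes only a few lines of code, so shipping a more coercive variant to a slice of traffic & keeping whichever converts better requires almost no extra effort. The variant & experiment names below are made up, not from the article.

```typescript
// Deterministic A/B bucketing sketch (illustration only).
// A stable hash of the user ID routes each user to one UI variant; the same
// user always sees the same variant, so conversion metrics stay comparable.

const VARIANTS = ["control", "aggressive-countdown"] as const;
type Variant = (typeof VARIANTS)[number];

// Small non-cryptographic string hash (FNV-1a) for stable bucketing.
function hash(input: string): number {
  let h = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    h ^= input.charCodeAt(i);
    h = Math.imul(h, 0x01000193);
  }
  return h >>> 0;
}

function assignVariant(userId: string, experiment: string): Variant {
  return VARIANTS[hash(`${experiment}:${userId}`) % VARIANTS.length];
}

// Hypothetical usage: assign a user for a "checkout-urgency" experiment.
console.log(assignVariant("user-42", "checkout-urgency"));
```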

Bias:

  • The article seems relatively unbiased, taking the stance of the designer, user, & employer, aiming to create a world where all three of their interests intersect.

How it relates to other literature/my own experience:

  • It builds on concepts clearly defined in literature from other fields; these concepts mirror dark patterns in inherent functionality, but not in environment.

General critique:

  • It puts the responsibility of avoiding dark patterns on designers themselves, even though the incentive/motive for dark patterns does not directly originate from the designer.
  • The article attempts to correlate growth hacking with dark patterns; however, it does not justify how growth hacking differs from other business strategies, such as gamification, which may employ the same manipulation tactics.
  • It cites Google's A/B test of changed ad labels as an example against A/B testing; however, one could argue that it is a perfect example of A/B testing working well: the organization can receive responses from diverse segments of users, & subsequently iterate on its product using either the backlash or the positive remarks received, without affecting the entire userbase.

[3]: What Makes a Dark Pattern... Dark?

(Mathur et al., 2021)

Main issues raised by writer:

  • Occurrences of dark patterns are frequent in contexts such as privacy settings, online gaming, & online shopping (Bösch et al., 2016; Zagal et al., 2013; Mathur et al., 2019; all as cited in Mathur et al., 2021).
  • Currently, though, the academic foundation upon which articles mentioning dark patterns are written is lacking: while said articles clearly describe designs their authors are familiar with, they fail to draw the line on where a UI becomes a dark pattern & what makes such dark patterns problematic.

Major interpretations:

  • It distils dark patterns down to patterns that affect how users make their choices. This was derived from a comparison of dark pattern definitions from various sources: it looked at the specific terms used in each definition & categorized them; it looked at the types described, how they were collected, & noted how their methodologies are flawed; & it split each dark pattern into several attributes that affect "choice architecture".
  • From all of that, it conceptualizes a set of themes that describe each type of dark pattern.
  • It compares dark patterns to multiple other fields, including the "nudge" & "sludge" of behavioural economics (defined as initiatives that steer people while still allowing them to go their own way), the subversion of decision making through "invisible" manipulation in philosophy/ethics, as well as how marketers skirt the law & exploit humans' non-rational behaviour to sell products.
  • It attempts to define dark patterns normatively through multiple lenses — individual welfare, where a dark pattern is any pattern that harms an individual, whether monetarily or otherwise; collective welfare, where a dark pattern is any pattern that harms society or a group of individuals; regulatory objectives, where a dark pattern is any pattern that nudges users into undermining certain regulatory objectives; & individual autonomy, where a dark pattern is any pattern that influences a user's decision process in any way.
  • It lists surveys, ethnography, lab/field studies, field deployments, & experiments as empirical ways to measure dark patterns, along with methods for applying each of these through each of the lenses mentioned above.

Bias:

  • The article seems to be relatively unbiased, speaking from all fields & only about the topic at hand without trying to argue for or against any party.

How it relates to other literature/my own experience:

  • The article takes a lot from other literature & digests it into one source. It acknowledges their flaws as well as their stand-out points, using its interpretations of them to establish its own definition.
  • Mathur also refers to his own previous article, "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites", as a basis for how dark patterns are used & their prevalence.

General critique:

  • In describing how to apply the discussed methods to analyze dark patterns, the article mentions "neutral designs" & "baselines" but fails to provide any method for deriving such a design from one with dark patterns. If the design is synthetic, how would one know the dark pattern aspect has been removed (as one would need to know there's a dark pattern in the first place to be able to remove it)? If the design is from the real world, how would one know other dark patterns aren't being applied that might influence the result (if we use Twitter's UI to evaluate Facebook's, who's to evaluate Twitter's)?
  • The article attempts to resolve this by specifying the use of multiple variations of the dark-pattern-bearing design, but which attribute to vary, how one would determine whether the varied attribute has contributed any side effects, & what happens if varying an attribute produces a result even deeper into dark pattern territory than the initial subject, are questions left unaddressed.

[4]: Are Dark Patterns Anticompetitive?

(Day & Stemler, 2020)

Main issues raised by writer:

  • Current legislative protections against anticompetitive behaviour do not prohibit dark patterns; they actively label them as procompetitive, since marketing (whether deceptive or not) & pushing a product onto users are viewed as part of competition.
  • Such laws also only focus on economic harms done to the consumer, while disregarding the premier online currencies of privacy & attention sought after by "free" platforms (Lao, 1997, as cited in Day & Stemler, 2020).
  • Tech giants have used their advantage in amount of information collected to influence the choice of their users (McGill, 2019, as cited in Day & Stemler, 2020).

Major interpretations:

  • Dark patterns create anticompetitiveness by providing large companies with more data (via capturing attention & subsequently influencing a user's behaviour to expose more of their private info) that can be analyzed & used to further ascend over their competitors, which may not have as much data due to being smaller/having a smaller userbase; a positive feedback loop.
  • Dark patterns (e.g. Privacy Zuckering) take advantage of System 1 decision making, which produces hasty results, akin to "how an American would absentmindedly look the wrong way when crossing into London traffic" (Kahneman, 2003, as cited in Day & Stemler, 2020).
  • A price-focused anticompetitive legislative framework lets exclusionary conduct by "free" services remain on legal ground.
  • Non-price injuries caused by such "free" platforms are also hard to prove, as users are generally satisfied with said platforms (Manne & Sperry, 2015, as cited in Day & Stemler, 2020).
  • Digital manipulation via coercion, instead of merely persuasion, is considered anticompetitive.
  • As the article puts it, "one's attention is finite, digital manipulation can erect barriers to entry where consumers compulsively use a platform to the exclusion of upstarts; this can hook users onto the platform even though superior interfaces exist".
  • Digital manipulation as a result of dark patterns entails the manipulation of a user's usage of a platform while providing them a qualitatively worse product.
  • Both antitrust laws & the FTC's Section 5 powers may act as powerful tools to fight against dark patterns & promote decisional privacy. However, antitrust law achieves this by forcing competition, whilst consumer protection law achieves it regardless of competition in the space.

Bias:

  • It argues against corporations' tactics & attempts to use existing policies as well as create new ones to take them down, whilst not purposely appealing to any single party.

How it relates to other literature/my own experience:

  • It draws on quite a lot of citations from various industries & sources, including law, economics, psychology, & HCI.

General critique:

  • This article includes a lot of information, however some of it swerves far away from the intended topic without providing significant support for it — e.g. discussing merger policy to argue that privacy is a currency incentivising dark patterns, or Actavis's anticompetitive behaviour to argue that coercion is the key to dark patterns being anticompetitive.
  • It reads as extremely scattered, trying to cover far too many aspects at once — digital manipulation, monopolistic behaviours, attention-driven addiction, & privacy.

[5]: Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites

(Mathur et al., 2019)

Main issues raised by writer:

  • There exists no large-scale evidence for dark patterns' reach in modern UI design.
  • There exists no investigation into how different types of dark patterns cause harm to users.

Major interpretations:

  • The majority of impulse-buying-facilitating elements on a page do not constitute dark patterns.
  • The "primary interaction path" (e.g. the path the user has to take to buy an item on an e-commerce site) is where dark patterns are most prominent.
  • Many instances of dark patterns contain similar textual content (e.g. confirm-shaming patterns often begin with "no thanks").
  • 1,818 instances of dark patterns were found on 1,254 websites (~11.1%) in the sieved dataset of ~11K shopping websites.
  • 7 instances of the "Sneak into Basket" dark pattern were found — when an additional product is added to the cart without a user's consent, or when a checkbox is ticked by default. It is only partially deceptive but fully information hiding.
  • 5 instances of the "Hidden Costs" dark pattern were found — when a website adds "service fees" or "handling fees" that are only revealed at checkout. It is only partially deceptive but fully information hiding.
  • 14 instances of the "Hidden Subscription" dark pattern were found — often appearing alongside the "Hard to Cancel" pattern, this is when a recurring fee is charged under the disguise of a one-time payment/free trial, with the user not becoming aware of it until some time later. It is only partially deceptive but fully information hiding.
  • 393 instances of the "Countdown Timer" dark pattern were found — when a dynamic indicator of "how much time is left" is shown for an offer or sale. It is at least partially covert & partially deceptive.
  • 88 instances of the "Limited-time Message" dark pattern were found — when a static indicator with no set timeline is shown for an offer or sale. It purposely withholds information from the user & is also partially covert.
  • 169 instances of the "Confirmshaming" dark pattern were found — when language is used that makes the user feel "guilty" for making a certain choice. It fully exhibits the attribute of asymmetry.
  • 25 instances of the "Visual Interference" dark pattern were found — when visual elements are used to manipulate a user away from a choice, either by making a certain choice appear more appealing or by inflating an offer. It is partially asymmetric, partially deceptive, but fully covert.
  • 9 instances of the "Trick Questions" dark pattern were found — when confusing language is used to manipulate a user away from making a certain choice. It is fully asymmetric & fully covert.
  • 67 instances of the "Pressured Selling" dark pattern were found — when defaults or high-pressure tactics (anchoring, scarcity bias) are used to sell a user additional products/services. It is partially asymmetric & partially covert.
  • 313 instances of the "Activity Message" dark pattern were found — when a (commonly persistent & attention-grabbing) notification, either dynamic or static, exists on the site to indicate other users' activity. It is partially covert & partially deceptive.
  • 12 instances of the "Testimonials" dark pattern were found — when testimonials from previous users, whose origin cannot be objectively determined, are used to manipulate new users. It is partially deceptive.
  • 632 instances of the "Low-stock Message" dark pattern were found — when a message is shown signalling to the user that stock of an item might be low/limited, increasing its desirability. It is partially covert, partially deceptive, & partially information hiding.
  • 47 instances of the "High-demand Message" dark pattern were found — when a message is shown signalling to the current user that a specific item is highly desired by other users. It is partially covert.
  • 31 instances of the "Hard to Cancel" dark pattern were found — similar to the "Roach Motel" dark pattern, this is when an obstruction prevents a user from cancelling a subscription/membership easily, even though signing up was simple. It is restrictive & partially information hiding.
  • 6 instances of the "Forced Enrollment" dark pattern were found — when users are forced to perform a side action to complete their main goal. It is asymmetric & restrictive.
  • The by far most common offender is the "Low-stock Message", at 632 out of the 1,818 instances of dark patterns (~34.8%).
  • There are entities providing dark patterns as a service, with 1,069 out of the 1,254 websites using them in some capacity (~85.2%).

Bias:

  • It was fair, not swaying toward any party & referring only to data it gathered itself.

How it relates to other literature/my own experience:

  • It is the basis for the definitions used in "What Makes a Dark Pattern... Dark?" (Mathur et al., 2021) & is cited by 92 other articles for its findings & taxonomy (ResearchGate, n.d.).
  • Its use of OpenWPM, & subsequently Selenium, was probably a good choice for browser control, as it enabled them to "act" as regular users via fingerprint spoofing & thus bypass many of the bot-detection frameworks in place on e-commerce sites (Goßen & Jonker, 2020).
  • It used clustering ML algorithms, as in "Dark Patterns in the Interaction with Cookie Banners" (Hausner & Gertz, 2021); however, it only uses them for data gathering & puts the bulk of the actual dark pattern analysis & detection on human experts (a simplified sketch of this kind of clustering follows below).
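
To make that clustering step concrete, here is a much-simplified stand-in (my own illustration; the authors' actual pipeline used OpenWPM & proper clustering algorithms rather than this greedy grouping): it groups scraped text segments by token overlap so that near-duplicate messages, such as low-stock notices, surface as one candidate pattern for a human to review.

```typescript
// Greedy grouping of scraped text segments by Jaccard similarity of their
// token sets (simplified illustration of clustering-for-data-gathering).

function tokensOf(text: string): Set<string> {
  return new Set(
    text
      .toLowerCase()
      .replace(/\d+/g, "#") // numbers vary per page; normalize them away
      .split(/[^a-z#]+/)
      .filter(Boolean),
  );
}

function jaccard(a: Set<string>, b: Set<string>): number {
  const intersection = [...a].filter((t) => b.has(t)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : intersection / union;
}

// Attach each segment to the first cluster whose representative is similar
// enough; otherwise start a new cluster.
function cluster(segments: string[], threshold = 0.6): string[][] {
  const clusters: { rep: Set<string>; members: string[] }[] = [];
  for (const segment of segments) {
    const toks = tokensOf(segment);
    const home = clusters.find((c) => jaccard(c.rep, toks) >= threshold);
    if (home) home.members.push(segment);
    else clusters.push({ rep: toks, members: [segment] });
  }
  return clusters.map((c) => c.members);
}

// Example: the two low-stock messages group together, as do the two
// activity messages.
console.log(
  cluster([
    "Only 3 left in stock!",
    "Only 12 left in stock!",
    "14 people are viewing this right now",
    "Hurry, 2 people are viewing this right now",
  ]),
);
```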

General critique:

  • Its reliance on humans to detect dark patterns may create misses or false positives, as they're relying on their subjective view of such patterns.
  • It is not clear what the authors did with the clustered data: it would seem that only textual data remained from the initial websites, yet the analysis describes categorizing dark patterns based on style & research based on the actual flow of the website.

feedback:

  • 30/9/2021
    • We do not need to cross-compare, cite, & derive subtopics at this stage of research; mainly we’re judged on how well we understand the content.
    • Our critical review should note for each article, its relevance, its findings, & its critiques.
    • If there are figures that may benefit my own research topic, I may use them with citation.
    • Our final submission will be one report on 5 of the articles in relative isolation (one subgroup per article, instead of subgroups between articles)
  • 7/10/2021
    • I should add the references to the introduction heading.
    • Critique length is not as important as critique content; as long as the critique looks through things like logical flaws & refers to past experiences it is good.
    • Some articles are straightforward & leave space for interpretation.

final:

Figure 1.1.1, Final critical review, 10/10/2021

reflection:

Reading through all those articles was tedious at times, but they contained really cool information, such as the parts relating to law & economics. It gave me the opportunity to explore new knowledge I'd otherwise never have had the chance to. After absorbing everything, we'd then argue against parts of them, addressing the good & the bad while proposing solutions. This gave a sense of progression, as the act of finding holes in research & filling them is inherently helpful to the entire research community.

The main thing I observed through this project was how articles are structured & how their style of writing differs from regular readable material. They use a lot of objective & passive language while still managing to mix in their own substantiated opinions. This is a balance I'll definitely have to work towards, but I know one day it'll come more naturally. I also observed that there were a lot of types of citations, & unfortunately they don't all fit together perfectly. I worked around this by scraping my own information from the papers rather than relying on what was provided.

At the end of the day, writing this critical review taught me a lot, & I'll be able to use what I've learnt in my own writing or simply in assessing the arguments of others & being critical. It really brings a scientific aspect to design, demystifying its unknowns.
