Lawmakers are getting wise to online companies’ manipulative user interface design practices. Congressional leaders in the US unveiled a new bill this week that would ban the use of ‘dark patterns’ by large online players.
What are these dark patterns? Senator Mark Warner, one of the Act’s sponsors, describes them as design choices based on psychological research. They are…
…frequently used by social media platforms to mislead consumers into agreeing to settings and practices advantageous to the company.
Warner’s Deceptive Experiences To Online Users Reduction (DETOUR) Act makes it illegal for online companies with over 100 million users to design interfaces that aim at:
Obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.
What kinds of techniques are we talking about, and what decisions do they coerce users into making?
The website darkpatterns.org, created by user experience consultant Harry Brignull, calls out several kinds of manipulative user interface behaviours with some delightful names.
These include confirmshaming, which guilts the user into opting into something. You’ll have seen this on passive-aggressive websites that try to make you sign up for mailing lists: instead of simply offering a ‘No’ option, they’ll say something like “no, I don’t want to stay abreast of current industry trends”.
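To make the pattern concrete, here’s a minimal sketch in TypeScript. The dialog copy and type names are illustrative inventions, not taken from any real site; the point is simply the contrast between a shaming decline label and a neutral one.

```typescript
// Two versions of the same newsletter dialog: one uses confirmshaming,
// the other presents both choices neutrally.

interface DialogCopy {
  accept: string;
  decline: string;
}

// Dark pattern: the decline option shames the user for saying no.
const confirmshaming: DialogCopy = {
  accept: "Yes, send me the newsletter",
  decline: "No, I don't want to stay abreast of current industry trends",
};

// Neutral alternative: both choices are stated plainly.
const neutral: DialogCopy = {
  accept: "Subscribe to the newsletter",
  decline: "No thanks",
};

console.log(confirmshaming.decline);
console.log(neutral.decline);
```

The information presented is identical in both cases; only the emotional framing of the ‘No’ button changes, which is exactly what makes the technique hard to regulate.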
Other examples include Privacy Zuckering, which tricks users into publicly sharing more information about themselves than they intended. Guess who it’s named after?
Another, the Roach Motel, is an interface that makes it easy to sign up for something, but buries the option to leave in an obscure part of the site, or forces you to speak to a human operator.
These design techniques can also steer users into giving up their privacy rights, which is something that regulators in Europe have been upset about. Last June, the Norwegian Consumer Council published a report called Deceived by Design. It attacked Facebook and Google for manipulating users to give up the privacy options granted to them by GDPR.
The proposed legislation would make large online companies tell users at least once every 90 days if they are experimenting with interfaces designed to promote engagement or “product conversion”, which typically means encouraging a purchase.
Any such experiments would also have to be approved by an independent review board registered with the Federal Trade Commission.
The Act also guards against self-serving self-regulation by imposing strict rules on the formation of professional standards bodies. The worry seems to be that large online companies could otherwise create such a body themselves, then use it to write their own guidelines for user interface design.
The legislation forces any such professional standards body created by industry to have at least one director representing the users, rather than the online companies that created it. It would also need explicit rules to prevent manipulative interface design.
One notable inclusion in the Act protects children. Under the legislation, it would be unlawful to craft user interfaces…
…with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user.
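A hypothetical sketch of what compliance might look like for the auto-play case, again in TypeScript. The `UserSettings` type and `shouldAutoplayNext` function are assumptions for illustration; the Act itself specifies no API, only that auto-play must not be initiated without the user’s consent.

```typescript
// Consent-gated autoplay: the next video starts automatically only if
// the user has explicitly opted in beforehand.

interface UserSettings {
  // Hypothetical stored preference; defaults to false (no consent).
  autoplayConsent: boolean;
}

function shouldAutoplayNext(settings: UserSettings): boolean {
  // The dark-pattern version would simply `return true`, playing the
  // next video regardless of what the user chose. The compliant
  // version checks the explicit opt-in first.
  return settings.autoplayConsent;
}

console.log(shouldAutoplayNext({ autoplayConsent: false })); // false
console.log(shouldAutoplayNext({ autoplayConsent: true }));  // true
```

The design choice worth noting is the default: consent is off until the user turns it on, rather than on until the user finds the setting to turn it off.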
These rules represent a firm pushback against the manipulative practices that large online companies use to steer their users down certain paths. If the Act passes into law, it’ll be interesting to see how forcefully it’s applied.