How E-Commerce Sites Manipulate You Into Buying Things You May Not Want

When potential customers visit the online resale store ThredUp, messages on the screen regularly tell them just how much other users of the site are saving.

“Alexandra from Anaheim just saved $222 on her order,” says one message next to an image of a bright, multicolored dress. It’s a common technique on shopping websites, intended to capitalize on people’s desire to fit in with others and to create a “fear of missing out.”

But “Alexandra from Anaheim” did not buy the dress. She does not exist. Instead, the website’s code pulled combinations from a preprogrammed list of names, locations and items and presented them as actual recent purchases.

The fake messages are an example of “dark patterns,” devious online techniques that manipulate users into doing things they might not otherwise choose to. They are the digital version of timeworn tactics used to influence consumer behavior, like impulse purchases placed near cash registers, or bait-and-switch ads for used cars.

Sometimes, the methods are clearly deceptive, as with ThredUp, but often they walk a fine line between manipulation and persuasion: Think of the brightly colored button that encourages you to agree to a service, while the link to opt out is hidden in a drop-down menu.

Web designers and consumers have been highlighting examples of dark patterns online since Harry Brignull, a user-experience consultant in Britain, coined the term in 2010. But interest in the tools of online influence has intensified in the past year, amid a series of high-profile revelations about Silicon Valley companies’ handling of people’s private information. An important element of that discussion is the notion of consent: what users are agreeing to do and share online, and how far businesses can go in leading them to make decisions.

The prevalence of dark patterns across the web is unknown, but in a study released this week, researchers from Princeton University have started to quantify the phenomenon, focusing first on retail companies. The study is the first to systematically examine a large number of sites. The researchers developed software that automatically scanned more than 10,000 sites and found that more than 1,200 of them used techniques that the authors identified as dark patterns, including ThredUp’s fake notifications.

The report coincides with discussions among lawmakers about regulating technology companies, including through a bill proposed in April by Senators Deb Fischer, Republican of Nebraska, and Mark Warner, Democrat of Virginia, that is meant to limit the use of dark patterns by making some of the techniques illegal and giving the Federal Trade Commission more authority to police the practice.

“We are focused in on a problem that I think everyone recognizes,” said Ms. Fischer, adding that she became interested in the issue after being annoyed by the techniques in her own experience online.

The legislation faces uncertain prospects, in part because the language defining dark patterns and the companies that would be subject to the new law is ambiguous, said Woodrow Hartzog, a law and computer science professor at Northeastern University. Still, he added, it is an important first step for policymakers in discussing dark patterns.

“The important question as a policy matter is what separates a dark pattern from good old-fashioned advertising,” he said. “It’s a notoriously difficult line to find — what’s permissible persuasion vs. wrongful manipulation.”

The Princeton study identified dark-pattern techniques across the web by automatically scanning the sites’ text and code.

On ThredUp, for example, the researchers saw the website create the messages in April using code that randomly selected combinations from a list of 100 names, 59 locations and 82 items. The New York Times replicated the results. On one day this month, the code led to messages in which “Abigail from Albuquerque” appeared to buy more than two dozen items, including dresses in sizes 2, 4, 6 and 8. On other occasions, it yielded messages showing different people “just” buying the same secondhand item days or months apart.
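The mechanism the researchers described is simple enough to sketch. The short Python example below illustrates the general technique of assembling such notices by drawing at random from fixed lists. It is an illustration only, not ThredUp’s actual code; the names, locations, items and dollar range are hypothetical placeholders.

```python
import random

# Illustrative placeholder lists. The article describes a preprogrammed list of
# 100 names, 59 locations and 82 items; none of those lists are reproduced here.
NAMES = ["Alexandra", "Abigail", "Maria"]
LOCATIONS = ["Anaheim", "Albuquerque", "Austin"]
ITEMS = ["a multicolored dress", "a denim jacket", "a pair of sandals"]


def fake_purchase_notice() -> str:
    """Assemble a message that looks like a live purchase but is invented."""
    name = random.choice(NAMES)
    location = random.choice(LOCATIONS)
    item = random.choice(ITEMS)
    savings = random.randint(20, 250)  # hypothetical dollar range, not from the site
    return f"{name} from {location} just saved ${savings} on {item}"


if __name__ == "__main__":
    print(fake_purchase_notice())
```

Because every element is drawn independently at random, the same invented shopper can appear to buy dozens of unrelated items, which is exactly the kind of inconsistency the researchers observed.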

When asked about the notices, a ThredUp spokeswoman said in an emailed statement that the company used “real data” and that it included the fake names and locations “to be sensitive to privacy.” When asked whether the messages represented actual recent purchases, the company did not respond.

The number of sites the researchers found using dark patterns underestimates the techniques’ overall prevalence online, said Arunesh Mathur, a Princeton doctoral student and an author of the paper. The researchers’ software focused on text, and it scanned only retail stores’ pages and not travel booking sites, social media services or other areas where such tactics might be used. The study, he added, was also confined to patterns used to influence purchasing behavior, not data-sharing or other activities.
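To illustrate the general idea of scanning page text for telltale phrasing, the Python sketch below fetches a page and checks its raw HTML against a few phrases associated with well-known dark patterns. It is not the researchers’ tooling, which crawled and rendered thousands of pages; the URL and phrase list here are hypothetical examples.

```python
import re
import requests  # third-party HTTP library

# Hypothetical phrases associated with common dark patterns.
PATTERNS = {
    "low-stock message": re.compile(r"only \d+ left in stock", re.IGNORECASE),
    "countdown urgency": re.compile(r"offer ends in \d+ (minutes|hours)", re.IGNORECASE),
    "activity notification": re.compile(r"just (bought|saved|purchased)", re.IGNORECASE),
}


def scan_page(url: str) -> list[str]:
    """Return the labels of any dark-pattern phrases found in a page's HTML."""
    html = requests.get(url, timeout=10).text
    return [label for label, pattern in PATTERNS.items() if pattern.search(html)]


if __name__ == "__main__":
    # Hypothetical URL used only to show how the function would be called.
    print(scan_page("https://example.com/product/123"))
```

A real crawler would also have to render JavaScript-generated content and distinguish visible text from hidden markup, which is far harder than this sketch suggests.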

More than 160 retail sites used a tactic called “confirmshaming,” which required users who wanted to avoid signing up or buying something to click a button that said something like “No thanks! I’d rather join the ‘Pay Full Price for Things’ club.”

More than two dozen sites used confusing messages when encouraging users to sign up for emails and other services. On a New Balance athletic apparel site, for instance, the first part of one message suggested that a user could check a box to receive emails, but on closer reading, the opposite was true. “We’d love to send you emails with offers and new products,” it said, “but if you do not wish to receive these updates, please tick this box.”

The New Balance opt-out language is “legally compliant and we believe clear to consumers,” Damien Leigh, senior vice president of global direct-to-consumer sales for the company, said in a statement. But he added that the company “is always looking for ways to be as transparent as possible with consumers and will evaluate the study’s insight when it is released.”

About 30 sites made it easy to sign up for services but particularly hard to cancel, requiring phone calls or other procedures. The Times requires people to talk with a representative online or by phone to cancel subscriptions, but the researchers did not study it or other publishing sites.

Most sites identified by the researchers used messages indicating that products were popular, that few items remained in stock or that products would be available only for a limited time. Some of those messages were demonstrably false, while the accuracy of others was unclear.

There is disagreement about whether messages about things like high demand constitute a dark pattern if they are truthful. But even those based on actual site activity are an attempt to play on consumers’ known weaknesses, said Arvind Narayanan, a Princeton computer science professor and an author of the paper.

“We are not claiming that everything we categorize in the paper should be of interest to government regulators,” he said. “But there should at least be more transparency about them so that online shoppers can be more aware of how their behavior is being nudged.”
