[Illustration: a person holding up a phone, surrounded by watchful eyes. Credit: DrAfter123/Getty Images]


Can online retailers guess how much consumers are willing to pay for their products?

What prompted economist Itay Fainmesser and two colleagues to write their latest research paper is the widespread belief that, when it comes to online data collection, companies know everything about everybody and potentially use that information to take advantage of customers. "That includes how much I'm willing to pay for a product, so they'll make me pay that," says Fainmesser, an associate professor of economics at Johns Hopkins Carey Business School.

Makes sense; we've been hearing about online privacy abuse for years. In 2018, for example, news broke that Cambridge Analytica had used the personal data of millions of Facebook users without their consent for targeted political ads. And a decade ago, a few online travel brands tailored prices for flights and hotels based on the devices users searched from—Mac versus PC, Windows versus iOS—the idea being that device choice signals financial worth. "That's old-school profiling based on very little information," Fainmesser says. Today, he adds, companies use "much richer datasets with artificial intelligence capabilities."

For their new paper "Consumer Profiling via Information Design," Fainmesser partnered with Andrea Galeotti of the London Business School and Ruslan Momot of the University of Michigan Ross School of Business to test their hypothesis with a theoretical model. That model assumes that "marketplace platforms"—sites like eBay, StubHub, and Airbnb that do not sell their own products but rather act as intermediaries between buyers and sellers—share with third-party sellers the purchase histories of users, along with background information gleaned from their accounts. That includes age, ethnicity, even income, "which they can estimate without a pay stub," Fainmesser says.


Reassuringly, the researchers found that this kind of consumer profiling is more likely to be beneficial than harmful to would-be purchasers. Because sellers do not know what buyers might be willing to pay—buyers have simply purchased at whatever prices were posted—the sellers are at a disadvantage. For example: StubHub might share with ticket resellers your income and how much you paid to see comedian David Cross perform stand-up last year, but that doesn't reveal what you'd actually be willing to pay. You might be willing to pay much more, yet it is risky for the seller to price too high, because you might end up not purchasing at all. One big benefit for buyers is that the information shared by platforms can include data on what is looked at but not purchased. "It's saying, in essence, 'That price didn't work, you might want to lower it,'" Fainmesser explains.
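The seller's tradeoff described above can be sketched numerically. This is not the authors' model, just a minimal illustration with made-up numbers: a seller posts one price for a buyer whose willingness to pay (WTP) is uncertain, and expected revenue balances a higher price against the risk of losing the sale.

```python
# Hypothetical sketch of pricing under uncertain willingness to pay (WTP).
# The buyer purchases only if WTP >= price; the seller maximizes expected
# revenue over a set of equally likely WTP values. All numbers are invented.

def expected_revenue(price, wtp_values):
    """Expected revenue from posting `price` when the buyer's WTP is
    equally likely to be any value in `wtp_values`."""
    sales = sum(1 for w in wtp_values if w >= price)
    return price * sales / len(wtp_values)

# Broad uncertainty: past purchases alone suggest WTP is anywhere
# from $60 to $140.
broad = list(range(60, 141, 10))

# Sharper profile: richer data narrows the estimate to $100-$140.
narrow = list(range(100, 141, 10))

best_broad = max(range(60, 141), key=lambda p: expected_revenue(p, broad))
best_narrow = max(range(60, 141), key=lambda p: expected_revenue(p, narrow))

print(best_broad, round(expected_revenue(best_broad, broad), 2))    # 70 62.22
print(best_narrow, round(expected_revenue(best_narrow, narrow), 2)) # 100 100.0
```

Under broad uncertainty the optimal price stays low ($70), because pricing high forfeits too many sales; with a sharper profile the seller can confidently charge $100. This is the sense in which poor information disadvantages sellers, and why pricing too high is risky.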

The researchers do, however, issue a warning: regulators, they write, should be wary of any platform that can directly set prices. Such platforms may have long-term incentives and may be willing to price higher in order to learn more about customers' preferences, even at the cost of losing a sale today. "If they do that," Fainmesser says, "they can learn more and appropriate more value from consumers."

That warning covers both platforms that sell their own products and hybrid platforms like Amazon, which sell their own products while also acting as intermediaries. Because such platforms set prices themselves, the question is whether they're using personal data to tailor those prices—for example, charging one customer $350 and another $500 for the same product. "We know they have the technology to do that, but we don't know if it's happening," Fainmesser says.

Consumer privacy concerns are not limited to the possibility of targeted pricing. In their 2022 paper, "Digital Privacy," the same team of researchers studied the dangers stemming from data getting into the hands of third parties.

"The more data that's available on these platforms, the more hackers or adversaries will come looking for it," Fainmesser says. "And the more they come, the less users will want to use the platform."

Soon, Fainmesser says, AI could enable adversaries like hackers, identity thieves, and authoritarian governments to reach out in more familiar ways. "If there's lots of information about us available, and strong tools to use it, a machine can behave as if it knows me," he explains. And, indeed, many already do. Scam artists—using AI and posing via phone, email, and text as everything from bank employees to law enforcement officers—steal billions of dollars from U.S. consumers every year.

So what's the solution? Vigilance is beneficial, like knowing which streets are well lit when traveling after dark, Fainmesser says. But keeping the streets safe for everyone is ultimately the government's responsibility. In "Digital Privacy," the paper's authors recommend a two-pronged policy: requiring that platforms provide a minimum level of data protection and then either fining those that allow data breaches to occur or directly taxing data collection.

Academics share such findings with regulators to help them stay abreast of AI and potential adversaries. "Big tech companies have tons of resources, and they move ahead in areas that aren't regulated," Fainmesser explains. "Policymakers want to figure out what they should and shouldn't allow. We're working together on that, which is encouraging."