Senior and student discounts don’t tend to be controversial, but mortgage concessions that favor males over females, “pink taxes” on women’s personal care items, and dating apps that charge older users higher fees can spark outrage among consumers.
Price discrimination, the practice of charging different consumers different prices for the same product or service, is widespread, and artificial intelligence is making it easier than ever for companies to determine who is willing to pay more.
A paper recently published in the Journal of the Association for Consumer Research examined consumer perceptions of demographic-based price discrimination, “a common but controversial pricing practice known to elicit considerable backlash,” according to the researchers.
It found that consumers often feel exploited when companies charge different prices based on data like age and gender, but they’re more tolerant of the practice if they learn that an algorithm, rather than a person, set the prices.
“People feel less judged when they find out it’s an algorithm versus a human. Algorithms are not trying to exploit you. They see it more as just math, not personal,” said Alix Barasch, associate marketing professor in the Leeds School of Business, who co-authored the paper with an assistant professor at the University of Southern California’s Marshall School of Business and a business and marketing professor at Columbia Business School.
The paper describes consumer backlash against Apple in 2019, when it came to light that its Apple Card offered smaller lines of credit to women than to men. Allegations of gender discrimination blew up on Twitter when a tech entrepreneur tweeted that Apple Card offered him a credit limit 20 times greater than his wife’s, although they filed joint tax returns and she had a higher credit score. Other users voiced similar experiences.
When his wife complained to Apple’s customer service, according to his tweet, the representative replied: “I swear we’re not discriminating; it’s just the algorithm.” Following an investigation, the New York State Department of Financial Services found that Apple’s banking partner, Goldman Sachs, did not discriminate based on sex.
Despite the ruling, the researchers felt the incident raised important questions about how companies might benefit from disclosing the role algorithms play in setting prices.
In one study, the researchers asked more than 350 participants to imagine shopping for an insurance policy and encountering an article revealing that the insurer set prices based on consumers’ personal characteristics, such as age, gender and socioeconomic status.
Some participants learned that a sales manager determined prices, while others were told an automated algorithm set them. All were asked to report how likely they were to purchase from the insurance provider and to rate how fair they perceived the pricing to be.
The researchers found that demographic price discrimination was seen as fairer when participants believed algorithms were responsible for setting prices, and they were more likely to make a purchase in these cases.
Gut reactions to demographic discrimination
While consumers view algorithmic pricing as more justified when it comes to demographic price discrimination, the researchers found that the same doesn't hold true when companies use algorithms to charge different prices based on factors like time and demand (such as surge pricing for ride-hailing apps) or when algorithms aim to promote a positive social outcome (such as charging economically disadvantaged customers less).
“People are still upset about demographic discrimination, even if it's pro-social,” Barasch said.
“The gut reactions that we have are really powerful, and sometimes we find it hard to moderate the effect because it concerns perceptions of fairness. People react really strongly, and it's hard to turn that off,” she said.