Put your hand up if you have looked at a credit card statement recently and spotted a charge for a subscription you had forgotten you signed up for.
You’re not alone. The number of new subscriptions per US consumer peaked last year and cancellations are now outpacing new sign-ups. But for many services, getting out can be a lot more complicated than getting in, as I discovered when I tried to end my monthly payment to Amazon’s Audible recorded books membership.
If I cancelled, the app warned, I would lose the three book credits that I had already paid for but not used. Instead, it touted a “pause” button that would put off the next payment for three months. Not wanting to set that money on fire, I dutifully obliged and set a calendar reminder to cancel in October.
The UK Financial Conduct Authority has been worried about abusive online sales techniques for nearly a decade. It first warned against forcing customers to untick boxes or otherwise “opt out” to avoid paying for add-on insurance in 2015 and has expanded its efforts from there.
Now regulators have stepped up their war on what they call “dark patterns” — menu after menu of confusing options aimed at maximising spending and deterring cancellations. That’s needed, as companies use data mining, algorithms and sophisticated artificial intelligence to suck customers in and keep them sweet.
The US Federal Trade Commission sued Amazon last week, contending that it had “tricked and trapped people into recurring subscriptions” to its Prime service. It followed up this week by making Publishers Clearing House repay $18.5mn to customers who had been lured into making purchases and paying fees while entering its sweepstakes.
“We are really trying to send a message that these types of practices have caught our attention and will not be tolerated,” a senior FTC official told me.
In the EU, pressure from Brussels led Amazon to begin allowing customers to end their Prime subscription with just two clicks using a clearly labelled “cancel” button. It also changed its UK policies around that time, but only altered US cancellations this year, ahead of the FTC lawsuit. The company, which plans to fight the case, insists that its cancellation procedures are “clear and simple . . . by design”.
Even as regulators crack down on dark pattern website processes, companies are turning to artificial intelligence to make their sales methods more sophisticated, and potentially exploitative.
Businesses will soon be able — if they cannot already — to predict not just what to offer but also what time of day a purchase is most likely. Real-time emotion-sensing technology could be used to press offers at vulnerable moments.
Just as predictive autofill inserts errors into texts, a sophisticated generative AI could search for and pre-fill purchase information in ways that benefit sellers rather than offering buyers a full range of choices.
Imagine telling the AI to look for information on robot mops. If it came back with the sign-up page for a one-year subscription to a consumer site with the details filled in, some people would just click yes. They might then miss out on a free trial.
“The more versatile AI becomes, the more we need regulation . . . to make sure that it doesn’t manipulate us and doesn’t exploit us,” says Matthias Holweg of Oxford’s Saïd Business School.
Regulators need to set principles and procedures now that will make it easier to crack down on abusive sales practices as they emerge. At least three different approaches have merit.
The FCA’s new consumer duty, which comes into effect next month, specifically warns companies against trying “to exploit the behavioural biases of consumers . . . to create a demand”.
The European parliament is working on new legislation to put limits on the use of AI technologies such as biometric categorisation, emotion recognition and generative systems. Much of the focus so far has been on employment and law enforcement. But the principles extend to sales.
The FTC is demanding that companies that settle deception cases keep records of psychological and behavioural research they conduct, including A/B testing. That should be expanded. As more firms use AI to stoke sales, they should expect to face scrutiny.
It is one thing to tempt customers with personalised offers. It is something else entirely to exploit their weaknesses to get them to pay for services they do not need.
brooke.masters@ft.com