As the use of big data, machine learning, and artificial intelligence becomes more widespread, so do concerns about how these information-harvesting tools will affect consumers’ sense of autonomy.
An article co-authored by Johns Hopkins Carey Business School Assistant Professor Haiyang Yang notes these concerns while urging companies to follow three safeguards that could ensure customers feel that they – and not superintelligent software programs – are controlling their choices.
The list of safeguards is laid out in the paper, published in MIT Sloan Management Review. It starts with the admonition that consumers in the digital realm must feel they are being treated as unique individuals rather than as faceless data points nudged toward selections on the basis of their past behavior. Above and beyond "programmed" personalization (e.g., addressing consumers by their first names), companies, even those using AI bots, should enable consumers to customize aspects of their interactions with products and services in ways that let them express their individual identities.
Safeguard number two: Don't provoke consumers' defiance by infringing on their "freedom not to be predictable." The paper suggests that Amazon, for example, could engage customers who have purchased one of the Lord of the Rings books by subtly inviting them to "continue exploring Tolkien" or "learn all there is to know about the series." Yang and his co-authors write, "Such an approach would implicitly reward consumers for continuing to follow a chosen path, rather than pushing them to deviate from it in order to assert their autonomy."
Finally, the article urges companies to safeguard their customers’ sense of privacy, observing that “privacy and autonomy are discrete but related concepts that overlap each other.” As the piece notes, governments and cybercriminals aren’t the only ones amassing precise data about individuals; many companies also are gathering consumer information. The article recommends that businesses embrace and build on efforts, such as the European Union’s data-protection law, that give consumers greater control over the use of their personal data.
By following these safeguards, Yang and his co-authors suggest, companies can offer a digital environment more likely to maintain the goodwill – and repeat business – of their customers.
“Designing AI Systems That Customers Won’t Hate” was written by Haiyang Yang of Carey, Professor Ziv Carmon and Professor Klaus Wertenbroch, both of INSEAD, and Associate Professor Rom Schrift of Indiana University.