AI Could Manipulate Users For Profit In ‘Intention Economy’



Cambridge study warns AI systems’ deep knowledge of users’ personalities and online habits could make them ideal tools for social manipulation

AI systems such as chatbots and digital tutors could use covert signals to influence users’ decision-making in a potentially “lucrative yet troubling” emerging market, University of Cambridge researchers said in a new study.

This potential marketplace, which researchers termed the “intention economy”, would involve the buying and selling of “digital signals of intent” to “covertly influence” everything from buying movie tickets to voting.

Such “persuasive technologies” are being enabled by the growing use of “anthropomorphic” AI systems, the paper said.

Such systems combine detailed knowledge of users’ online habits with a growing ability to know users, anticipate their desires and build “new levels of trust and understanding”, said co-authors Yaqub Chaudhary and Jonnie Penn of Cambridge University’s Leverhulme Centre for the Future of Intelligence (LCFI).


‘Social manipulation’

If unregulated, such systems could allow for “social manipulation on an industrial scale”, said the two researchers in a paper published in the Harvard Data Science Review.

“AI tools are already being developed to elicit, infer, collect, record, understand, forecast, and ultimately manipulate and commodify human plans and purposes,” said Chaudhary.

Large language models (LLMs) can be tailored to target a user based on their cadence, politics, vocabulary, age, gender, online history, and preferences for flattery and ingratiation, the research postulated.

This could be linked with other emerging AI technology that sells the ability to influence people toward particular decisions, such as buying a cinema ticket, or engaging with particular platforms, advertisers, businesses or political organisations, the study found.

“Unless regulated, the intention economy will treat your motivations as the new currency,” said co-author Penn.

“It will be a gold rush for those who target, steer, and sell human intentions.”

‘Unintended consequences’

Penn said society should begin considering the likely impact such a marketplace would have on human aspirations, free will, fair elections, a free press and fair market competition “before we become victims of its unintended consequences”.

Public awareness of such issues is “the key to ensuring we don’t go down the wrong path”, Penn said.

Authorities, researchers and others have expressed concern that, as AI becomes more advanced, it could be used to manipulate people.

Such issues have spurred the development of regulations such as the EU’s AI Act.

Similar concerns have been raised around social media platforms, which have been connected to several high-profile episodes of apparent manipulation of elections.
