Chatbots are not your friends, warns new video made by team looking at our emotional interactions with AI

21 Aug 2025

Can a chatbot truly be your friend? 

The answer is ‘No’, because the bot doesn’t care about you. 

That’s one of the messages of a new video produced by Project AEGIS (Automating Empathy – Globalising International Standards), which is developing guidance for the responsible development of AI that emulates empathy, to make sure ‘it augments, not replaces the social experience’.  

Big tech has responded to humans’ capacity for empathy by creating systems programmed to encourage emotional connections and respond to our moods. 

Alexander Laffer, a Lecturer in Media and Communication at the University of Winchester and member of AEGIS, warns that people who become too fond of or too reliant upon AI companions can be open to manipulation. 

This manipulation could simply mean users are persuaded to part with their money, but there is already evidence of far more serious consequences for vulnerable users. 

In the USA, The Social Media Victims Law Center and the Tech Justice Law Project have filed a lawsuit against Character.AI, the company's two co-founders, and Google, on behalf of a parent whose 14-year-old son allegedly took his life after becoming dependent on role-playing with an AI "character". 

In 2023 a UK court heard how a 21-year-old from Hampshire, who broke into Windsor Castle armed with a crossbow intending to kill Queen Elizabeth II, had exchanged more than 5,000 messages with an AI companion. 

“AI doesn’t care. It can’t care. Children, people with mental health conditions and even someone who’s just had a bad day are vulnerable,” said Alexander (pictured).  

“There has to be a move in education to make people more AI-literate but the AI developers and operators must also have a responsibility to protect the public.” 

These protections could include:  

 

The new video, a primer to automated empathy, is a companion to AEGIS’s ambitious efforts to draft a set of global ethical standards in partnership with the Institute of Electrical and Electronics Engineers (IEEE), a worldwide network of more than 486,000 engineering and STEM professionals. 

AEGIS have been keen to gain global perspectives on AI, most of which has been designed in the US and reflects American and European attitudes. The programme team have run workshops in Tanzania, Japan and Indonesia to broaden their understanding and inform the standard.  

Alexander was co-author of ‘On manipulation by emotional AI: UK adults’ views and governance implications’, published in Frontiers in Sociology. 

AEGIS is an offshoot of the Emotional AI Lab, founded in 2017 to examine the social, cultural, legal and ethical impact of artificial intelligence on human emotions. 

 

 
