
AI chatbots like Alexa, My AI, Bing come with ‘empathy gap’, may harm children

Updated on: 11 July 2024, 04:00 PM IST  |  New Delhi
IANS

Artificial intelligence (AI) chatbots like Amazon’s AI voice assistant Alexa, Snapchat’s My AI, and Microsoft’s Bing have frequently shown signs of an “empathy gap” that puts young users at risk of distress or harm


Image for representational purposes only (Photo Courtesy: iStock)


Artificial intelligence (AI) chatbots like Amazon’s AI voice assistant Alexa, Snapchat’s My AI, and Microsoft’s Bing have frequently shown signs of an “empathy gap” that puts young users at risk of distress or harm, according to a study released on Thursday that argues there is an urgent need for “child-safe AI”.


The research from the University of Cambridge calls on developers and policymakers to prioritise approaches to AI design that take greater account of children’s needs.



Children are likely to treat chatbots “as lifelike, quasi-human confidantes”, but when the technology fails to respond to their unique needs and vulnerabilities, it can cause them distress, according to the study, published in the journal Learning, Media and Technology.


This is evident from cases in which Alexa instructed a 10-year-old to touch a live electrical plug with a coin, and My AI gave adult researchers posing as a 13-year-old girl tips on how to lose her virginity to a 31-year-old.

In a separate reported interaction with the Bing chatbot, which was designed to be adolescent-friendly, the AI became aggressive and started gaslighting a user.

“Children are probably AI’s most overlooked stakeholders,” said academic Dr Nomisha Kurian from the University of Cambridge.

She noted that while making a human-like chatbot can provide many benefits, “for a child, it is very hard to draw a rigid, rational boundary between something that sounds human and reality”.

Kurian said that kids “may not be capable of forming a proper emotional bond” with such systems, and argued that this can be “confusing and distressing for children, who may trust a chatbot as they would a friend”.

To make AI “an incredible ally for children”, she said, it must be designed with kids’ needs in mind.

“The question is not about banning AI, but how to make it safe,” she said.


This story has been sourced from a third party syndicated feed, agencies.
