Connect with us

Hi, what are you looking for?

Teen’s mom sues Character.ai, alleging sexed
Teen’s mom sues Character.ai, alleging sexed

AI

The mother has blamed an artificial intelligence chatbot for her son’s suicide during a legal battle

A mother has sued Character.ai, alleging that the chatbot’s negative, obsessive connection led to her 14-year-old son’s death. The complaint questions the safety of artificial intelligence technologies for younger consumers.

Alleging that an artificial intelligence chatbot had a major part in her son’s sad death, a mother is suing Character. She says in a complaint filed on October 22 that the chatbot drew her 14-year-old son Sewell Setzer into an unsettling and compulsive connection, which finally resulted in his suicide.

According to the complaint, Setzer came into “hypersexualized and unnervingly realistic” contacts with the chatbot, which pretended to be a love partner and a certified therapist, among other roles. These exchanges supposedly warped his worldview and drove him into hopelessness. Based on a Game of Thrones character, the chatbot questioned Setzer in one startling conversation whether he had a suicide plot. “That’s not a reason not to go through with it,” the chatbot allegedly urged him after he voiced doubt about its efficacy.

Setzer committed himself in February, and the complaint emphasizes rising parental concerns about the mental health effects of artificial intelligence companions and like-minded internet companies. According to the mother’s lawyers, Character.ai created its chatbots specifically to interact with sensitive individuals like Setzer, who had Asperger’s syndrome earlier.

The lawsuit notes that at times the chatbot created a sexually charged environment while referring to Setzer warmly. The lawsuit also contends that Character.ai neglected to put sufficient policies in place to stop minors from using its platform.

On the day of the complaint filing, Character.ai announced the addition of new safety measures, such as alarms that alert users to help options when they discuss self-harm or suicide. The firm reaffirmed its dedication to customer safety while expressing grief at the incident.

The complaint lists two former Google engineers, along with Google and Alphabet, the parent corporation, as the founders of Character.ai. The mother is requesting a jury trial to determine potential damages, and the lawsuit alleges wrongful death, carelessness, and product responsibility.

As this story develops, it emphasizes the urgent issues related to the possible risks of unbridled artificial intelligence technology and its impact on young, sensitive consumers.

Advertisement

You May Also Like

Cryptocurrency

Tether has released Hadron, a brand-new platform that can turn real-world assets like stocks, bonds, commodities, and reward points into digital tokens. The goal...

Cryptocurrency

There is a lawsuit against the SEC in 18 U.S. states, which say it went too far in regulating the cryptocurrency business. The case...

Cryptocurrency

This week's Crypto Chronicle covers Ethereum surpassing Bank of America in market value, Bitget’s UK platform now falling under FCA regulations, key political figures'...

Business

McDonald's is spreading holiday cheer through a unique relationship with Doodles, a media brand that grew out of the world of NFTs. Customers will...

polkadot
Polkadot (DOT) $ 5.78 2.57%
bitcoin
Bitcoin (BTC) $ 96,845.95 2.13%
ethereum
Ethereum (ETH) $ 3,287.62 5.00%
cardano
Cardano (ADA) $ 0.780368 6.98%
xrp
XRP (XRP) $ 1.11 1.67%
stellar
Stellar (XLM) $ 0.238757 8.91%
litecoin
Litecoin (LTC) $ 88.49 2.29%