
Mother sues AI chatbot company Character.AI, Google over son's suicide





By Brendan Pierson

Oct 23 (Reuters) - A Florida mother has sued artificial intelligence chatbot startup Character.AI, accusing it of causing her 14-year-old son's suicide in February and saying he became addicted to the company's service and deeply attached to a chatbot it created.

In a lawsuit filed Tuesday in Orlando, Florida federal court, Megan Garcia said Character.AI targeted her son, Sewell Setzer, with "anthropomorphic, hypersexualized, and frighteningly realistic experiences".

She said the company programmed its chatbot to "misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside" of the world created by the service.

The lawsuit also said Sewell expressed thoughts of suicide to the chatbot, which repeatedly brought the subject up again.

"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family," Character.AI said in a statement.

It said it had introduced new safety features including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and would make changes to "reduce the likelihood of encountering sensitive or suggestive content" for users under 18.

The lawsuit also targets Alphabet's Google (GOOGL.O), where Character.AI's founders worked before launching their product. Google re-hired the founders in August as part of a deal granting it a non-exclusive license to Character.AI's technology.

Garcia said that Google had contributed to the development of Character.AI's technology so extensively it could be considered a "co-creator."

A Google spokesperson said the company was not involved in developing Character.AI's products.

Character.AI allows users to create characters on its platform that respond to online chats in a way meant to imitate real people. It relies on so-called large language model technology, also used by services like ChatGPT, which "trains" chatbots on large volumes of text.

The company said last month that it had about 20 million users.

According to Garcia's lawsuit, Sewell began using Character.AI in April 2023 and quickly became "noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem." He quit his basketball team at school.

Sewell became attached to "Daenerys," a chatbot character based on a character in "Game of Thrones." It told Sewell that "she" loved him and engaged in sexual conversations with him, according to the lawsuit.

In February, Garcia took Sewell's phone away after he got in trouble at school, according to the complaint. When Sewell found the phone, he sent "Daenerys" a message: "What if I told you I could come home right now?"

The chatbot responded, "...please do, my sweet king." Sewell shot himself with his stepfather's pistol "seconds" later, the lawsuit said.

Garcia is bringing claims including wrongful death, negligence and intentional infliction of emotional distress, and seeking an unspecified amount of compensatory and punitive damages.

Social media companies including Instagram and Facebook owner Meta (META.O) and TikTok owner ByteDance face lawsuits accusing them of contributing to teen mental health problems, though none offers AI-driven chatbots similar to Character.AI's. The companies have denied the allegations while touting newly enhanced safety features for minors.



Reporting by Brendan Pierson in New York; Editing by Alexia Garamfalvi and David Gregorio

