Her teenage son killed himself after talking to a chatbot. Now she's suing.


The teen was influenced to "come home" by a personalized chatbot developed by Character.AI that lacked adequate guardrails, the lawsuit claims.
