It may be tempting to try ChatGPT “therapy,” but the risks aren't worth it, experts say.
The recent death by suicide of a young woman led her parents to a painful revelation: She’d been confiding in a ChatGPT “therapist” named Harry, and she had told it that she was planning to die.
While the chatbot didn’t seem to encourage her to take her own life, …