Jenna Glover, chief clinical officer at the mental health care platform Headspace, says AI companies whose chatbots are used for therapy should involve trained clinicians in the process.

The role of artificial intelligence in mental health care is an unsettled issue. States including Illinois, Utah, and Nevada limit or ban the use of AI for therapy. But that’s not stopping a lot of people from seeking out such help from generative AI tools.
OpenAI says about 2% of its chats involve “companionship” or “social-emotional issues.” At the same time, researchers say such conversations can sometimes veer off course and can even be dangerous. So, is it possible to do this safely?
For insight, Marketplace’s Nova Safo spoke with Jenna Glover, chief clinical officer at the mental health care platform Headspace, which launched an AI assistant called Ebb last year.
“Caring for Your Mental Health” from the National Institute of Mental Health
“Building more helpful ChatGPT experiences for everyone” from OpenAI