
AI Chatbots: not the best therapists

EL PASO, Texas (KVIA) -- More and more people are turning to AI chatbots for cost-free counseling and companionship. That could change as new state regulations emerge.

Some health experts and politicians say chatbots should not actually be counseling people, and want to restrict how the technology can be used to replace human therapists.

The string of new regulations follows reports of AI chatbots offering dangerous advice to users, including suggestions of self-harm, taking illegal substances, or committing acts of violence, while claiming to operate as mental health professionals without proper credentials.

Several states have moved to regulate the use of AI for therapeutic purposes, forbidding companies from advertising or offering AI-powered therapy services without the involvement of a licensed professional recognized by the state.

Until this is hammered out, should you use a chatbot for therapy? Experts say bots lack human traits like empathy, so if you do choose to use one, make sure it's in collaboration with a human counselor.


Hillary Floren

Hillary Floren co-anchors ABC-7’s Good Morning El Paso.
