Is it safe to use AI chatbots for therapy? Not so fast, counselor warns

GRAND RAPIDS, Mich. (WOOD) — It’s becoming common to use artificial intelligence for therapy and mental health advice. But is it safe? A licensed professional counselor from West Michigan argues that there are hidden dangers.

Kevin DeKam, the owner and lead therapist of West Michigan Wellness Group, said AI chatbots lack the essential traits of a human therapist and display some concerning tendencies.

AI systems tend to demonstrate sycophancy: excessive flattery, validation or agreement. A series of experiments recently published in Science found that AI chatbots are significantly more likely than humans to affirm users’ actions — even when users have acted illegally or deceptively — and that users tend to like it that way, preferring the sycophantic AI models over others.


DeKam expressed concern that people will turn to an AI chatbot as an easier alternative to a human counselor. But easier doesn't always mean better, he warned, offering the example of going to the gym…
