RALEIGH, N.C. (WTVD) -- More and more people are taking their problems not to real, flesh-and-blood human beings, but to AI chatbots, and that includes children.
Now there is a warning from experts.
The Harvard Business Review recently reported that the top reason people are using generative AI in 2025 is for therapy and companionship.
Dr. Andrea Diaz-Stransky with Duke Health says AI can cause harms ranging from disrupted sleep to harming yourself or others.
"It sounds so much like a human that it quickly transitions to developing of a pseudo relationship and thinking it can help us. We do have reports of individuals, children, and adults that have hallucinated and have delusions in part because of AI messaging," she said.
Despite all this, experts are optimistic, saying AI therapy may be the future, just not quite yet.
Alec Coughlin, an AI expert and podcaster in Raleigh, helped explain AI hallucinations.
"Which is a fancy way of saying when AI misrepresents something, when it's so confident about what it's representing but is categorically wrong," he said.
The bottom line is that AI therapy carries risks.
"We cannot expect it to be safe is important before we put it in front of our children. We are optimistic. There are promising studies that have good outcomes, but the risks are still unknown." said Dr. Diaz-Stransky.
The doctor says that with the right guidance, training, and clinical oversight, AI therapy could be helpful in the future, especially for those without access to a counselor.