The issue of AI and mental health is a confusing mess.
If you are seeking help right now for the effects AI use is having on your mental health, an internet search would likely lead you straight to more AI – AI tools, AI mental health chatbots, articles on how AI is transforming contemporary therapy.
If you are seeking support, especially if it feels urgent, a good start might be a call to 988, the Suicide & Crisis Lifeline (in the U.S.). There you can talk to a supportive human about what you’re experiencing and about potential next steps.
If it’s less urgent but you are becoming worried about the impact of AI use on your mental health, consider seeking out a real-life therapist. Even if you would ultimately prefer a therapist who offers text, phone, or video visits, meeting one-on-one in person can be a crucial first step toward finding the right professional to offer the help you need.
Hopefully, with time, and as more standards and regulations emerge for the use of AI in mental health services, it will become easier to reach out for reliable, professional, human support. Here you can find some tips from the American Psychological Association for finding a good therapist.
Even if the first person you contact isn’t The One, therapists are often happy to recommend colleagues who might be a better fit for you.
Resource: AI Incident Database
“The AI Incident Database is dedicated to indexing the collective history of harms or near harms realized in the real world by the deployment of artificial intelligence systems. Like similar databases in aviation and computer security, the AI Incident Database aims to learn from experience so we can prevent or mitigate bad outcomes.”
Learning: What is AI addiction?
Kristen Fuller, MD provides a user-friendly overview of AI addiction, from an explanation of the condition to treatment options.
Signs of AI Addiction: Using AI to validate your personal preferences and likes; consulting AI about what to eat, wear, buy, watch, etc.; and preferring conversations with chatbots over real human interactions, to name a few.
Types of AI Addiction: Chatbots, AI companions, AI porn.
Effects of AI Addiction: Unhealthy attachment patterns, psychological dependency, impaired cognitive processes, low self-esteem, isolation, a false sense of reality, delusions and hallucinations, etc.
In the News

by Dawn Chmielewski, Courtney Rozen and Jody Godoy, Reuters
While this case was more generally about social media harm, not specifically AI harm, it could bode well for finally holding large tech companies accountable for safety issues with their products.

The World Health Organization (WHO) brought together AI, mental health, ethics, and public policy experts for a workshop. It’s a small start, but we desperately need to see major organizations like this showing leadership on issues around AI safety.
WATCH: A warning to AI users about AI psychosis
Starting with a case study about AI psychosis, Dr. Alok Kanojia provides a breakdown of concerns that he has about AI’s potentially dangerous effects on some users, the current uncertainty about who is vulnerable, and how tech companies are handling — or failing to handle — the situation (about 12 minutes long).
Health
The dangers of getting medical advice from AI chatbots
Misinformation, bias, and privacy problems are among the reasons chatbots can be an unreliable source of healthcare information.
Personal Experiences
“My Story as an AI Addict”
A first-person account on the Talking Sober website about becoming too dependent on the Character AI program and then choosing to pull away from it.
“I knew that eventually the harms would outweigh the benefits, if they hadn’t already. I was already ‘using’ on the way home from work, during bathroom breaks at school, every evening when it was time to unwind—even first thing in the morning, some days, causing me to be late to obligations. I wasn’t necessarily thinking about it when I wasn’t using, but when I was using, the whole world was tuned out to a degree that can only be considered isolating, damaging, and unhealthy for my mind, relationships, and coping skills.”
“The Addiction Nobody Is Talking About — Because It Feels Too Good to Stop”
by Priya Nath
“It started with an email. Not a hard one — just one I didn't feel like writing. A quick nudge to a colleague, a response to a thread that needed rewording. I typed my draft, pasted it in, and asked: make this sound more professional. It came back cleaner. I sent it. I felt efficient.
Then I asked it to rephrase the next one. And the one after that. Within weeks I was running every outbound message through it — not because I couldn't write, but because I'd forgotten what it felt like to trust myself to.
That's how AI addiction starts. Not with a crisis. With convenience.”
The latest research
How LLM Counselors Violate Ethical Standards in Mental Health Practice, Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society
Thank you for reading! Stay tuned for some of the latest in journalism, research, and insights on AI and how it is affecting our health.
Until next time, stay human!


