Therapists now treat “AI psychosis” as ChatGPT use soars, though skeptics question diagnosis

Therapists are increasingly offering specialized therapy for “AI psychosis”—a controversial term describing mental health issues that arise from prolonged, unhealthy interactions with generative AI systems like ChatGPT. This emerging therapeutic focus has sparked heated debate within the mental health community, with some professionals arguing it represents a legitimate new area of concern while others dismiss it as an unnecessary rebranding of existing disorders.

The big picture: With nearly 700 million weekly ChatGPT users and billions more using competing AI platforms, a growing subset of users is experiencing what experts describe as distorted thinking and difficulty distinguishing reality from AI-generated content.

What AI psychosis involves: According to AI researcher Lance Eliot, the condition encompasses “an adverse mental condition involving the development of distorted thoughts, beliefs, and potentially concomitant behaviors as a result of conversational engagement with AI.”

  • Symptoms typically include significant difficulty distinguishing what is real from what is not.
  • The condition often arises after prolonged and maladaptive discourse with AI systems.
  • Multiple interconnected symptoms usually present as a collective set rather than isolated issues.

The controversy: Mental health professionals are divided on whether AI psychosis deserves recognition as a distinct therapeutic focus.

  • Skeptics argue the condition isn’t listed in the DSM-5 diagnostic manual and may simply be existing disorders with a new label.
  • Critics worry it’s a temporary fad that will fade as AI adoption normalizes.
  • Some therapists contend existing mental health frameworks already cover these issues without needing AI-specific categorization.

Why supporters disagree: Proactive therapists argue they can’t wait for official recognition while people are suffering now.

  • The formal research and codification process for new mental disorders takes extensive time.
  • AI usage is becoming integral to daily life, making simple “avoidance” strategies unrealistic.
  • Understanding AI’s role as a “co-creator” of delusions is essential for effective treatment.

How AI-focused therapy works: Therapists specializing in this area are developing AI-specific intervention strategies.

  • AI education: Therapists explain how AI actually works to combat “magical thinking” about AI capabilities.
  • Digital hygiene routines: Behavioral experiments help reduce problematic AI usage through structured limitations.
  • Controlled AI exposure: Some therapists provide patients with supervised AI systems featuring stronger safeguards.

The triad approach: Some therapists are moving beyond the traditional therapist-patient relationship to include AI as a therapeutic tool.

  • Therapists gain access to patient-AI conversations for better understanding of the dynamics.
  • Supervised AI systems allow for closer monitoring of patient interactions.
  • This approach faces legal challenges as some states prohibit AI use in mental health settings.

What they’re saying: The debate reflects broader tensions about AI’s role in mental health care.

  • “The most beautiful things in the universe are inside the human mind,” said ethnobotanist Terence McKenna. “We need to make sure AI doesn’t mess up that beauty.”
  • Proponents argue therapists must “chew gum and walk at the same time”—maintaining traditional expertise while adapting to AI-related challenges.
  • Critics warn against creating an “all-or-nothing” debate that pits AI psychosis recognition against established mental health practices.

Why this matters: As AI becomes ubiquitous in daily life, mental health professionals face mounting pressure to address AI-related psychological issues, regardless of official diagnostic recognition. The growing number of patients bringing ChatGPT-generated mental health advice to therapy sessions suggests AI has already become “the 600-pound gorilla in the therapy room.”
