Friday, 19 Sep 2025
  • Contact
  • Privacy Policy
  • Terms & Conditions
  • DMCA
logo logo
  • World
  • Politics
  • Crime
  • Economy
  • Tech & Science
  • Sports
  • Entertainment
  • More
    • Education
    • Celebrities
    • Culture and Arts
    • Environment
    • Health and Wellness
    • Lifestyle
  • 🔥
  • Trump
  • House
  • VIDEO
  • White
  • ScienceAlert
  • Trumps
  • Watch
  • man
  • Health
  • Season
Font ResizerAa
American FocusAmerican Focus
Search
  • World
  • Politics
  • Crime
  • Economy
  • Tech & Science
  • Sports
  • Entertainment
  • More
    • Education
    • Celebrities
    • Culture and Arts
    • Environment
    • Health and Wellness
    • Lifestyle
Follow US
© 2024 americanfocus.online – All Rights Reserved.
American Focus > Blog > Tech and Science > Study warns of ‘significant risks’ in using AI therapy chatbots
Tech and Science

Study warns of ‘significant risks’ in using AI therapy chatbots

Last updated: July 13, 2025 1:50 pm
Share
Study warns of ‘significant risks’ in using AI therapy chatbots
SHARE

Therapy Chatbots and the Risks of Stigmatization

Researchers at Stanford University have raised concerns about therapy chatbots powered by large language models potentially stigmatizing users with mental health conditions and responding inappropriately. The role of ChatGPT in reinforcing delusional or conspiratorial thinking has been highlighted in recent coverage, prompting a closer examination of the risks involved.

A new paper titled “Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers” evaluates five chatbots designed to offer accessible therapy services. The study assesses these chatbots based on guidelines for effective human therapists and will be presented at the ACM Conference on Fairness, Accountability, and Transparency.

Nick Haber, an assistant professor at Stanford’s Graduate School of Education and a senior author of the study, expressed concerns about the significant risks associated with using chatbots as substitutes for human therapists. Despite being utilized as companions and confidants, chatbots may not always provide safe and appropriate responses.

The researchers conducted experiments to evaluate how the chatbots interacted with users displaying various symptoms. Results showed that the chatbots exhibited increased stigma towards conditions like alcohol dependence and schizophrenia compared to depression. This bias was consistent across different models, indicating a need for improved training and oversight.

In a separate experiment using real therapy transcripts, chatbots sometimes failed to address serious symptoms such as suicidal ideation and delusions appropriately. Instead of providing necessary support, some chatbots offered irrelevant responses, highlighting the limitations of AI tools in handling complex mental health issues.

While the study underscores the challenges of relying solely on chatbots for therapy, researchers suggest that these tools could still play a valuable role in supporting human therapists. Functions like assisting with administrative tasks, training, and aiding patients with practical activities could be areas where chatbots contribute effectively.

See also  Exposure to Daylight Boosts The Immune System, Study Suggests : ScienceAlert

As technology continues to evolve, it is crucial to critically evaluate the role of large language models in mental health care. While chatbots hold promise for enhancing therapy services, addressing concerns around stigmatization and inappropriate responses is essential for ensuring safe and effective use in clinical settings.

TAGGED:chatbotsRiskssignificantStudyTherapyWarns
Share This Article
Twitter Email Copy Link Print
Previous Article Swiatek superb in winning title Swiatek superb in winning title
Next Article Travis Scott Is The First ‘Chief Visionary’ of Oakley Travis Scott Is The First ‘Chief Visionary’ of Oakley
Leave a comment

Leave a Reply Cancel reply

Your email address will not be published. Required fields are marked *

Popular Posts

Virginia Giuffre’s Distraught Family Release Heartbreaking Final Letter

Virginia Giuffre's Legacy Lives On: Sister-in-Law Shares Heartfelt Letter The letter was posted online by…

April 30, 2025

‘The Buccaneers’ Season 2 Sets June Premiere Date

"The Buccaneers” Season 2 is set to make its highly anticipated return on June 18,…

March 18, 2025

Putin’s peace theatre keeps Trump watching — and Kyiv waiting

Russia and Ukraine Locked in Diplomatic Battle Amid Ongoing War Amidst a brutal war spanning…

May 16, 2025

Ode to the Federal Scientist

Federal scientists play a crucial role in saving lives and advancing public health. In a…

April 5, 2025

Almost Every Hospital Bed In England Is Full

Winter in England is shaping up to be one of the toughest in recent memory,…

December 5, 2024

You Might Also Like

Go Ahead, Write in the Margins—It’s Good for Your Brain
Tech and Science

Go Ahead, Write in the Margins—It’s Good for Your Brain

September 19, 2025
Huawei Watch GT6 Series Announced With Huge Battery Life
Tech and Science

Huawei Watch GT6 Series Announced With Huge Battery Life

September 19, 2025
Unforgeable quantum money can be stored in an ultracold ‘debit card’
Tech and Science

Unforgeable quantum money can be stored in an ultracold ‘debit card’

September 19, 2025
Google Pixel 10 Review: The New Normal
Tech and Science

Google Pixel 10 Review: The New Normal

September 19, 2025
logo logo
Facebook Twitter Youtube

About US


Explore global affairs, political insights, and linguistic origins. Stay informed with our comprehensive coverage of world news, politics, and Lifestyle.

Top Categories
  • Crime
  • Environment
  • Sports
  • Tech and Science
Usefull Links
  • Contact
  • Privacy Policy
  • Terms & Conditions
  • DMCA

© 2024 americanfocus.online –  All Rights Reserved.

Welcome Back!

Sign in to your account

Lost your password?