
Be cautious about asking AI for advice on when to see a doctor
Chong Kee Siong/Getty Images
Artificial intelligence (AI) has been increasingly used in healthcare settings to provide advice on when to seek medical care. However, a recent study has revealed potential biases in AI models that may impact the recommendations given to patients based on the way they communicate their symptoms.
Researchers at the Massachusetts Institute of Technology, led by Abinitha Gourabathina, conducted a study where AI models were tested on patient notes written in different styles and formats. The study found that AI was more likely to advise against seeking medical care if the patient’s message contained typos, emotional language, or uncertainty. Surprisingly, the AI models were also more inclined to recommend home care for female patients.
According to Karandeep Singh from the University of California, San Diego, these biases in AI advice could have significant implications for healthcare resource allocation. The study highlighted the importance of evaluating and monitoring AI models used in healthcare to ensure fair and unbiased recommendations.
The study used four large language models, including OpenAI’s GPT-4 and Meta’s Llama-3-70b, to analyze patient notes and provide recommendations on whether to visit a clinic or manage symptoms at home. The results showed that the AI models were between 7 and 9 per cent more likely to suggest home care over medical attention, with differences in recommendations based on gender and language style.
It is crucial for healthcare providers to be cautious when relying on AI for clinical decisions, as highlighted by Zayed Yasin from Writer, a company that develops AI models for healthcare. While AI can assist in patient care, human judgment and oversight are necessary to ensure the accuracy and fairness of recommendations.
As the healthcare industry continues to integrate AI technologies, the study underscores the need for transparency and accountability in AI algorithms to prevent biases that could impact patient care.
Topics: