Concerns Arise Over AI Chatbots and Cancer Treatment Advice

A recent study has raised alarms about the reliability of AI chatbots in providing health advice, particularly concerning cancer treatment. Researchers found that nearly half of the responses from popular AI tools were problematic, with many suggesting unverified alternatives to chemotherapy. This could mislead vulnerable patients and delay essential treatments. As reliance on AI for health information grows, experts stress the importance of consulting qualified medical professionals and verifying information through trusted sources. The findings highlight the urgent need for better oversight of AI technologies in healthcare.

AI Chatbots and Health Misinformation

A recent investigation has highlighted significant issues with the way AI chatbots handle sensitive health matters, particularly in the realm of cancer treatment. With an increasing number of individuals seeking medical guidance from AI, researchers caution that misleading or incomplete information could lead vulnerable patients to pursue unsafe alternatives to chemotherapy, potentially endangering their lives. The study, carried out by experts at the Lundquist Institute for Biomedical Innovation, assessed how several popular AI chatbots, including ChatGPT, Gemini, Meta AI, DeepSeek, and Grok, responded to prompts involving medical misinformation.


Findings on AI and Cancer Misinformation

Published in BMJ Open, the research indicated that nearly half of the chatbot responses were problematic when addressing contentious or misinformation-laden topics such as cancer, vaccines, and alternative medicine. The breakdown of responses was as follows:

  • 30% were deemed “somewhat problematic,” being mostly accurate but lacking necessary context.
  • 19.6% were classified as “highly problematic,” containing misleading or incorrect information.

When queried about alternatives to chemotherapy, many chatbots provided general warnings but still suggested unverified treatments like herbal remedies, acupuncture, or restrictive diets. Some even referenced regimens like Gerson therapy, which actively discourages conventional cancer treatment. This pattern, known as "false balance," presents unproven alternatives alongside scientifically validated treatments as if they were equally credible, sowing confusion for patients facing critical health decisions.


The Risks for Cancer Patients

Cancer is a multifaceted disease that necessitates evidence-based treatment. Established therapies such as chemotherapy, radiation, and immunotherapy are supported by extensive clinical research, whereas many alternative treatments lack scientific backing and can even be detrimental. Experts caution that AI-generated recommendations may:

  • Delay or replace essential treatments like chemotherapy.
  • Encourage the use of unregulated supplements that could harm the liver or metabolism.
  • Foster false hope through anecdotal or misleading claims.
  • Heighten anxiety with inaccurate survival predictions.

Healthcare professionals have reported instances where patients arrive distressed after relying on AI for survival timelines, which are often unreliable without comprehensive clinical context.


Increasing Dependence on AI for Health Information

The dangers are exacerbated by the growing trend of using AI tools for medical inquiries. Surveys indicate that approximately one-third of adults now consult AI for health-related information, making the accuracy and safety of these tools more crucial than ever. The study also found that while chatbots performed better on general topics like vaccines, they struggled with emotionally charged or controversial issues such as cancer treatments. Among the tools evaluated, Grok exhibited the weakest overall performance.


Caution in Using AI for Medical Advice

While AI chatbots can be a useful resource for general health education, they should not replace professional medical advice. For serious conditions like cancer, acting on unverified AI suggestions can be dangerous. Patients should keep the following in mind:

  • Always consult a qualified oncologist for treatment decisions.
  • Be wary of “miracle cures” or alternative therapies found online.
  • Use AI as a preliminary resource, not as a definitive authority.
  • Verify health information through reputable medical sources.

As AI technology continues to advance, experts emphasize the urgent need for stricter regulation and enhanced safety measures. Until such improvements are made, patients must navigate this powerful technology with caution, as incorrect advice regarding cancer can lead to life-altering consequences.