The Apprentice Doctor

The Growing Risk of Relying on Chatbots for Health Advice: Here’s Why

Discussion in 'Doctors Cafe' started by menna omar, Feb 25, 2025.

    More People Are Risking Medical Advice From Chatbots. Here’s Why.

    Generative artificial intelligence (AI) tools, such as ChatGPT, have become widely popular in recent years, revolutionizing the way people gather information across various domains, including healthcare. With their ability to provide quick, personalized answers to a wide array of questions, these AI tools are increasingly being used as a resource for health-related queries. But, as convenient and helpful as they can be, there are risks to relying solely on these AI tools, especially for medical advice. Though they offer easy access to health information, they are not a substitute for professional, clinical judgment.

    Generative AI, in particular, ChatGPT, is one of the most prominent tools being used for healthcare inquiries. While the potential for AI to assist in healthcare is vast, there are concerns about the accuracy and reliability of the information provided. AI-based responses may give the appearance of being knowledgeable, but they cannot offer the depth of care and context a medical professional can. As we increasingly turn to these tools for advice, it’s important to understand who is using them, why, and what dangers may arise.

    The Growing Use of AI for Health Questions

    In a new study conducted in June 2024, over 2,000 Australians were surveyed about their use of ChatGPT for health-related inquiries. The results were eye-opening. Around one in ten participants (9.9%) reported having used ChatGPT to ask a health-related question in the first half of 2024. On average, respondents reported a moderate level of trust in the tool (3.1 out of 5). This demonstrates how many individuals are turning to AI for medical information, but it also suggests that users do not have full confidence in it.

    What’s even more concerning is that many of those turning to ChatGPT for medical advice may have limited health literacy. People with lower health literacy, those born in non-English-speaking countries, and those who spoke a language other than English at home were all more likely to use ChatGPT for health inquiries. This may indicate that the tool is helping people who struggle to engage with traditional healthcare resources, particularly in a diverse country like Australia. However, it also opens the door to misinterpretation or incomplete advice when complex medical issues are involved.

    What Are People Asking?

    The study also looked at the specific questions people were asking ChatGPT. The most common types of inquiries included:

    • Learning about a health condition: 48% of respondents asked ChatGPT about various health conditions.
    • Interpreting symptoms: 37% of users sought advice on what certain symptoms could mean.
    • Seeking medical actions or steps: 36% of individuals wanted to know what actions to take regarding their health.
    • Understanding medical terminology: 35% asked questions about complex medical terms.

    Interestingly, over 60% of respondents had asked at least one question that would typically require clinical judgment. These "riskier" questions may involve interpreting symptoms, making medical decisions, or determining whether an issue warrants professional attention. For example, asking ChatGPT what certain symptoms may mean might give you a rough idea of what could be going on, but it cannot replace the expertise of a healthcare professional. Relying on AI alone for this information means missing crucial nuances that only a doctor can provide, which can lead to unnecessary worry or improper treatment.

    The most concerning finding is that people from culturally and linguistically diverse communities were more likely to ask riskier questions, such as those involving symptoms or clinical advice. This group of individuals may already face barriers to accessing quality healthcare information, and ChatGPT can offer them immediate answers. But when the AI tool provides incomplete or inaccurate information, these individuals may be more vulnerable to making poor health decisions.

    The Risk of Relying on AI for Medical Advice

    One of the key issues with using AI tools like ChatGPT for health-related inquiries is that these tools lack the capacity to provide clinical judgment or consider the broader context of a patient's health. Healthcare professionals take into account not only the symptoms but also a patient’s medical history, physical examination, and other important factors that AI simply cannot assess.

    For example, ChatGPT might provide information that is based on general knowledge about health conditions, but it cannot consider the unique characteristics of an individual’s case. Whether you are asking about a new symptom or wondering if you need to go to the hospital, the decision should be based on clinical experience and in-person evaluations. Generative AI tools are prone to errors, particularly when it comes to complex health issues, which could lead to dangerous consequences.

    Furthermore, AI tools often provide information that is generalized and may not be tailored to the needs of the individual. For example, if someone asks about a common cold, ChatGPT may give general advice about symptoms and home remedies, but it cannot evaluate whether the individual has underlying conditions that could make their illness more serious.

    The Growing Popularity of Generative AI for Health Information

    The growing use of generative AI tools for health-related inquiries is not limited to Australia. As the technology continues to evolve, more people worldwide are turning to AI for immediate answers to health questions. According to the study, 39% of respondents who had not yet used ChatGPT for health information expressed an interest in doing so in the next six months.

    AI tools like ChatGPT, Google Gemini, Microsoft Copilot, and Meta AI are becoming increasingly popular for answering health-related questions. These tools have the potential to democratize access to healthcare information, making it more available to people who may not have easy access to healthcare professionals. However, as more people use these tools, we must be cautious about relying solely on them for important health decisions.

    One important issue is that people from non-English speaking backgrounds may use AI tools to translate health information into their native language. While these translations can be helpful, AI tools are generally less accurate when working with languages other than English. The accuracy of AI-generated translations and health information may vary, which introduces additional risks.

    Building AI Health Literacy

    With the increasing use of generative AI for health, it is crucial that we educate people on how to use these tools safely. This includes developing “AI health literacy,” which involves understanding when it is appropriate to rely on AI for health information and when to seek professional medical advice. AI can offer valuable insights into health topics in simple language, but it is essential to recognize its limitations.

    Healthcare systems must invest in services, whether human or AI-assisted, to ensure that individuals who speak different languages or have low health literacy still have access to high-quality, reliable health information. AI can be a useful supplement to traditional healthcare resources, but it must be used with caution and in conjunction with professional medical guidance.

    What Should You Do When You Need Health Advice?

    When it comes to health decisions that require clinical judgment, it’s essential to consult a healthcare professional. In Australia, organizations like HealthDirect (https://www.healthdirect.gov.au) offer a national, free helpline where you can speak with a registered nurse about whether you need to go to the hospital or see a doctor. HealthDirect also offers an online Symptom Checker tool that can help you make informed decisions about next steps in managing your health.

    The key takeaway is that while generative AI can provide immediate answers to health questions, it cannot replace professional medical evaluation. Individuals should be educated about AI health literacy and learn how to use these tools responsibly. As AI technology continues to evolve, it’s essential that healthcare organizations and individuals work together to ensure safe, effective use of AI in health.
     
