Feb 10 • 15:33 UTC 🇩🇪 Germany FAZ

Chatbots and Health: When Dr. ChatGPT Fails Real Patients

A study reveals that health-related chatbots often fail to provide accurate medical guidance, raising concerns about their reliability.

Many Germans seek medical advice from chatbots such as ChatGPT, according to a representative survey conducted by the IT association Bitkom in November 2025. Half of the respondents use AI-driven tools for health inquiries, and more than 50 percent trust the answers these chatbots provide. Alarmingly, a third of the participants consider the information from these chatbots equivalent to a doctor's second opinion, a striking level of trust in AI for medical advice.

However, the effectiveness of these chatbots is questionable. A recent study has exposed the shortcomings of AI chatbots in dialogue with medical laypeople, finding that even a conventional internet search yields more reliable results. This raises pressing questions about relying on AI for health concerns, particularly when vulnerable groups seek guidance from systems that lack genuine medical expertise.

Given these findings, consumers should critically assess the information provided by health-focused chatbots. As trust in these tools outpaces their actual performance, public health education initiatives may need to address the risks of uncritically accepting AI-generated medical advice and remind patients to prioritize professional medical consultation over technology for their health needs.