ChatGPT may give wrong answers to drug questions, study warns

A recent study by the American Society of Health-System Pharmacists (ASHP) has found that ChatGPT, a popular artificial intelligence (AI) chatbot, may provide inaccurate or incomplete answers to medication questions. The study, published in the ASHP’s journal, AJHP, evaluated the performance of ChatGPT on 100 questions related to drug therapy that were posted by real users on online forums.

The researchers used the free version of ChatGPT, which is available online and can be accessed by anyone. They entered the questions into the chatbot, recorded the responses, and compared them with information from authoritative sources, such as drug labels, clinical guidelines, and textbooks. They also assessed the responses for clarity, completeness, and potential harm.

The results showed that ChatGPT gave satisfactory answers to only 26% of the questions; the remaining 74% of responses were incorrect, incomplete, or unclear. Some answers were potentially harmful, suggesting inappropriate doses, drug interactions, or contraindications. For example, ChatGPT advised one user to take ibuprofen with aspirin, a combination that can increase the risk of bleeding, and told another that it was safe to take melatonin with alcohol, which can cause drowsiness and impaired judgment.

The researchers concluded that ChatGPT is not a reliable source of drug information and that users should be cautious about relying on it for medical advice. They also warned that ChatGPT may not disclose its limitations or the sources of its information, which can mislead users into trusting its responses.

ChatGPT is a powerful AI tool but not a substitute for human experts

ChatGPT is a natural language processing (NLP) system that can generate coherent and fluent text based on a given input. It was developed by OpenAI, a research organization that aims to create and promote beneficial AI. ChatGPT is trained on a large corpus of text from the internet, which enables it to learn from various domains and topics. However, this also means that ChatGPT may not have the relevant or accurate knowledge for specific fields, such as medicine.

ChatGPT has been shown to perform well on some tasks, such as writing essays, poems, stories, and code. It can also answer general questions and engage in casual conversations. However, when it comes to specialized and complex questions, such as those related to drug therapy, ChatGPT may not have the expertise or the evidence to provide valid answers. Moreover, ChatGPT may not be able to explain its reasoning or cite its sources, which are essential for verifying and evaluating its responses.

Therefore, ChatGPT is not a substitute for human experts, such as pharmacists, physicians, or nurses, who have the training and the experience to provide accurate and comprehensive answers to medication questions. Users should always consult with qualified health professionals before making any decisions about their drug therapy.

ChatGPT may have a role in enhancing health communication and education

Despite its limitations, ChatGPT may have a role in enhancing health communication and education, if used appropriately and responsibly. ChatGPT can be a useful tool for generating and testing hypotheses, exploring different scenarios, and stimulating creative thinking. ChatGPT can also be a helpful assistant for health professionals, who can use it to draft responses, generate summaries, or provide feedback. However, ChatGPT should always be supervised and verified by human experts, who can ensure the quality and the safety of its outputs.

ChatGPT may also have the potential to improve health literacy and engagement among users, who can use it to learn more about their health conditions and medications. ChatGPT can provide interactive and personalized learning experiences, which can motivate and empower users to take charge of their health. However, ChatGPT should always be transparent and honest about its capabilities and limitations, and users should always be critical and cautious about its responses.

ChatGPT is a powerful AI tool that can generate impressive and convincing text, but it is not a reliable source of drug information. Users should be aware of the risks and the benefits of using ChatGPT for medication questions, and always seek professional guidance before making any changes to their drug therapy.
