How You Should—and Shouldn't—Use ChatGPT for Medical Advice


Photo Illustration by Amelia Manley for Verywell Health; Getty Images

Fact checked by Nick Blackmer




  • Experts say that people should be cautious about using artificial intelligence (AI) tools like ChatGPT for medical advice.

  • While ChatGPT can be useful to learn about certain conditions or symptoms, it can also provide false information or lead to incorrect diagnoses.

  • If you have any questions about your health, experts recommend reaching out to your healthcare provider instead of relying on services like ChatGPT.





While the internet grants people more access to medical information than ever before, that information is not always easy to understand. One way people are starting to tackle this challenge is with artificial intelligence (AI)-based tools like OpenAI’s ChatGPT, Google Bard, and Microsoft Bing.

ChatGPT already has over 100 million users, and that number will only keep rising now that OpenAI has released the official iOS mobile app.

The tools can answer questions in seconds and instantly generate easy-to-understand responses. While most of them offer premium plans, their basic functions are free.

Learn More: The Risks of Using the Internet to Diagnose Yourself

Compared to scheduling an in-person visit with a healthcare provider, using ChatGPT seems like a more affordable, convenient, and easy way to get the help you need. Plus, it can answer almost anything you ask, albeit not necessarily correctly. But experts say it should not be used to answer questions about your health or to provide medical advice.

“No such unregulated device should be used for medical advice, given the potential high stakes of people misunderstanding or applying such information to their health,” Jonathan Chen, MD, PhD, assistant professor of medicine and physician-scientist at Stanford University School of Medicine, told Verywell. “Given that both patients and clinicians are likely already doing so anyway, they need to be informed about what they’re actually getting and doing.”

Here’s when you should and should not use AI for health-related questions, and the possible harms to be aware of if you choose to.

When Is It Safe to Use ChatGPT for Health-Related Questions?

Even though AI services like ChatGPT should not be used for medical advice, Rigved Tadwalkar, MD, a board-certified cardiologist at Providence Saint John’s Health Center, told Verywell that they can be useful for providing general information about health conditions, certain medications, diseases, and other medical topics.

For example, if you want to learn more about the flu, an AI tool can provide responses that cover its symptoms, causes, risk factors, and treatments. The tools can also be helpful if you have questions about a specific medication, such as why it’s used and what its possible side effects are.

Tadwalkar said that the tools can be a helpful resource, especially for simple things.

“That’s really where something like AI can shine, and where we see that there is some degree of reliability in the responses,” he said.

According to Chen, tools like ChatGPT could also be useful for summarizing or explaining information in non-medical terms, especially from reliable medical resources, websites, and scientific studies.

The tools could also be a useful reference and dialogue partner to help people prepare for medical visits—for example, they can ask a bot what information they should bring and what questions they should ask their providers at their next appointment.

Preliminary studies have suggested that ChatGPT does much better than previous online symptom-checker tools at guessing a medical diagnosis and suggesting the appropriate “triage,” that is, whether a person should stay home, see their provider, or go to the ER for treatment. However, Chen said that these tools were still wrong more than 10% of the time.

“It’s more of an informational tool as opposed to being diagnostic and giving definitive advice the way that most people would want it to,” Tadwalkar said. “This is a good informational resource on a lot of occasions, but it’s not the end-all because it is just not quite at that level.”

Learn More: Study: Searching Your Symptoms on Google May Lead to a Better Diagnosis

Why You Should Not Use ChatGPT for Medical Advice

While ChatGPT can be beneficial in some situations, Tadwalkar said that people should not rely on these tools for definitive medical advice because their answers and suggestions can be incorrect.

“Many either may not be aware of this fact or may be taking it lightly, but ChatGPT can flat out lie sometimes,” he said. “This is where it becomes dangerous.”

In addition, Chen said that patients who rely on ChatGPT for medical information about their symptoms could panic unnecessarily.

“There are harms from both over- and under-diagnosis,” said Chen. “Obviously, if a patient is falsely reassured by a chatbot and declines to seek real medical attention, delays or misses in critical diagnoses (e.g., heart attack or cancer) would be devastating.”

Another drawback of using ChatGPT is that its responses may not reflect new and developing medical research. Because medical knowledge evolves quickly, a chatbot’s answers can easily be outdated or outright false.

Tadwalkar said that ChatGPT and other AI services do not take personalized information about a patient, like family history, general medical history, medication usage, diet, weight, height, and other lifestyle factors, into account unless it is entered. Therefore, patients who use the service may not get accurate medical advice about their specific diagnoses and treatment plans.

“Users are not often inputting that degree of information. Even if they did, I’m not sure if the AI is in a place where it can recognize all of that,” he said.

Learn More: A Chatbot Can Answer Your COVID Questions

What to Know Before You Ask ChatGPT for Medical Advice

If you decide to use ChatGPT or other AI services for medical questions and concerns, experts say there are some things you should consider and some best practices to follow.

Be Specific and Provide Details

Be sure to provide enough details related to the question you’re asking. For example, Tadwalkar said that if you’re trying to figure out what could be causing your cough, mention how long you’ve had it and whether you also have other symptoms, like fever and chills.

You may also want to consider adding some general information about your medical background, such as if you have a history of asthma or chronic obstructive pulmonary disease (COPD).

Be Careful About Sharing Personal Info

It’s fine to enter general information about your health, but Chen said that you need to be cautious about giving chatbots any private data. When you share personal information like your date of birth or a list of all of your medications with a tool, that information is also shared with the companies that own the tech.

“When you enter (copy-and-paste) any personal medical information into these systems, you are uploading private information to a big tech company for them to do whatever they wish to do with it,” he said.

However, users can work around this by phrasing questions in general, anonymized terms. For example, you could try asking: “I have a friend who’s over 60 years old and has had a chronic dry cough for three months. What diagnoses and tests should they consider?”

Learn More: ChatGPT Is Great at Taking Medical Licensing Exams. But Can It Replace Doctors?

Do Your Research and Check Sources

Chen said you should do more research and cross-reference any medical advice you get—be it from a chatbot, internet source, or even a human provider.

Use sources that are current and credible—for example, government or educational sources (e.g., websites that end in .gov or .edu). This will help make sure that you’re getting accurate, up-to-date health information.

Even with these suggestions in mind, both Tadwalkar and Chen said that if you turn to ChatGPT for medical advice, you should still follow up with your provider for an in-person or telehealth visit.

“Patients should still see a physician,” Tadwalkar said. “I look at these AI chatbots like ChatGPT as being just complementary. It’s just another tool in the toolbox. Just like how when you would Google something, you would still go see your physician to see if there’s some accuracy. I would do the same with these AI chatbots.”






While artificial intelligence (AI) services like ChatGPT can be useful educational tools, experts say you should not rely on them for medical advice. If you do use an AI chatbot for health information, you should still follow up with your provider for personalized care.