Getting Advice from Medical Chatbots Can Be Dangerous

The old computing adage "garbage in, garbage out" captures just one reason why consulting chatbots about medical concerns can be misleading, and in extreme cases even fatal.

Chatting about your health with artificial intelligence (AI) can be an obstacle course: It's easy for chatbots to misunderstand questions or misinterpret health statements, and in a number of cases chatbots have relied on flawed or inaccurate information when making a "diagnosis" or doling out advice.

The most concerning problem appears to be the tendency of some people to accept whatever an AI chatbot tells them as "truth."

One doctor pointed out that chatbots are really just a faster way to do a Google search: if the bot draws on an inaccurate source, you'll get inaccurate advice.

Dr. Joe Galati, host of the radio program "Your Health First," advises always talking with your medical team before treating any diagnosis as official; information from an AI chatbot should never be taken as such.

"Number one, talk with your doctor. Number two, the information [from AI] may be correct, but in the wrong context of your disease," he says.

And of course a chatbot won't know, or perhaps even understand, your medical history, and that context can be crucial.

What's the right dose for your height and body weight? AI chatbots may not take that into account, and they lack the firsthand experience with drugs and other treatments that medical professionals bring to bear.

There are increasing reports of people who believed a chatbot's so-called "diagnosis," with tragic results.

"Your Health First" appears Sunday evenings at 7 pm on Newsradio 740 KTRH in Houston.
