ChatGPT Is As Good As Humans At Answering Healthcare Queries, Says Research

ChatGPT, and AI endeavors like it, might well go on to shape our destinies. After all, AI has already shown that it can help design the processors of other machines, and very efficiently to boot. The emphasis must be on wielding such technology as a tool that supports us, rather than handing an insidious would-be enemy the very weapons it could one day use against us. At its core, ChatGPT is a highly advanced chatbot, and it seems that its capabilities in that area may help us to live longer, healthier lives.

While health professionals will always be the first port of call for any concerns, getting access to them for every little complaint is difficult. This is why triage services and the like exist. ChatGPT, it seems, may be a helpful supplementary source of information too: it can do a surprisingly good job of doling out healthcare advice.

Here's what one June 2023 study concluded on the subject, and how AI could work alongside health professionals in the future to streamline the process for them and their patients.

The medical chat of chatbots

Oded Nov, Nina Singh, and Devin Mann's "Putting ChatGPT's Medical Advice to the (Turing) Test: Survey Study" appeared in JMIR Medical Education Volume 9. The research investigated how well a sophisticated chatbot can tackle patients' concerns, and whether patients would take its responses on board.

To accomplish this, a series of 10 legitimate medical queries was drawn from patient records in January 2023 and adapted for anonymity. ChatGPT was given the queries and prompted to respond to each one, and, for ease of comparison, was also asked to keep its answer roughly as long as that of the human health professional. From there, respondents had two important questions to answer: could they tell which answers were written by the bot, and did they trust the ones that were?
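The study relied on the public ChatGPT interface rather than any published code, but a minimal sketch of that kind of setup, assuming the OpenAI Python SDK and using a hypothetical patient question and word target rather than anything from the study itself, might look something like this:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical stand-ins for a de-identified patient query and the
# length of the clinician's original reply (not taken from the study).
patient_question = "I've had a mild headache for three days. Should I see a doctor?"
target_words = 120

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        # Ask the model to answer the patient while roughly matching the
        # human provider's response length, as the researchers did.
        {"role": "system",
         "content": f"Answer the patient's question in roughly {target_words} words."},
        {"role": "user", "content": patient_question},
    ],
)

print(response.choices[0].message.content)
```

The bot-written answer could then be shown alongside the clinician's original reply, leaving respondents to guess which was which.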

Almost 400 participants' results were tabulated, and they proved interesting. The researchers note in the study that "On average, chatbot responses were identified correctly in 65.5% (1284/1960) of the cases, and human provider responses were identified correctly in 65.1% (1276/1960) of the cases." That's just under two-thirds of the time overall, and it also appeared that there was a limit to the sort of healthcare support participants wanted from ChatGPT: "trust was lower as the health-related complexity of the task in the questions increased. Logistical questions (eg, scheduling appointments and insurance questions) had the highest trust rating," the study states.

Dr. ChatGPT? Not quite

Though the study's sample size was limited, the conclusions that can be drawn from it are intriguing. The fact that a chatbot can address patient queries essentially "as well" as professionals can is notable in and of itself, but there's more at work here. ChatGPT as we currently know it is, the study suggests, most effective at dealing with logistics. That is also the role those surveyed were most comfortable with such a system having; for anything more personal, more complex, or more directly health-related, it's a professional, human opinion that patients crave.

In May 2023, Statista reported that there were a total of 1,077,115 "professionally active physicians" in the United States, a nation of approximately 335.1 million people. In countries such as Gambia and Guinea, meanwhile, the ratio was just 0.1 doctors per 1,000 people in 2016, according to Statista. Of course, the invaluable efforts of a wide range of professionals besides physicians are key to healthcare, but this makes one thing plain: as the global population increases, it may become harder and harder to ensure personal medical support reaches those who need it.

The quick access to targeted support that chatbots could provide, if cultivated in the right way, may prove important in addressing this. If ChatGPT can build great Spotify playlists, who knows what else it might be capable of in the future.