Using AI to Respond to Patient Questions
Can technology, specifically chatbots, be used to provide health-related information to patients? If so, how would their responses compare with those of human healthcare professionals?
There has been substantial discussion of artificial intelligence (AI), including chatbots such as ChatGPT, in the past several months, including its integration within the healthcare domain. Past articles in this series have examined considerations pertinent to implementing this technology in the healthcare field. One of the most frequently contemplated aspects of this emerging technology is the extent to which it will replace human interaction.
As discussed in greater detail in past articles, there are a number of barriers to the implementation of AI in healthcare that are unique to the field. One such consideration is that consumers of healthcare, specifically patients, expect not only to receive treatment from the system but also to feel cared for through their interactions with healthcare professionals. At least in part for this reason, it has previously been noted that the general public does not want technology to replace human providers in healthcare, but rather to serve as a double check or second opinion.
One widely discussed application of AI chatbots is their ability to provide detailed information in response to questions, a widely held capability that has been explored in many different capacities. A common component of the practice of medicine is providing information in response to questions posed by patients, family members, and the general public. Given the capabilities of chatbots, it is interesting to consider whether responding to health-related questions may be a potential application of the technology.
To investigate this question, Ayers et al (2023) compared responses provided by physicians with those from a chatbot. They conducted a cross-sectional study that analyzed responses provided by physicians on a public forum on the social media platform Reddit and compared them with responses generated by ChatGPT, an AI chatbot. The quality of information and degree of empathy in each response were assessed by a group of three healthcare professionals.
The study analyzed a total of 195 randomly chosen questions and their responses. The chatbot's responses were preferred over the physicians' in 78.6 percent of cases. The physicians' responses were noted to be significantly shorter than the chatbot's, and the quality of information in the chatbot's responses was rated significantly higher than that of the physicians'. The chatbot's responses were also rated as significantly more empathetic than the physicians'. The authors concluded that further research on applying chatbot technology in clinical scenarios is necessary, but that there is reason to expect its integration would be beneficial.
Several limitations inherent in the study are important to consider. The population of physician responders may not be representative of the larger population of healthcare professionals because, first, they were participating in a Reddit social media forum and, second, and perhaps more importantly, they were not the patients' actual physicians. While empathy can be conveyed through the written word, direct human-to-human interaction is a more significant determinant of empathy. As such, the results of this study do not necessarily convey the true empathic content of physician responses in a real-world scenario and do not generalize to in-person conversations between patients and their healthcare professionals.
Furthermore, the adequacy of the content and the degree of empathy of the responses were both judged by healthcare professionals rather than by patients. The potential bias of this assessment, as opposed to a patient-based evaluation, must be contemplated. It would be informative to include patient evaluation of the responses, including a direct comparison between the physician-based and patient-based evaluations for each response. It is also important to consider, as outlined in previous articles in this series, that patients have directly articulated the importance they place on human interaction within the healthcare setting, particularly in relation to receiving care. This preference should be contemplated when integrating technology such as AI chatbots into healthcare environments.
What the evidence from this study does portray is the potential contribution that AI, in particular chatbots, could make to the content of the information physicians provide to their patients and the general public. Combining this study's evidence with that discussed in past articles, one can envision chatbots serving within the clinical setting as a second opinion or double check on the information provided by healthcare professionals. In addition, by using this technology to assist in responding to patient questions, this task could be transitioned away from healthcare professionals. Alleviating this responsibility, at least in part, would allow healthcare professionals to focus their attention on roles and tasks that only they can perform and would reduce a substantial and time-consuming aspect of current practice.
The evidence provided by the study demonstrates an application of AI that could benefit patients, the general public, and healthcare professionals alike. It reveals the capability to leverage the technology in a complementary fashion within healthcare. This aligns with one of the perspectives of House Call Media: to implement AI and related technology in a fashion that contributes to the improved provision of healthcare.
To learn more, please visit www.housecallmedia.com
Join our Substack
Follow us on Instagram
REFERENCE
Ayers JW, Poliak A, Dredze M, Leas EC, Zhu Z, Kelley JB, Faix DJ, Goodman AM, Longhurst CA, Hogarth M, Smith DM. Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum. JAMA Intern Med. Published online April 28, 2023. doi:10.1001/jamainternmed.2023.1838