
Artificial intelligence and patient care: Considerations of medication safety


Artificial Intelligence (AI) has generated significant excitement globally. One notable AI system is ChatGPT (https://chatgpt.com/), which has become widely used for information seeking, idea generation, and more. It has simplified the lives of many by providing rapid access to information and resources, influencing not only individuals but also the health care system. However, this convenience and efficiency come with uncertainty about the accuracy of the information AI generates.

AI in Clinical Practice: Context Matters

The increasing utilization and accessibility of AI present potential risks to patients and to clinical practice. Online tools, such as symptom checkers, are now publicly available. These tools are "trained" to generate a list of possible diagnoses based on the symptoms entered by the patient. A user reporting stomach pain may receive diagnoses ranging from constipation to colon cancer. With such a wide range of "possible diagnosis" outputs, these tools often lack the nuanced clinical context that healthcare providers would consider in their patient assessments. Patients might self-diagnose and self-treat, potentially worsening their condition or exposing themselves to side effects from inappropriate medication use. In addition, with increasing access to online drug-drug interaction checkers, patients may identify a potential interaction, self-assess, and decide to stop taking their medication(s) without consulting their healthcare providers, thereby compromising their health outcomes.

The advancement of technology facilitates information access – both information seeking (by patients and healthcare practitioners) and information generation (e.g., by various online tools and AI systems). However, using and applying the information obtained requires clinical reasoning and critical appraisal skills, along with consideration of the clinical contexts unique to individual patients. In health profession education, students and trainees undergo clinical training to develop critical thinking skills and to apply their clinical knowledge to diverse patient scenarios. No two patients are alike. It is the role and responsibility of a healthcare professional to contextualize patient health information and make appropriate recommendations accordingly.

AI in Clinical Practice: Taking Caution

Despite undergoing rigorous training, healthcare providers may still miss essential clinical contexts. Prescribing algorithms can assist healthcare providers in performing patient assessments to some extent, but they should not be relied upon alone. This caution arises from the recognition that a patient's health is influenced by a complex array of factors, particularly the social determinants of health. It is also important to acknowledge the potential biases in AI-generated algorithms arising from their development process, the developers involved, and the specific patient populations studied. Clinical reasoning may be lacking in these algorithms, and clinical contexts and social determinants of health may not be fully considered. Critical thinking is crucial, particularly when weighing red-flag symptoms to make differential diagnoses; should red-flag symptoms be missed within the context of a patient's characteristics, the impact on their health can be profoundly negative.

Over-reliance on AI tools may have adverse effects on the health outcomes of marginalized populations, as these algorithms might not account for the unique circumstances of individuals in such groups. Patients often fall into a grey area not precisely covered by an AI-generated algorithm's criteria. In such cases, healthcare providers must exercise clinical judgment to make the best possible recommendations, underscoring the importance of clinicians possessing a thorough understanding of, and clinical reasoning about, the conditions they are assessing.

Final Remarks: AI is Here to Stay

The use of AI should be embraced responsibly, by healthcare providers and patients alike. As AI technology becomes smarter and simpler to use, healthcare professionals have greater accountability than ever to contextualize and critically appraise the information it provides. Doing so will help mitigate potential biases in AI tools that may affect patient health outcomes.

Recently, a 2024 Journal of the American Pharmacists Association (JAPhA) publication (https://www.japha.org/article/S1544-3191(24)00139-0/abstract) found that medication-related patient education materials generated by ChatGPT varied in accuracy. Therefore, with respect to medication safety, patients should always consider the following five questions to ask about their medications (https://www.ismp-canada.org/medrec/5questions.htm) during a clinical encounter with their healthcare providers:

• Are there any changes in my medications?

• What medications do I need to keep taking?

• How should I use this medication correctly?

• What should be monitored while taking this medication?

• Is there any follow-up required after starting this medication? 

Using AI to answer medical or medication-related questions may seem like a reasonable approach; however, be mindful that patients' health is influenced by a complex set of factors that AI is not yet well equipped to address.

By Samir Kanji, Neil Patel, and Certina Ho

Samir Kanji is a PharmD Student at the Leslie Dan Faculty of Pharmacy, University of Toronto; Neil Patel is a 2023 PharmD graduate of the Leslie Dan Faculty of Pharmacy, University of Toronto; and Certina Ho is an Assistant Professor at the Department of Psychiatry and the Leslie Dan Faculty of Pharmacy, University of Toronto.
