By Wilson To and Patrick Combes
Navigating the health care system is often a complex journey involving multiple physicians across hospitals, clinics, and general practices. At each junction, healthcare providers collect data that serve as pieces in a patient's medical puzzle. When all of that data can be shared at each point, the puzzle is complete and practitioners can better diagnose, care for, and treat that patient. However, a lack of interoperability inhibits the sharing of data across providers, meaning pieces of the puzzle can go unseen, potentially impacting patient health.
The challenge of achieving interoperability
True interoperability requires two parts: syntactic and semantic. Syntactic interoperability requires a common structure so that data can be exchanged and interpreted between health information technology (IT) systems, while semantic interoperability requires a common language so that the meaning of data is transferred along with the data itself. Together, the two enable data fluidity. But for this to work, organizations must apply technologies like artificial intelligence (AI) and machine learning (ML) across that data to shift the industry from a fee-for-service model, where government agencies reimburse healthcare providers based on the number of services they provide or procedures ordered, to a value-based model that puts the focus back on the patient.
The industry has started to make significant strides toward reducing barriers to interoperability. For example, industry guidelines and resources like the Fast Healthcare Interoperability Resources (FHIR) standard have helped set a baseline, but there is still more work to be done. Among the biggest barriers in Canada right now are the significant variations in the way data is shared, read, and understood across healthcare systems, which can result in information being siloed and overlooked or misinterpreted. For example, a doctor may know that a diagnosis of dropsy or edema may be indicative of congestive heart failure; a computer alone, however, may not be able to draw that parallel. Without syntactic and semantic interoperability, that diagnosis runs the risk of getting lost in translation when shared digitally with multiple health providers.
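To make the dropsy/edema example concrete, here is a minimal sketch of how semantic interoperability might work in practice: local terms are mapped to a shared terminology code and carried in a FHIR-style Condition resource. The term-to-code mapping, the patient reference, and the resource fields are illustrative assumptions, not drawn from any real system.

```python
# Illustrative only: mapping synonymous local diagnosis terms to one shared
# terminology code, then embedding that code in a minimal FHIR-style
# Condition resource so other systems can interpret the diagnosis.

# Hypothetical local-term dictionary; in practice this mapping would come
# from a terminology service (e.g. one backed by SNOMED CT).
TERM_TO_CODE = {
    "dropsy": "267038008",   # archaic term for edema
    "edema": "267038008",
    "oedema": "267038008",   # British spelling
}

def to_fhir_condition(local_term: str, patient_ref: str) -> dict:
    """Build a minimal FHIR-style Condition carrying a shared code."""
    code = TERM_TO_CODE[local_term.lower()]
    return {
        "resourceType": "Condition",
        "subject": {"reference": patient_ref},
        "code": {
            "coding": [{
                "system": "http://snomed.info/sct",
                "code": code,
                "display": "Edema",
            }],
            "text": local_term,  # original wording is preserved for clinicians
        },
    }

# Two providers record the same finding under different names, yet the
# resulting resources are semantically equivalent.
a = to_fhir_condition("dropsy", "Patient/123")
b = to_fhir_condition("edema", "Patient/123")
assert a["code"]["coding"][0]["code"] == b["code"]["coding"][0]["code"]
```

The key design point is that the machine-readable `coding` travels alongside the human-readable `text`, so downstream systems can reason over the shared code without losing the clinician's original phrasing.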
Bringing insight to the exam room with the help of AI and ML – what’s next?
Advanced technologies like AI and ML can help health care organizations achieve both syntactic and semantic interoperability. For example, Vancouver General Hospital (VGH) and University of British Columbia (UBC) researchers are using advanced technologies from Amazon Web Services to create their own machine learning models that can triage x-rays to provide a better healthcare experience. Imagine a patient comes into the hospital with symptoms of pneumonia. The doctor can take an x-ray, which is then analyzed by an ML model trained to interpret the image for indications of infection. The algorithm can then determine the priority for that study to be seen by a radiologist. The result? The patient in need can be prioritized, evaluated more quickly, and put on a treatment plan in less time than it would ordinarily have taken to capture, assess, and diagnose. With healthcare data that is interoperable, hospitals are poised to leverage AI to complement the complete clinical record and deliver care more effectively and efficiently. More importantly, it can break down silos: using technology to evaluate scans will help to create a standard that can bring us closer to delivering on the promise of patient-centered medicine.
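The triage workflow described above can be sketched as a priority queue ordered by a model's risk score. This is a stand-in illustration, not the VGH/UBC implementation: the studies, scores, and the idea of a precomputed classifier output are all assumptions.

```python
# Sketch of ML-driven x-ray triage: each study carries a model-estimated
# probability of infection, and the radiologist worklist pops the
# highest-risk study first. Data and scores below are hypothetical.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Study:
    priority: float                      # lower value = read sooner
    accession_id: str = field(compare=False)

def build_worklist(studies: list[dict]) -> list[Study]:
    """Queue studies so the highest model score surfaces first."""
    heap: list[Study] = []
    for s in studies:
        # Negate the score: heapq is a min-heap, so high-risk studies
        # (score near 1.0) sort to the front.
        heapq.heappush(heap, Study(-s["model_score"], s["id"]))
    return heap

worklist = build_worklist([
    {"id": "XR-001", "model_score": 0.12},
    {"id": "XR-002", "model_score": 0.91},  # likely pneumonia
    {"id": "XR-003", "model_score": 0.47},
])

first = heapq.heappop(worklist)
assert first.accession_id == "XR-002"  # highest-risk study is read first
```

In a real deployment the score would come from a trained image classifier and the queue would feed the radiology worklist; the point here is only that a single comparable score is what lets urgent cases jump ahead.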
An opportunity to transform the industry
As technology creates more data across healthcare organizations, AI and ML will be essential to creating, from that data, the shared structure and meaning necessary to achieve interoperability.
As an example, one U.S. supplier of health information technology solutions is deploying interoperability solutions that pull together anonymized patient data into longitudinal records, which can then be enriched with physician correlations. Coupled with other unstructured data, these records power the supplier's machine learning models and algorithms that help with earlier detection of congestive heart failure.
As healthcare organizations take the necessary steps toward syntactic and semantic interoperability, the industry will be able to use data to place renewed focus on patient care. In practice, one of the available digital platforms stores and analyzes 15 petabytes of patient data from 390 million imaging studies, medical records and patient inputs, adding as much as one petabyte of new data each month. With machine learning applied to this data, the company can identify at-risk patients, deliver definitive diagnoses and develop evidence-based treatment plans to drive meaningful patient results. That orchestration and execution of data is the definition of valuable patient-focused care, and the future of what we see for interoperability driven by AI and ML in Canada. With access to the right information at the right time informing the right care, health practitioners will have access to all pieces of a patient's medical puzzle, and that will bring meaningful improvement not only in care decisions, but in patients' lives.
Wilson To is Global Healthcare Business Development lead and Patrick Combes is Global Healthcare IT Lead at AWS.