With the continuous influx of new medical technologies, one would assume that interoperability was a given; that information systems would work seamlessly together within and across organisations to achieve optimum healthcare for all. Yet, despite our digitally advanced world, there are many issues with the compatibility of industry machines. Hindered by legal frameworks, restrictive covenants, regulation and non-reciprocal technology, interoperability cannot advance or properly support medical professionals, resulting in less-effective patient care.
In the world of medical imaging, interoperability has become increasingly important. No surgery is performed without an image being taken, making imaging an essential step in the creation of an effective and efficient patient care plan. Imaging also helps to determine, at the final stages, whether or not such a plan can be deemed successful, and allows a continuous exchange of information between doctors and their patients.
“Patients are demanding access to their imaging studies, because they are getting savvy to the fact that digital data can help them avoid unnecessary, dangerous and time-wasting testing; fend off co-pays; take charge of getting second opinions; and see specialists outside of their home health system’s network,” explained Matthew Michela, the president and CEO of lifeIMAGE, in a 2016 essay ‘The Road to Interoperability’.
Interoperability remains very much a grey area, surrounded by confusing legislation, complex technologies and even more complex acronyms. So, what can be done to make it work properly?
Collaboration for success
In order to cut unnecessary costs from healthcare delivery, especially in the US – where medical costs can run very high – imaging stakeholders are beginning to show interest in working with providers who open up their networks to image sharing, and who better organise their technology systems for improved connectivity. Likewise, those that form such partnerships are demanding that their healthcare systems embrace data interoperability. Within the industry, there is a sincere desire to work together, but it is held back by a number of stumbling blocks.
However, as Michela explains, “the current technology of EHR/RIS data system silos creates technology roadblocks between medical images and patients, as well as their care teams.”
In principle, such systems are designed to encourage connectivity and enable different teams to work together – a simple concept that should, by now, be achievable.
As it stands, healthcare organisations are collecting, storing and analysing more medical images than ever before. This growing archive is a valuable clinical resource, but it also makes digital imaging networks such as picture archiving and communication systems (PACS) critical to health IT infrastructure.
PACS provides storage and convenient access to medical images such as ultrasounds, MRIs, CTs and X-rays, and became more popular during the transition from film images to digital files.
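To make that access concrete, most PACS exchange studies using the DICOM standard, meaning a workstation can query an archive over the network for a patient's prior scans. The following sketch is purely illustrative – the archive address, port and patient identifier are assumptions, not real systems – and uses the open-source pynetdicom library to run a DICOM C-FIND query against a hypothetical PACS node.

# Illustrative sketch only: ask a hypothetical PACS archive which studies it
# holds for one patient, using a DICOM C-FIND query via pynetdicom.
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import PatientRootQueryRetrieveInformationModelFind

ae = AE(ae_title="WORKSTATION")
ae.add_requested_context(PatientRootQueryRetrieveInformationModelFind)

query = Dataset()
query.QueryRetrieveLevel = "STUDY"
query.PatientID = "12345"   # hypothetical identifier
query.StudyDate = ""        # blank fields ask the archive to return them
query.StudyDescription = ""

# The archive address and port below are assumptions for the example.
assoc = ae.associate("pacs.example.org", 104)
if assoc.is_established:
    for status, identifier in assoc.send_c_find(
            query, PatientRootQueryRetrieveInformationModelFind):
        if status and status.Status in (0xFF00, 0xFF01):  # 'pending' = a match
            print(identifier.StudyDate, identifier.StudyDescription)
    assoc.release()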
“Today’s move towards collaborative care means more physicians need to have access to these images and image data, like radiology reports,” explained IDC Research, a global provider of market intelligence, in a recent document. “Providers making care-management decisions want longitudinal records that provide a 360° patient view.”
According to the Office of the National Coordinator for HIT, 97% of the hospitals tracked by the US Government possess a certified EHR system but, because patients use different providers in multiple locations – including hospitals, physician offices, post-acute care facilities, pharmacies, retail clinics, labs and imaging facilities – it is difficult to put the relevant medical information into the hands of those who need it.
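One increasingly common route for bridging those locations is the HL7 FHIR standard, a REST-based interface that certified EHRs are progressively exposing. The short sketch below is an assumption-laden illustration rather than a description of any particular vendor's system: the server address and patient identifier are hypothetical, and it simply asks a FHIR R4 server for the ImagingStudy records attached to one patient.

# Illustrative sketch only: fetch imaging-study metadata for one patient from a
# hypothetical HL7 FHIR R4 server; the endpoint and patient ID are assumptions.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/R4"  # assumed endpoint
PATIENT_ID = "12345"                                 # hypothetical identifier

resp = requests.get(
    f"{FHIR_BASE}/ImagingStudy",
    params={"patient": PATIENT_ID},
    headers={"Accept": "application/fhir+json"},
    timeout=10,
)
resp.raise_for_status()

# FHIR search results arrive as a Bundle; each entry wraps one ImagingStudy.
for entry in resp.json().get("entry", []):
    study = entry["resource"]
    modalities = [s.get("modality", {}).get("code") for s in study.get("series", [])]
    print(study.get("started"), study.get("description"), modalities)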
The patient comes first
Between 2010 and 2013, US hospitals spent $47 billion annually on health information technology (HIT), according to the American Hospital Association. This eye-watering amount gives some indication of the necessity of digital systems within healthcare – of which imaging is an essential part. With so much being spent, those in charge of IT and imaging systems must have full faith in a product's capability and longevity; it must stand the test of time, but also be adaptable to digital developments.
Years of IT specialism have spawned disparate systems, each driven by a particular strategy for handling a specific type of data. Imaging specialists benefitted first, through picture archiving and communication systems; other specialists and general practitioners followed with electronic medical record systems.
Today, as data is spread far and wide, the strategies that guided the acquisition, storage and transmission of specific types of data are being homogenised to allow access by caregivers across departments and enterprises, regardless of the type of department or whether the data was collected in inpatient or outpatient facilities.
And so they should be. Data is collected for the sole reason of improving patient health and safety. Care coordination facilitates good healthcare and helps to keep a lid on costs by optimising drug expenditure, testing and billing, according to research published by Excerpta Medica.
With this in mind, it makes perfect sense that clinical data be acquired and shared interoperably and seamlessly in forms usable by doctors, nurses, nurse practitioners, other staff and patients.
This has led some providers to seek out a ‘single-stack solution’ – as ITNOnline.com terms it – a single healthcare IT system that handles all facets of diagnosis and treatment.
To serve the patient, data must be accurate, and that accuracy must be maintained during the exchange. Critically important information must not be held up by methods used to ensure its security, such as its encryption and decryption, or because the caregiver doesn’t know the password.
Efficiency is also important for the provider to remain financially viable. Patients must be managed effectively despite continuing reductions in reimbursements as medical practice shifts from fee-based to value-based care.
Effective, well-organised data exchange is crucial if the patient is to benefit and the provider is to create a system that works.
When automating processes, therefore, IT vendors must ensure that the automation not only accelerates performance, but does so without compromising patient care. Developers must also remain vigilant in finding and adopting new technologies that encourage efficient and effective data sharing.
Imagine that
In recent years, not all of those involved in caring for patients had access to their complete records. This created a dangerous grey area where stakes were high and mistakes could be deadly.
But, hopefully, now that interoperability has made it to the core of modern healthcare system processes, this should become a thing of the past.
Ultrasound imaging plus alpha fetoprotein test may improve early liver cancer detection
A new study at UT Southwestern’s Simmons Cancer Center has found that ultrasound imaging in combination with a blood test for alpha fetoprotein (AFP) could lead to an improvement in the detection of early-stage liver cancer by up to 40%.
Earlier detection is essential for improving the survival of patients with liver cancer, a disease that is on the rise and the fastest-increasing solid-tumour cancer in the US. The study was a meta-analysis of 32 previous studies, and the findings were recently published in the journal Gastroenterology.
Early detection of cancer enables doctors to perform curative therapies, thereby extending patients' survival by many years.
However, most liver cancers are diagnosed at later stages, when curative treatment is not possible and the rate of survival is much lower.
The US National Cancer Institute (NCI) states that the incidence of most cancers is decreasing in the US due to better living standards. The incidence of liver cancer, however, has risen by 2.7% annually over the past decade, and the NCI predicts that approximately 40,700 new cases will be diagnosed in the US alone in 2018.
Risk factors for liver cancer, also known as hepatocellular carcinoma (HCC), include chronic heavy alcohol consumption, non-alcoholic fatty liver disease related to diabetes and obesity, and hepatitis C infection.
The symptoms of HCC include loss of weight or appetite, white chalky stools, upper abdominal pain or swelling and general fatigue.
Liver cancer screening guidelines for patients with cirrhosis vary, with some calling for imaging alone and others for imaging plus blood tests.
Liver cancer screening in patients with chronic liver disease has traditionally been performed using an abdominal ultrasound. While ultrasound is readily available and non-invasive, it misses many cancers when they are small.
“Our results highlight the importance of continued development and validation of blood-based biomarkers for the early detection of liver cancer,” says Dr Singal, Dedman Family Scholar in Clinical Care at UT Southwestern. “Most importantly, our results support a change in clinical practice, and the routine use of ultrasound and biomarkers together for liver cancer screening.”