New research published in Health Affairs suggests that use of electronic health records (EHRs) can help improve quality of care after all. During the six-year study period (2008-2013), researchers found that mortality rates were initially higher among hospitals with more digital capabilities, but fell over time, as hospitals learned how to work with the technology and adopted new capabilities. 

A common viewpoint is that EHRs are not improving clinical care, with many doctors complaining that EHR documentation increases their administrative burden. However, as the researchers note, their findings underscore the importance of allowing time for the technology to prove its worth. 

"In other industries, widespread digitisation took a decade to realise improvements," said senior author Julie Adler-Milstein, PhD, an associate professor in the Department of Medicine, University of California, San Francisco, and the Philip R. Lee Institute for Health Policy Studies. "It's a major transformation of the healthcare system to go from paper to digital. We are seeing those rewards, but it has taken time and work."

In the study, smaller and non-teaching hospitals gained greater benefits from digitisation. The researchers hypothesised this was because larger and teaching hospitals had ongoing efforts to improve hospital quality, and therefore had less room to improve through EHR adoption. In contrast, for smaller and non-teaching hospitals, EHR adoption may have represented a large, highly visible quality improvement initiative that also prompted broader quality efforts. 

The researchers examined data from 3,249 hospitals across the United States, measuring quality by looking at 30-day mortality rates for 15 common conditions for patients who were 65 years and older. They selected a study timeframe beginning in 2008, because that is when national data was first collected about the adoption of EHRs. While many hospitals, particularly large and teaching hospitals, already had EHR capabilities by then, many adopted new technology following passage of the HITECH Act – Health Information Technology for Economic and Clinical Health – which provided $30 billion in 2009 to stimulate a broad national investment in new technology.

To better parse the differential effects of EHR adoption, the research team examined three distinct phases: baseline EHR functions; the maturation of these baseline functions; and the adoption of new EHR functions. 

The results showed that baseline adoption was associated with a 0.011 percentage point higher mortality rate per function. Over time, the maturation of these baseline functions was associated with a 0.09 percentage point lower mortality rate per function per year. The third category – adoption of new EHR functions – was associated with a 0.21 percentage point reduction in mortality rate per year per function.

"Hospitals implement functionality over time, because it's really hard to go from fully paper to fully electronic overnight," Adler-Milstein explained. "We measured EHR adoption in a way that was truer to the way adoption likely occurred. As hospitals added functionalities over time, there was benefit from each of those new features."
