Healthcare quality analysis has always been a challenge, even as the push for metrics has intensified over the past two decades and even with the shift to digital data.
In the early 1990s, Jonathan Handler, MD, CMIO at the health software and data firm M*Modal, was working as an emergency physician and informaticist at a large health system in the Midwest. He remembers trying to access treatment and diagnosis data — and the hurdles he encountered.
“It was basically impossible for me to get any clinical data out of our clinical systems. Not only was it technically difficult, because the systems were actually not designed to share information, but the institution that I worked for was actually not that wound up about doing the work and all the other things related to giving me the information,” Handler said. “Anytime you had to do any kind of study — population health, administrative work — you basically had to make an individual request from IT, and it was a months-long process.”
As physicians may do today with EHR systems, he found a workaround — in this case, getting administrative data with ICD-9 codes from office staff and using those codes as a stand-in for clinical data.
Today, with modern EHR systems, clinicians may have an easier time getting clinical data — but not all of it, which is a problem for providers pursuing population health goals. It’s also a problem as federal health officials and patient-safety organizations like the National Quality Forum try to transition from process-based quality measurements — of which there are many, and which can be onerous to report — to outcomes-based metrics, as NQF CEO Christine Cassel, MD, told the Senate Finance Committee recently.
One problem is that only about 10 percent of measures submitted to the NQF for endorsement are specified for digital use. Another is that, as of early 2012, only 60 percent of physicians were able to generate population health quality data with their EHRs, according to a recent Annals of Internal Medicine study.
Driving that second problem in particular may be the fact that EHRs have been adopted with data silos intact, from payers, pharmacies and other providers in a patient’s sphere. “If we’re really going to drive population health and research, it has to be reasonably feasible for the appropriate people to get access to the rest of the data — the problem lists, the medication lists, the orders that were done, the natural language-understood codes generated out of all the free text,” M*Modal’s Handler said.
Without that data integration, it’s difficult for physicians to do the type of care redesign the ONC’s Farzad Mostashari, MD, has been espousing — figuring out exactly how many of your diabetic patients are meeting their A1C test and medication regimens, for example.
“You have to see a much bigger picture — quality metrics, cost and utilization metrics, and those things are typically not reported in an EMR,” said Deborah Robin, MD, medical director at the accountable care technology company Lumeris and a geriatrician and rheumatologist who spent 24 years working in research and teaching at Vanderbilt University Medical Center.
Now working to help providers, especially in accountable care ventures, use data from their EHRs with relevant patient data from payers, pharmacies and other providers, Robin can understand both the quality metric fatigue and the frustration with EHR limitations that physicians have been encountering, such as in the Annals survey.
“I was a fee-for-service physician. I worked for an institution that had a phenomenal EMR and CPOE, which made my documentation and ability to see information about my patients extraordinarily easy,” Robin said. “But there really wasn’t quality and cost-utilization transparency in a sort of coordinated way. The institution was really focused on quality, but the cost and utilization piece was not as well developed.”
Finding the right tests and procedures for the right patients and avoiding over-utilization is, of course, a key goal of health reform’s accountable care. But it’s hard for providers to track utilization without the right data, which, when care happens at another provider, is usually in the hands of payers, Robin said.
Since the federal government is the largest recipient of Medicare quality data and is also tracking other provider metrics, what could it do? “Clearly there are valiant efforts to move in this direction, and they are sort of the ignitors,” Robin said. “But if payers and providers could collaborate together in an open way, that would help move all of this together.”
Offering physicians clinically integrated data on one platform is one of the best options available for both population health management and quality reporting, Robin said. “If data can be pulled in from EMRs, labs, PBM and payers and you pull it together and you superimpose analytics, you can truly provide accountable care,” Robin added. “The data is now actionable information.”
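The kind of integration Robin describes can be illustrated with a minimal sketch. This is a hedged, hypothetical example, not any vendor’s actual implementation: it assumes each feed already exports records keyed by a shared patient identifier, when in practice patient matching across payers, labs and pharmacies is a hard problem in its own right.

```python
from collections import defaultdict

def integrate_sources(*sources):
    """Merge records from multiple hypothetical feeds (EMR, lab,
    pharmacy, payer) into one view per patient, keyed on a shared
    patient_id. Real platforms would need record linkage, not a
    simple shared ID."""
    patients = defaultdict(list)
    for source in sources:
        for record in source:
            patients[record["patient_id"]].append(record)
    return dict(patients)

# Hypothetical records from three previously siloed systems
emr = [{"patient_id": "p1", "source": "emr", "diagnosis": "diabetes"}]
labs = [{"patient_id": "p1", "source": "lab", "a1c": 8.1}]
pharmacy = [{"patient_id": "p1", "source": "pharmacy", "filled": True}]

combined = integrate_sources(emr, labs, pharmacy)
```

Once the records sit in one structure per patient, analytics can be layered on top — which is the step that turns the raw feeds into the “actionable information” Robin describes.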
Those platforms can help create worklists of patients, say of diabetics, and show data on hemoglobin A1C testing and even data on what other providers are prescribing and whether the medications are filled.
“If you can get a list of all those who haven’t got tests in the last 6 months, someone can call the patients to come in and get one. And then you can look at all the patients, see which ones might be a good fit for disease management programs,” Robin said.
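That worklist logic amounts to a simple filter over integrated patient records. The sketch below is illustrative only, under assumed field names (`last_a1c_date`, `name` are hypothetical) and a rough 183-day window standing in for “the last 6 months”:

```python
from datetime import date, timedelta

def overdue_for_a1c(patients, today, window_days=183):
    """Return diabetic patients whose last hemoglobin A1C test is
    missing or older than the window (roughly six months)."""
    cutoff = today - timedelta(days=window_days)
    worklist = []
    for p in patients:
        last_test = p.get("last_a1c_date")  # None if never tested
        if last_test is None or last_test < cutoff:
            worklist.append(p["name"])
    return worklist

patients = [
    {"name": "Patient A", "last_a1c_date": date(2012, 11, 1)},
    {"name": "Patient B", "last_a1c_date": None},
    {"name": "Patient C", "last_a1c_date": date(2013, 6, 1)},
]

# Patient A (tested in Nov 2012) and Patient B (never tested)
# are overdue as of July 2013; Patient C is current.
print(overdue_for_a1c(patients, today=date(2013, 7, 1)))
```

The same filtered list could then be handed to staff for outreach calls, or cross-referenced against disease management program criteria, as Robin suggests.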
That type of data integration may also, in the long run, simplify quality reporting and make it more meaningful — with data on the outcomes of those diabetic patients, rather than just the processes used during the physician-patient encounter.