The Centers for Medicare and Medicaid Services will have to manage and analyze nearly double the volume of Medicare data and more than triple the volume of Medicaid data once health reform is fully in place.
By 2015, the waves of Medicare claims data will explode from 370 terabytes to 700 terabytes. For Medicaid, 30 terabytes of data will multiply to 100 terabytes, according to a CMS official.
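The growth factors implied by those figures can be checked with a few lines of arithmetic (a quick illustrative script, using only the numbers quoted above):

```python
# Data volumes cited by CMS (terabytes): current level and projection by 2015.
volumes = {
    "Medicare": (370, 700),
    "Medicaid": (30, 100),
}

for program, (current, projected) in volumes.items():
    print(f"{program}: {current} TB -> {projected} TB ({projected / current:.1f}x)")
# Medicare: 370 TB -> 700 TB (1.9x)
# Medicaid: 30 TB -> 100 TB (3.3x)
```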
CMS has been upended by health reform just as providers have, and the agency is transforming how it operates and communicates with physicians and hospitals to be ready for the rollout of the health reform law in 2014.
The Patient Protection and Affordable Care Act (ACA) requires a huge IT infrastructure so that CMS can manage the shift to pay healthcare providers based on quality and to analyze the data that supports provider improvements in performance of patient care, according to Tony Trenkle, CMS CIO.
CMS has historically been a decentralized, stove-piped agency with mushrooming volumes of data and separate data centers springing up around specific programs. Now CMS is concentrating on adopting enterprise and shared services and establishing the capabilities to collect, analyze and use real-time data.
“We have very little time to change. And we’re already feeling the first winds of the tsunami,” he said at a recent conference sponsored by the Bethesda, Md., chapter of AFCEA, which promotes industry and federal agency partnership and IT innovation.
“The linchpins are data and IT,” he added. “When you talk about Big Data, it’s not only the volume, but the volume and complexity together,” he said. And there will be volumes more data as CMS migrates from fee-for-service and traditional capitation models for payment to a variety of programs that will pay based on quality, performance and shared savings.
Previously, the data that CMS collected and held related only to claims; the agency then moved into quality data. More recently, CMS has built up encounter data from Medicare Advantage (Part C) and prescription drug plans, along with some clinical data from the HITECH Act programs, starting in 2011. A large amount of the data is unstructured and not machine-readable.
CMS has created an Office of Information Products and Data Analysis to work across the enterprise and with the IT infrastructure in a coordinated effort to make data and tools more available and usable.
Data analytics is beginning to be used for fraud prevention and for scrutinizing geographic variations in care, and it will help evaluate whether the new payment models improve patient care or bend the cost curve.
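One common approach to screening geographic variation is to flag regions whose spending deviates sharply from the national average. The sketch below is illustrative only, with hypothetical payment figures, and does not represent CMS's actual fraud-analytics methods:

```python
import statistics

# Hypothetical per-region average payment per beneficiary (dollars).
# Illustrative screen: flag regions more than two standard deviations
# from the mean across all regions.
payments = {
    "Region A": 8200, "Region B": 8450, "Region C": 8100,
    "Region D": 12900, "Region E": 8300, "Region F": 8500,
}

mean = statistics.mean(payments.values())
stdev = statistics.stdev(payments.values())

flagged = [r for r, p in payments.items() if abs(p - mean) > 2 * stdev]
print(flagged)  # ['Region D']
```

In practice such outliers would be a starting point for review, not proof of fraud, since legitimate cost differences between regions also exist.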
“We need to internally enable the business use of data without having to be a programmer or expert user to utilize it,” he said.
Trenkle wants to get more of the data and the online tools that support it pushed down to end users, such as ACO communities, and not just power users, so they can understand how well they're performing. States are also looking for more Medicare data to understand the healthcare gaps and needs in their counties and regions.
“The idea is to provide community users with pools of information and get that information out quicker,” he said.
CMS has conducted secure testing with the Energy Department’s Oak Ridge National Laboratory on different approaches and tools for a virtual data center environment.
“The idea with Oak Ridge was to get the folks who do Big Data with the National Weather Service and DOD to see if some of the work they did in those areas could be looked at in relation to the healthcare arena,” Trenkle said, such as faster and more efficient data management and a knowledge discovery infrastructure.
Now CMS will apply that approach to its business and operational environment at its main Baltimore data center.
A virtual data center will cut CMS’ footprint from 80 data centers down to six to eight data centers, Trenkle said. The agency recently awarded a contract to create a virtual data center environment that will enable more scalable, sustainable, integrated and consolidated uses of data.
CMS has started to establish the foundation for shared and enterprise services and will accelerate that in 2013 to be ready for the data management needs in 2014, including for health insurance exchanges and ACOs. CMS is focusing on shared services for:
• Identity management to conduct identity proofing and issue credentials, used when providers or other users connect with the agency's 175 applications
• Enterprise portal, tied with identity management, to have a single face to the healthcare industry for services, such as registration and reporting
• Master data management to track payments and relationships across multiple payment programs
• Business rules engine that defines and applies policies and calculations to determine who gets what and how much under the various payment programs, such as for beneficiaries dually eligible for Medicare and Medicaid, and for patient-centered medical homes.
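The business rules engine described above can be pictured as a table of policies applied to a payment calculation. The sketch below is a minimal illustration of the pattern; the rule names, predicates, and adjustment values are hypothetical, not actual CMS policy:

```python
# A minimal rules-engine sketch: each rule pairs a condition on a
# beneficiary record with a payment multiplier (values are hypothetical).
RULES = [
    ("dual_eligible_bump", lambda b: b["medicare"] and b["medicaid"], 1.10),
    ("medical_home_bonus", lambda b: b.get("medical_home", False), 1.05),
]

def apply_rules(base_payment, beneficiary):
    """Apply every matching rule's multiplier to the base payment."""
    payment = base_payment
    applied = []
    for name, predicate, multiplier in RULES:
        if predicate(beneficiary):
            payment *= multiplier
            applied.append(name)
    return round(payment, 2), applied

print(apply_rules(100.0, {"medicare": True, "medicaid": True, "medical_home": True}))
# (115.5, ['dual_eligible_bump', 'medical_home_bonus'])
```

Keeping the rules in data rather than hard-coded logic is what lets an agency redefine "who gets what and how much" as payment programs change, without rewriting the systems that execute the calculations.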