Whether the cliché ‘Big Data’ itself lasts another ten years or fades into the same oblivion as catchphrases such as e-business, the impact of big data is almost certain to match, if not surpass, that of e-business over time.
That’s the assertion, at least, put forth by the TechAmerica Foundation, a non-profit, non-partisan institution that gauges the impact of technology and, in turn, aims to educate the public with its findings.
But what is big data, anyway?
TechAmerica’s Big Data Commission set out to sculpt a definition and released it on Wednesday. Here it is:
Big Data is a term that describes large volumes of high velocity, complex, and variable data that require advanced techniques and technologies to enable the capture, storage, distribution, management, and analysis of the information.
Comprehensive, if not catchy. Not that it was easy. Nick Combs, CTO of EMC’s federal arm, is on TechAmerica’s Big Data Commission and as such provides advice to the administration. Combs told Government Health IT during an interview over the summer that “you wouldn’t believe the amount of discussion just on finding a definition of big data.”
TechAmerica explained in its report that “big data is a phenomenon characterized by the exponential expansion of raw data that is inundating government and society. It is already here and it is accelerating.”
Thus, it’s up to government agencies to figure out how to harness big data for their particular needs. TechAmerica recommends five steps agency heads can take to begin using big data effectively.
- Understand the “Art of the Possible.” Explore the case studies contained in this report, posted on the TechAmerica Foundation Website, and otherwise in the public domain to find inspiration and practical examples.
- Identify 2-4 key business or mission requirements that Big Data can address for your agency, and define and develop underpinning use cases that would create value for both the agency and the public.
- Take inventory of your “data assets.” Explore the data available both within the agency enterprise and across the government ecosystem within the context of the business requirements and the use cases.
- Assess your current capabilities and architecture against what is required to support your goals, and select the deployment entry point that best fits your Big Data challenge – volume, variety or velocity.
- Explore which data assets can be made open and available to the public to help spur innovation outside the agency. Consider leveraging programs like the Innovation Corps offered by the National Science Foundation, or the Start-Up America White House initiative.
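The fourth step above asks agencies to match their deployment entry point to the dominant "V" of their data challenge. As a rough illustration only, the triage might be sketched like this; the function name and thresholds are hypothetical assumptions for the example, not anything prescribed by the TechAmerica report:

```python
# Hypothetical triage sketch: score a dataset along the three Vs from the
# definition (volume, velocity, variety) to suggest which Big Data
# challenge dominates. The baselines are illustrative assumptions.

def big_data_entry_point(size_tb, events_per_sec, n_formats):
    """Return the dominant Big Data challenge for a dataset."""
    scores = {
        "volume": size_tb / 100,              # vs. a notional 100 TB baseline
        "velocity": events_per_sec / 10_000,  # vs. 10k events/sec
        "variety": n_formats / 5,             # vs. five distinct formats
    }
    # The largest relative score names the entry point to tackle first.
    return max(scores, key=scores.get)

# A 250 TB archive with modest ingest and few formats is a volume problem.
print(big_data_entry_point(size_tb=250, events_per_sec=100, n_formats=2))
```

In practice an agency's inventory of "data assets" (step three) would supply these numbers; the point is simply that the three Vs give a concrete axis for choosing where to start.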
And for policy makers, TechAmerica suggested five tactics to make sense of big data:
- Expand the talent pool by creating a formal career track for line of business and IT managers and establish a leadership academy to provide Big Data and related training and certification.
- Leverage the data science talent by establishing and expanding “college-to-government service” internship programs focused specifically on analytics and the use of Big Data.
- Establish a broader and more long-lasting coalition between industry, academic centers, and professional societies to articulate and maintain professional and competency standards for the field of Big Data.
- Expand the Office of Science and Technology Policy (OSTP) national research and development strategy for Big Data to encourage further research into new techniques and tools, and explore the application of those tools to important problems across varied research domains.
- Provide further guidance and greater collaboration with industry and stakeholders on applying the privacy and data protection practices already in place to current technology and cultural realities.
“The characteristics of Big Data will shape the way government organizations ingest, analyze, manage, store, and distribute data across the enterprise and across the ecosystem,” TechAmerica wrote in its report.
As for healthcare agencies in particular, there is ample evidence that government officials are paying attention to the big data trend.
“Policy makers are clearly understanding, even more so now than before, that the data can be used in very advanced ways to help drive decisions to reduce costs of care while improving overall health of people,” said Sam Shekar, MD, MPH, chief medical officer in the civil systems division of Health IT at Northrop Grumman Information Systems. “That synergy of HIT and health is really very powerful and we’re going to see more and more of it over the next few years.”