Healthcare stands to reap big rewards from the government's $200 million "big data" project, launched March 29 by the Obama Administration.
Aiming to make the most of the fast-growing volume of digital data, the administration's "Big Data Research and Development Initiative" pledges to "extract knowledge and insights from large and complex collections of digital data" to help address the nation's most pressing challenges.
[See also: Farzad Mostashari: Man on a digital mission]
“In the same way that past federal investments in information-technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use big data for scientific discovery, environmental and biomedical research, education and national security,” said John P. Holdren, assistant to the president and director of the White House Office of Science and Technology Policy.
To launch the initiative, six federal departments and agencies announced more than $200 million in new commitments that, together, promise to improve the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data, officials said.
The initiative aims to:
- Advance the state-of-the-art core technologies needed to collect, store, preserve, manage, analyze and share huge quantities of data.
- Harness these technologies to accelerate the pace of discovery in science and engineering, strengthen national security, and transform teaching and learning.
- Expand the workforce needed to develop and use big data technologies.
Holdren said the initiative responds to the President's Council of Advisors on Science and Technology, which concluded last year that the federal government is under-investing in technologies related to big data. OSTP subsequently launched a senior steering group on big data to coordinate and expand the government's investments in the area.
The first wave of agency commitments to support the initiative includes:
- National Science Foundation and the National Institutes of Health – "Core Techniques and Technologies for Advancing Big Data Science & Engineering." NIH is particularly interested in imaging, molecular, cellular, electrophysiological, chemical, behavioral, epidemiological, clinical and other data sets related to health and disease.
- National Science Foundation – In addition to funding the big data solicitation, and keeping with its focus on basic research, NSF is implementing a comprehensive, long-term strategy that includes new methods to derive knowledge from data; infrastructure to manage, curate and serve data to communities; and new approaches to education and workforce development.
- National Institutes of Health – "1000 Genomes Project Data Available on Cloud." The world’s largest set of data on human genetic variation – produced by the international 1000 Genomes Project – is now freely available on the Amazon Web Services (AWS) cloud. At 200 terabytes – the equivalent of 16 million file cabinets filled with text, or more than 30,000 standard DVDs – the current 1000 Genomes Project data set is a prime example of big data, where data sets become so massive that few researchers have the computing power to make best use of them. AWS is storing the 1000 Genomes Project as a publicly available data set for free, and researchers will pay only for the computing services they use.
- Department of Defense – "Data to Decisions." The Department of Defense is investing approximately $250 million annually (with $60 million available for new research projects) across the military departments to harness massive data in new ways, bringing together sensing, perception and decision support to create truly autonomous systems that can maneuver and make decisions on their own.
- Department of Energy – "Scientific Discovery Through Advanced Computing." The Department of Energy will provide $25 million in funding to establish the Scalable Data Management, Analysis and Visualization (SDAV) Institute.
- US Geological Survey – "Big Data for Earth System Science." USGS is announcing the latest awardees for grants it issues through its John Wesley Powell Center for Analysis and Synthesis. The Center catalyzes innovative thinking in Earth system science by providing scientists a place and time for in-depth analysis, state-of-the-art computing capabilities, and collaborative tools for making sense of huge data sets.
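The scale comparisons in the 1000 Genomes item above can be sanity-checked with a bit of arithmetic. The unit choices here are assumptions, not part of the announcement: decimal units (1 TB = 1,000 GB) and 4.7 GB per single-layer DVD.

```python
# Back-of-the-envelope check of the 1000 Genomes scale comparisons.
# Assumptions: decimal units (1 TB = 1,000 GB), 4.7 GB per single-layer DVD.
DATASET_TB = 200
GB_PER_TB = 1_000
DVD_GB = 4.7  # single-layer DVD capacity (assumed)

dataset_gb = DATASET_TB * GB_PER_TB   # 200,000 GB
dvds = dataset_gb / DVD_GB            # roughly 42,500 DVDs

print(f"{dataset_gb:,.0f} GB is about {dvds:,.0f} single-layer DVDs")

# The file-cabinet comparison: 200 TB spread across 16 million cabinets
# works out to roughly 12.5 MB of plain text per cabinet.
mb_per_cabinet = DATASET_TB * 1_000_000 / 16_000_000
print(f"about {mb_per_cabinet:.1f} MB of text per file cabinet")
```

Both figures are consistent with the article's "more than 30,000 standard DVDs" and "16 million file cabinets filled with text."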