Despite all the attention, the catchphrase "big data" still lacks a clear definition.
The TechAmerica Foundation put together its take last October, and on Monday MeriTalk posted the results of its own research into the matter: it polled 17 "big data big brains" in the federal government and industry on what exactly big data is, what agencies are doing with it today, and what obstacles remain.
“Most of the respondents shared the view of Big Data as the point at which the traditional data management tools and practices no longer apply,” the MeriTalk report explained.
More specifically, big data is “where the data volume, acquisition velocity, or data representation limits the ability to perform effective analysis using traditional relational approaches or requires the use of significant horizontal scaling for efficient processing,” Peter Mell, computer scientist at the National Institute of Standards and Technology (NIST), said in the MeriTalk report.
That is not altogether different from TechAmerica’s definition: "Big Data is a term that describes large volumes of high velocity, complex, and variable data that require advanced techniques and technologies to enable the capture, storage, distribution, management, and analysis of the information."
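The "horizontal scaling" that NIST's definition invokes is the shift from making one machine bigger to spreading work across many machines. A minimal sketch of the idea, in Python, is below; the sharding scheme and three-node setup are illustrative only, not drawn from any specific big-data platform mentioned in the reports:

```python
# Toy illustration of horizontal scaling: rather than one node scanning
# the whole dataset, records are partitioned (sharded) across several
# nodes, each node computes a partial result, and the partials are merged.

def shard(records, num_nodes):
    """Distribute (key, value) records across num_nodes partitions by key hash."""
    partitions = [[] for _ in range(num_nodes)]
    for key, value in records:
        partitions[hash(key) % num_nodes].append((key, value))
    return partitions

def partial_count(partition):
    """Each node independently counts occurrences per key in its partition."""
    counts = {}
    for key, _ in partition:
        counts[key] = counts.get(key, 0) + 1
    return counts

def merge(partials):
    """Combine per-node partial counts into one global result."""
    total = {}
    for counts in partials:
        for key, n in counts.items():
            total[key] = total.get(key, 0) + n
    return total

if __name__ == "__main__":
    records = [("a", 1), ("b", 2), ("a", 3), ("c", 4), ("b", 5), ("a", 6)]
    partials = [partial_count(p) for p in shard(records, num_nodes=3)]
    print(merge(partials))  # matches what a single-node count would produce
```

The point of the sketch is that the merged result is identical to a single-node computation; the data volume each individual node must handle is what shrinks as nodes are added.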
Getting there will require new personnel and skill sets, and surmounting the existing problems of data silos and ownership as well as IT budgets, MeriTalk noted.
“At the heart of Big Data is an enormous technology challenge — dealing with a data tsunami that will overwhelm organizations not able to take advantage of it,” MeriTalk explained.
Indeed, MeriTalk’s research found that agencies estimate they have about half of the necessary storage and access, 46 percent of the computational power, and 44 percent of the personnel in place to leverage big data — which perhaps helps explain why just 60 percent of federal IT professionals said their agency analyzes the data it collects, and only 40 percent use that data for strategic decision-making.
What’s more, respondents predicted that it would take at least three years for government agencies to reap big data rewards.
“Big data may hold big promise,” MeriTalk’s report suggested, “but until IT solutions are in place to address the explosive growth of data, it is nothing but a big problem.”