February 9, 2017 | Big Data, CIO Corner, Federal IT, Public Sector
Like the feds, state and local agencies have gotten better at sharing and using big data, but a new report highlights areas where they still fall short.
Like their federal counterparts, state and local agencies have made great strides in acquiring and using big data – but they still have a long way to go, according to an industry source.
A new report released Jan. 30 by Teradata’s government subsidiary, called “Acing the Big Data Test: How Feds Can Support New Missions With New Insights,” gave federal agencies a mixed report card on big data.
MeriTalk, which conducted an in-person and online survey of 100 federal IT managers to gauge the federal government’s ability to leverage big data and foster data sharing, found that 72 percent of those surveyed reported improving mission outcomes by leveraging big data.
“A lot of federal agencies were absolutely flummoxed by big data, what they should be collecting, why they should be collecting it, what they should be doing with it,” Ford said. He noted that state and local agencies have begun to warehouse, analyze and open public data streams as well – but they still haven’t reached the feds’ level.
He singled out the state of Michigan, praising its data warehousing capability: the state pulls data from multiple agencies onto a single platform and then makes it accessible across those agencies.
Nearly one in five said outright that they were not supporting data collaboration across teams, and elsewhere in the report only 35 percent of feds described themselves as very successful at sharing data across platforms.
He offered a key bit of advice for state and local governments eager to move beyond collecting and warehousing data to analysis: experiment.
Read the full article and the report below.
Source: State and Local Govs Need to Improve Data Sharing, Big Data Use