How Public Organizations Should Treat Big Data

April 22nd, 2014

Though the “only human” argument certainly doesn’t apply to Big Data, enterprises and public organizations often expect too much out of the technology. Some executives are frustrated by results that don’t necessarily correlate with their predetermined business plans, and others consider one-time predictive conclusions to be final. The problem is, there’s no guarantee that analytical results will be “right.”

A government-themed action key

Public authorities interested in integrating Big Data into their cloud servers need to understand two things. First, digital information possesses no political agenda, lacks emotion, and perceives the world in a completely pragmatic manner. And second, data changes as time progresses. For example, just because a county in Maine experienced a particularly rainy spring doesn’t mean that farming soil will remain moist; future weather conditions may drastically alter the environment.

Benefiting from “incorrect” data
If a data analysis program harvests information from one source over the course of an hour and then attempts to draw conclusions, the system’s deductions will be correct only to the extent that it accurately translated ones and zeroes into actionable intelligence. However, because the source continues to produce new, variable data, that newer data may eventually contradict the original deduction.
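To picture how that happens, here is a minimal sketch, assuming made-up soil-moisture readings from a single hypothetical sensor and a simple average as the “analysis.” It is not any particular vendor’s tooling, only an illustration of a conclusion being overturned once later readings arrive:

```python
from statistics import mean

# Hypothetical soil-moisture readings (percent) from a single sensor.
first_hour = [41, 43, 42, 44, 42, 43]       # data the original analysis used
later_readings = [30, 27, 25, 22, 20, 18]   # newer data from the same source

baseline = mean(first_hour)
conclusion = "soil will stay moist" if baseline > 35 else "soil is drying out"
print(f"Initial conclusion ({baseline:.1f}% avg): {conclusion}")

# Re-running the same logic as new data streams in can contradict the deduction.
updated = mean(first_hour + later_readings)
revised = "soil will stay moist" if updated > 35 else "soil is drying out"
print(f"Revised conclusion ({updated:.1f}% avg): {revised}")
```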

Tim Harford, a contributor to the Financial Times, cited Google’s use of predictive analytics tools to chart how many people would be affected by influenza by applying algorithms to more than 50 million search terms. The problem was that, four years into the project, the company’s estimates were undercut by data recently aggregated by the Centers for Disease Control and Prevention, which showed that Google had overstated the spread of flu-like illnesses by roughly a 2:1 ratio.

Taking the good with the bad
Although Harford cited Google’s failure to suggest that Big Data isn’t everything software developers claim it to be, Forbes contributor Adam Ozimek noted that the study highlighted one of the technology’s advantages: the ability to reject conclusions in light of consistently updated information. Furthermore, it’s important to note that Google collected intelligence from only one source, whereas the CDC was amassing data from numerous sources.

So how does this situation pertain to a public servant? Ozimek argued that if a large percentage of findings derived from Big Data are false, officials can learn that sooner rather than later, and important decisions can be made or adjusted accordingly. Whether an answer turns out to be “right” or “wrong” depends on how the conclusions are used.

Where should public servants start?
It’s important that government authorities have the appropriate architecture for gathering digital information. Cloud infrastructure offers the scalability and flexibility required to efficiently run predictive or qualitative analytics programs. PublicCEO noted the importance of acquiring an appropriate infrastructure, especially one that is certified by federal entities.

PublicCEO also recommended that officials collect data from sources that inherently produce results. For example, 9-1-1 computer-aided dispatch systems reveal important patterns that help emergency medical technicians, firefighters, and other first responders anticipate the effects of natural or man-made disasters.
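As a rough illustration of how such patterns surface, the sketch below uses hypothetical dispatch records (not a real CAD export format) and simply counts incidents by hour and type to show where demand clusters:

```python
from collections import Counter

# Hypothetical 9-1-1 dispatch records: (hour of day, incident type).
dispatches = [
    (17, "traffic collision"), (18, "traffic collision"), (18, "structure fire"),
    (2, "medical"), (18, "traffic collision"), (9, "medical"), (17, "medical"),
]

# Count incidents per (hour, type) to surface recurring demand patterns.
pattern = Counter(dispatches)
for (hour, kind), count in pattern.most_common(3):
    print(f"{count} x {kind} around {hour}:00")
```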

Before data is integrated into cloud storage, authorities need to set clearly defined goals, not expectations, for what they’re trying to find. Casting a wide net out into the web and hauling in as much variable information as possible may seem like a good approach, but it often distorts finished reports. Accumulating only relevant information is therefore the best option for those seeking algorithmic conclusions.
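The difference can be pictured with a small sketch, assuming a hypothetical record layout rather than any prescribed schema: the goal (average rainfall per county) is defined first, and only the records that answer it are kept.

```python
# Hypothetical raw records pulled in by a "wide net" collection effort.
raw_records = [
    {"county": "Knox", "rainfall_in": 5.2, "topic": "weather"},
    {"county": "Knox", "rainfall_in": 0.0, "topic": "sports"},
    {"county": "York", "rainfall_in": 4.8, "topic": "weather"},
]

# Goal defined up front: average rainfall per county, nothing else.
relevant = [r for r in raw_records if r["topic"] == "weather"]
by_county = {}
for r in relevant:
    by_county.setdefault(r["county"], []).append(r["rainfall_in"])

for county, values in by_county.items():
    print(county, sum(values) / len(values))
```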

Above all, it’s important that public officials don’t anticipate outcomes, but instead use analytical conclusions to figure out how constituents could benefit from particular programs or services.

The GoGrid Team is committed to bringing you the information, advice, and tools necessary to easily evaluate and deploy a broad range of Big Data technologies and maximize your infrastructure to meet your specific needs.
