Few large organizations are better at collecting data than the federal government, but actually making sense of it all is one challenge it might not be up for. Read Write Web points us to a recent survey of more than 150 IT professionals who work within the U.S. government. They all agree that online data gathering and cheap, reliable storage have vastly increased the amount of information collected in the last few years. Unfortunately, only 40 percent of those IT pros say that their agency is even bothering to analyze the data it has, and even fewer are using it to make strategic decisions on a regular basis. Even more worrying, they also say they are years away from having the tools necessary to properly make use of their massive storage piles, which grow bigger every day. From the press release:
Of those who could accurately estimate their current big data storage capacities, some 57% said it was already too late: The infrastructure is not in place for them to be able to work with what they have, and that includes cloud capacity.
The survey also revealed that nearly one-third of all the collected data is "unstructured and therefore substantially less useful." In other words, there hasn't been a program invented that can actually let them look at what they have. They don't have the software, the bandwidth, the computing power, or the people to make any sense of it. ("57 percent say they have at least one dataset that has grown too big to work with using their current management tools and/or infrastructure.") Yet, we just keep stuffing more and more of it onto hard drives in the vain hope that someday, some data fairy will magically come along and tell us what it all means.
The irony, of course, is that these IT professionals all agree that the number one reason to collect and analyze "big data" is to make your organization more efficient. We guess "doing nothing" can be pretty efficient in its own way.