NSA Prism Surveillance – The Future Of Big Data Storage And Security
In recent weeks the media has been reporting heavily on the Prism surveillance program of the National Security Agency (NSA). For a moment, let's leave all the political discussion aside and take a closer look at the sheer volume of data allegedly being managed by the agency's clandestine data center, because it brings some mind-bending mathematical realities to light in terms of scope and scale.
According to an in-depth report by Wired Magazine, the NSA Prism data center consists of “25,000-square-foot halls filled with servers, complete with raised floor space for cables and storage. In addition, there will be more than 900,000 square feet for technical support and administration. The entire site is self-sustaining, with fuel tanks large enough to power the backup generators for three days in an emergency, water storage with the capability of pumping 1.7 million gallons of liquid per day, as well as a sewage system and massive air-conditioning system to keep all those servers cool. Electricity will come from the center’s own substation built by Rocky Mountain Power to satisfy the 65-megawatt power demand.”
That much space and computing power is being harnessed so that the system can process, store and utilize yottabytes of data. A yottabyte is so large that until recently there was no consensus on what to name the next higher order of magnitude, because no one was in any danger of reaching it. Even writing out the size of a yottabyte by hand is tiring: a one followed by twenty-four consecutive zeros.
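As a quick sanity check of that figure, here is a minimal Python sketch, assuming the decimal (SI) convention in which one yottabyte is 10^24 bytes:

```python
# One yottabyte under the decimal (SI) convention: 10^24 bytes.
yottabyte = 10**24

# Written out, that is a one followed by twenty-four zeros.
digits = str(yottabyte)
print(digits)                 # 1000000000000000000000000
print(digits.count("0"))      # 24
```

A trivial check, but it makes the "one followed by twenty-four zeros" claim concrete.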
Eric Schmidt of Google once estimated that the total of all human knowledge from the start of human existence to 2003 could be stored as 5 exabytes of information. However, the Internet covers much more than pure knowledge, bringing in far-ranging aspects of personal communication, culture, and interaction. According to a report by Cisco, global web traffic is expected to more than quadruple between 2010 and 2015, reaching approximately 965 exabytes of data per year. Keep in mind, a single yottabyte is a million exabytes of data.
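To put those two figures side by side, a back-of-envelope Python calculation (assuming decimal units, with the 965 exabytes/year projection taken from the Cisco figure above):

```python
# How many years of 2015-level global web traffic would it take
# to accumulate a single yottabyte? (Decimal SI units assumed.)
EXABYTES_PER_YOTTABYTE = 10**24 // 10**18   # 1,000,000
ANNUAL_TRAFFIC_EB = 965                      # Cisco's projected 2015 traffic, EB/year

years = EXABYTES_PER_YOTTABYTE / ANNUAL_TRAFFIC_EB
print(f"~{years:.0f} years")                 # roughly a millennium
```

In other words, even at the projected 2015 rate, the entire Internet would need on the order of a thousand years of traffic to total one yottabyte.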
If the NSA does manage to collect a yottabyte of information, it would be equal to about 500 quintillion pages of text. A standard piece of paper is 0.09652 mm thick, meaning if you stacked 500,000,000,000,000,000,000 of them, one on top of the other, the pile would be tall enough to travel to the moon and back more than 50 million times.
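The stack arithmetic above checks out; a short Python sketch makes it verifiable (assuming the average Earth–Moon distance of about 384,400 km):

```python
# Verify the paper-stack claim: 500 quintillion sheets at 0.09652 mm each.
PAGES = 500 * 10**18          # 500 quintillion pages
SHEET_MM = 0.09652            # thickness of one standard sheet, mm
MOON_KM = 384_400             # average Earth-Moon distance, km (assumed)

stack_km = PAGES * SHEET_MM / 1_000_000      # mm -> km
round_trips = stack_km / (2 * MOON_KM)       # moon and back per trip

print(f"Stack height: {stack_km:.3e} km")
print(f"Round trips to the moon: ~{round_trips / 1e6:.0f} million")
```

The result comes out to roughly 60 million round trips, comfortably above the "more than 50 million" stated in the article.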
The era of big data is clearly upon us. We are shifting away from learning and relearning the same information toward a point at which we can store, access and utilize information so quickly and reliably that it becomes part of our present perspective, rather than scattered abstracts we must gather up all over again each time we need them.
On a commercial level, keeping pace with the growth of data and maintaining the infrastructure to manage or utilize it all efficiently will become one of the most pressing issues facing the IT world in the months and years to come. Finding a colocation solution that provides more than Ping, Power and Pipe from a technology partner committed to staying ahead of the curve will be even more essential to the success of your business than it already is today.
NationalNet brings all of the mission critical services, optimization and support functions your business needs to succeed as part of a fully managed hosting solution that scales seamlessly to suit the ‘exabyte needs’ of high volume clients today, and the ‘yottabyte needs’ that are likely to become commonplace in the online marketplace of tomorrow.