The NSA is building a data center to house a 512-qubit quantum computer capable of learning, reproducing the brain’s cognitive functions, and programming itself.
The National Security Agency is building a heavily fortified, top-secret $2 billion complex simply named the “Utah Data Center,” which will soon be home to the hydrogen bomb of cybersecurity: a 512-qubit quantum computer that will revitalize the “total information awareness” program originally envisioned by the Bush administration in 2003.
The news of the data center comes after Department of Defense contractor Lockheed Martin secured a $10 million contract with D-Wave for a 512-qubit quantum computer code-named Vesuvius.
Vesuvius is capable of executing a massive number of computations at once, more than 100,000,000,000,000,000,000,000,000,000,000,000,000, a workload that would take millions of years on a standard desktop.
The computer will be able to crack even the most secure encryption, giving the US government a quantum leap into technologies once only dreamed of, including the rise of the world’s first omniscient, self-teaching artificial intelligence.
The D-Wave quantum computer boasts a wide array of features, including:
- Binary classification – Enables the quantum computer to be fed vast amounts of complex input data, including text, images, and videos, and to label the material
- Quantum Unsupervised Feature Learning (QUFL) – Enables the computer to learn on its own, and to create and optimize its own programs to make itself run more efficiently
- Temporal QUFL – Enables the computer to predict the future based on information it learns through binary classification and the QUFL feature
- Artificial Intelligence via Quantum Neural Network – Enables the computer to reconstruct the human brain’s cognitive processes and teach itself how to make better decisions and better predictions
D-Wave’s 512-qubit chip, code-named Vesuvius. The white square on the right contains the quantum goodness. Photo: D-Wave
D-Wave Defies World of Critics With ‘First Quantum Cloud’
The quantum computer is the holy grail of tech research. The idea is to build a machine that uses the mind-bending properties of very small particles to perform calculations that are well beyond the capabilities of machines here in the world of classical physics. But it’s still not completely clear that a true quantum computer can actually be built.
But Rose keeps fighting. In May, D-Wave published a paper in the influential journal Nature that backed up at least some of its claims. And more importantly, it landed a customer. That same month, mega defense contractor Lockheed Martin bought a D-Wave quantum computer and a support contract for $10 million.
The critics have been so vociferous in large part because Rose isn’t shy about promoting his company. But that’s just the way he is. Rose likens D-Wave’s quantum computers to the Large Hadron Collider, the world’s biggest particle accelerator. “They’re the largest programmable quantum systems that have ever been built by a long shot,” he says. And his latest pitch is that D-Wave is on the verge of unveiling the world’s first quantum cloud. That’s right, quantum-computing-as-a-service.
D-Wave’s computer is designed to solve what are called combinatorial optimization problems. The classic example is figuring out the most efficient route for a traveling salesman going to multiple destinations. There’s no mathematical shortcut that computers can take to solve combinatorial optimization problems. They have to use brute force: Simply check all possible combinations. The trouble is, the number of possibilities explodes exponentially with the problem size. For example, if you have six destinations, there are 64 possible combinations. If you have 20 destinations, there are 1,048,576 possible combinations.
D-Wave’s next-generation computer is designed to handle problems with as many as 512 variables. In theory, that lets you solve problems involving two to the 512th power possible combinations, and a problem of that size is beyond the reach of any classical computer that could ever be built. “It’s bigger than the number of atoms in the universe,” Rose says. “It doesn’t matter how big a supercomputer you make.”
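The arithmetic behind those figures is easy to check. A minimal sketch in Python, assuming the article's brute-force model of 2^n candidate solutions for n binary variables (the ~10^80 atoms-in-the-universe figure is a commonly cited estimate, not from the article):

```python
# Brute-force search space for a problem with n binary variables grows as 2**n.
def search_space(n: int) -> int:
    """Number of candidate solutions a brute-force solver must check."""
    return 2 ** n

print(search_space(6))    # 64 combinations for 6 destinations
print(search_space(20))   # 1,048,576 combinations for 20 destinations

# Rose's comparison: 2**512 dwarfs the roughly 10**80 atoms
# estimated to be in the observable universe.
print(search_space(512) > 10 ** 80)  # True
```

Python's arbitrary-precision integers make evaluating 2**512 exactly trivial, which is precisely why the growth rate, not the arithmetic, is the obstacle.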
He then convinced Lockheed Martin’s management to buy a D-Wave computer and install it in a lab at USC’s Information Sciences Institute. Lockheed Martin and USC split time on the machine, and Lockheed Martin’s access is via a secure network. The machine came online at noon on December 23, and the company now has 50 people working on it.
Via Explaining Quantum Computers:
OK, so quantum computing may sound all very theoretical (and indeed at present a lot of it actually is!). However, practical quantum computing research is now very much under way. Perhaps most notably, back in 2007 a Canadian company called D-Wave announced what it described as “the world’s first commercially viable quantum computer”. This was based on a 16 qubit processor — the Rainer R4.7 — made from the rare metal niobium supercooled into a superconducting state. That same year, D-Wave demonstrated its quantum computer performing several tasks, including playing Sudoku and creating a complex seating plan.
Many people at the time were somewhat sceptical of D-Wave’s claims. However, in December 2009, Google revealed that it had been working with D-Wave to develop quantum computing algorithms for image recognition purposes. Experiments had included using a D-Wave quantum computer to recognise cars in photographs faster than possible using any conventional computer in a Google data centre. Around this time, there was also an announcement from IBM that it was rededicating resources to quantum computing research in the “hope that a five-year push [would] produce tangible and profound improvements”.
In 2011, D-Wave launched a fully-commercial, 128-qubit quantum computer. Called the D-Wave One, this is described by the company as a “high performance computing system designed for industrial problems encountered by fortune 500 companies, government and academia”. The D-Wave One‘s super-cooled 128 qubit processor is housed inside a cryogenics system within a 10 square meter shielded room. Just look at the picture here and you will see the sheer size of the thing relative to a human being. At launch, the D-Wave One cost $10 million. The first D-Wave One was sold to US aerospace, security and military giant Lockheed Martin in May 2011.
D-Wave aside, other research teams are also making startling quantum computing advances. For example, in September 2010, the Centre for Quantum Photonics in Bristol in the United Kingdom reported that it had created a new photonic quantum chip. This is able to operate at normal temperatures and pressures, rather than under the extreme conditions required by the D-Wave One and most other quantum computing hardware. According to the guy in charge — Jeremy O’Brien — his team’s new chip may be used as the basis of a quantum computer capable of outperforming a conventional computer “within five years”.
Another significant quantum computing milestone was reported in January 2011 by a team from Oxford University. Here strong magnetic fields and low temperatures were used to link — or “quantumly entangle” — the electrons and nuclei of a great many phosphorous atoms inside a highly purified silicon crystal. Each entangled electron and nucleus was then able to function as a qubit. Most startlingly, ten billion quantumly entangled qubits were created simultaneously. If a way can be found to link these together, the foundation will have been laid for an incredibly powerful computing machine. In comparison to the 128 qubit D-Wave One, a future computer with even a fraction of a 10 billion qubit capacity could clearly possess a quite literally incomprehensible level of processing power.
Details About The NSA Quantum Computer Spy Center
The National Security Agency’s massive, heavily fortified, top-secret $2 billion data center
Watch: Quantum Computers – The Hydrogen Bomb of Cyber Warfare
Watch: Inside The 128-Qubit Quantum Computer
Here is a recap of the work the NSA is doing, followed by recent breakthroughs in quantum physics and a detailed overview of what quantum computing is.
Cryptogon reports:
Well, it has been the $64,000 question for a couple of decades: Can NSA break something like PGP?
While there might be other black world technologies that could be up to the task (there’s no way to know), what we do know is that a practical quantum computing capability would be, for all intents and purposes, the master key.
I’m pretty confident that NSA has this capability and here’s why: IBM Breakthrough May Make Practical Quantum Computer 15 Years Away Instead of 50. There is no hard constant that one can point to when considering how much more advanced black world technologies are than what we think of as state of the art, but if IBM is 15 years away from building a useful quantum computer, it’s not a stretch to assume NSA has that capability already, or is close to having it.
Bamford lays out a narrative below about the “enormous breakthrough,” but, at the end of the day, it’s conventional computers. There’s no mention of quantum computers, or even the far less “out there” photonic systems.
Is Bamford’s piece a limited hangout?
Maybe, but it makes for interesting reading in any event.
Note: For some reason, Bamford refers to Mark Klein as, “A whistle-blower,” without naming him. Because of Mark Klein, we know, for sure, that the mass intercepts are happening, how NSA is doing it, the equipment involved, etc. So, thanks, Mark Klein. Heroes have names on Cryptogon.
The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)
Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.” It is, in some measure, the realization of the “total information awareness” program created during the first term of the Bush administration—an effort that was killed by Congress in 2003 after it caused an outcry over its potential for invading Americans’ privacy.
But “this is more than just a data center,” says one senior intelligence official who until recently was involved with the program. The mammoth Bluffdale center will have another important and far more secret role that until now has gone unrevealed. It is also critical, he says, for breaking codes. And code-breaking is crucial, because much of the data that the center will handle—financial information, stock transactions, business deals, foreign military and diplomatic secrets, legal documents, confidential personal communications—will be heavily encrypted. According to another top official also involved with the program, the NSA made an enormous breakthrough several years ago in its ability to cryptanalyze, or break, unfathomably complex encryption systems employed by not only governments around the world but also many average computer users in the US. The upshot, according to this official: “Everybody’s a target; everybody with communication is a target.”
In the process—and for the first time since Watergate and the other scandals of the Nixon administration—the NSA has turned its surveillance apparatus on the US and its citizens. It has established listening posts throughout the nation to collect and sift through billions of email messages and phone calls, whether they originate within the country or overseas. It has created a supercomputer of almost unimaginable speed to look for patterns and unscramble codes. Finally, the agency has begun building a place to store all the trillions of words and thoughts and whispers captured in its electronic net. And, of course, it’s all being done in secret. To those on the inside, the old adage that NSA stands for Never Say Anything applies more than ever.
The data stored in Bluffdale will naturally go far beyond the world’s billions of public web pages. The NSA is more interested in the so-called invisible web, also known as the deep web or deepnet—data beyond the reach of the public. This includes password-protected data, US and foreign government communications, and noncommercial file-sharing between trusted peers. “The deep web contains government reports, databases, and other sources of information of high value to DOD and the intelligence community,” according to a 2010 Defense Science Board report. “Alternative tools are needed to find and index data in the deep web … Stealing the classified secrets of a potential adversary is where the [intelligence] community is most comfortable.” With its new Utah Data Center, the NSA will at last have the technical capability to store, and rummage through, all those stolen secrets. The question, of course, is how the agency defines who is, and who is not, “a potential adversary.”
According to Binney—who maintained close contact with agency employees until a few years ago—the taps in the secret rooms dotting the country are actually powered by highly sophisticated software programs that conduct “deep packet inspection,” examining Internet traffic as it passes through the 10-gigabit-per-second cables at the speed of light.
The software, created by a company called Narus that’s now part of Boeing, is controlled remotely from NSA headquarters at Fort Meade in Maryland and searches US sources for target addresses, locations, countries, and phone numbers, as well as watch-listed names, keywords, and phrases in email. Any communication that arouses suspicion, especially those to or from the million or so people on agency watch lists, are automatically copied or recorded and then transmitted to the NSA.
The scope of surveillance expands from there, Binney says. Once a name is entered into the Narus database, all phone calls and other communications to and from that person are automatically routed to the NSA’s recorders. “Anybody you want, route to a recorder,” Binney says. “If your number’s in there? Routed and gets recorded.” He adds, “The Narus device allows you to take it all.” And when Bluffdale is completed, whatever is collected will be routed there for storage and analysis.
According to Binney, one of the deepest secrets of the Stellar Wind program—again, never confirmed until now—was that the NSA gained warrantless access to AT&T’s vast trove of domestic and international billing records, detailed information about who called whom in the US and around the world. As of 2007, AT&T had more than 2.8 trillion records housed in a database at its Florham Park, New Jersey, complex.
Verizon was also part of the program, Binney says, and that greatly expanded the volume of calls subject to the agency’s domestic eavesdropping. “That multiplies the call rate by at least a factor of five,” he says. “So you’re over a billion and a half calls a day.” (Spokespeople for Verizon and AT&T said their companies would not comment on matters of national security.)
After he left the NSA, Binney suggested a system for monitoring people’s communications according to how closely they are connected to an initial target. The further away from the target—say you’re just an acquaintance of a friend of the target—the less the surveillance. But the agency rejected the idea, and, given the massive new storage facility in Utah, Binney suspects that it now simply collects everything. “The whole idea was, how do you manage 20 terabytes of intercept a minute?” he says. “The way we proposed was to distinguish between things you want and things you don’t want.” Instead, he adds, “they’re storing everything they gather.” And the agency is gathering as much as it can.
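Binney’s “20 terabytes of intercept a minute” figure is worth unpacking. A back-of-envelope calculation (units exactly as quoted; this is illustrative arithmetic only, not a claim about actual NSA throughput):

```python
# Scale of Binney's quoted intercept rate.
tb_per_minute = 20
tb_per_day = tb_per_minute * 60 * 24      # 28,800 TB per day
tb_per_year = tb_per_day * 365            # 10,512,000 TB per year
print(tb_per_year // 1_000_000)           # roughly 10 exabytes per year
```

At that rate, “storing everything they gather” means absorbing on the order of ten exabytes a year, which helps explain why a million-square-foot facility was considered necessary.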
Once the communications are intercepted and stored, the data-mining begins. “You can watch everybody all the time with data- mining,” Binney says. Everything a person does becomes charted on a graph, “financial transactions or travel or anything,” he says. Thus, as data like bookstore receipts, bank statements, and commuter toll records flow in, the NSA is able to paint a more and more detailed picture of someone’s life.
The NSA also has the ability to eavesdrop on phone calls directly and in real time. According to Adrienne J. Kinne, who worked both before and after 9/11 as a voice interceptor at the NSA facility in Georgia, in the wake of the World Trade Center attacks “basically all rules were thrown out the window, and they would use any excuse to justify a waiver to spy on Americans.” Even journalists calling home from overseas were included. “A lot of time you could tell they were calling their families,” she says, “incredibly intimate, personal conversations.” Kinne found the act of eavesdropping on innocent fellow citizens personally distressing. “It’s almost like going through and finding somebody’s diary,” she says.
Sitting in a restaurant not far from NSA headquarters, the place where he spent nearly 40 years of his life, Binney held his thumb and forefinger close together. “We are, like, that far from a turnkey totalitarian state,” he says.
Meanwhile, over in Building 5300, the NSA succeeded in building an even faster supercomputer. “They made a big breakthrough,” says another former senior intelligence official, who helped oversee the program. The NSA’s machine was likely similar to the unclassified Jaguar, but it was much faster out of the gate, modified specifically for cryptanalysis and targeted against one or more specific algorithms, like the AES. In other words, they were moving from the research and development phase to actually attacking extremely difficult encryption systems. The code-breaking effort was up and running.
The breakthrough was enormous, says the former official, and soon afterward the agency pulled the shade down tight on the project, even within the intelligence community and Congress. “Only the chairman and vice chairman and the two staff directors of each intelligence committee were told about it,” he says. The reason? “They were thinking that this computing breakthrough was going to give them the ability to crack current public encryption.”
Cryptome further quotes the four-page Wired article.
The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)
By James Bamford
March 15, 2012
[Excerpts of excellent NSA overview to focus on the MRF decryption facility.]
When Barack Obama took office, Binney hoped the new administration might be open to reforming the program to address his constitutional concerns. He and another former senior NSA analyst, J. Kirk Wiebe, tried to bring the idea of an automated warrant-approval system to the attention of the Department of Justice’s inspector general. They were given the brush-off. “They said, oh, OK, we can’t comment,” Binney says.
There is still one technology preventing untrammeled government access to private digital data: strong encryption. Anyone—from terrorists and weapons dealers to corporations, financial institutions, and ordinary email senders—can use it to seal their messages, plans, photos, and documents in hardened data shells. For years, one of the hardest shells has been the Advanced Encryption Standard, one of several algorithms used by much of the world to encrypt data. Available in three different strengths—128 bits, 192 bits, and 256 bits—it’s incorporated in most commercial email programs and web browsers and is considered so strong that the NSA has even approved its use for top-secret US government communications. Most experts say that a so-called brute-force computer attack on the algorithm—trying one combination after another to unlock the encryption—would likely take longer than the age of the universe. For a 128-bit cipher, the number of trial-and-error attempts would be 340 undecillion (10^36).
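The “longer than the age of the universe” claim is simple arithmetic. A hedged sketch, assuming a hypothetical machine testing 10^15 keys per second (the article specifies no rate; this one is chosen to match the petaflop-class machines it describes):

```python
# Brute-forcing AES-128: try every possible key until one decrypts the message.
keyspace = 2 ** 128            # ~3.4e38 possible 128-bit keys
rate = 10 ** 15                # assumed keys tested per second (hypothetical)
seconds_needed = keyspace // rate

AGE_OF_UNIVERSE_S = 435 * 10 ** 15   # ~13.8 billion years, in seconds
print(seconds_needed // AGE_OF_UNIVERSE_S)  # hundreds of thousands of universe-lifetimes
```

Even granting that exhaustive key rate, exhausting the keyspace takes some 700,000 times the age of the universe, which is why the article frames Bluffdale’s role as storing traffic now in the hope of breaking it later.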
Breaking into those complex mathematical shells like the AES is one of the key reasons for the construction going on in Bluffdale. That kind of cryptanalysis requires two major ingredients: super-fast computers to conduct brute-force attacks on encrypted messages and a massive number of those messages for the computers to analyze. The more messages from a given target, the more likely it is for the computers to detect telltale patterns, and Bluffdale will be able to hold a great many messages. “We questioned it one time,” says another source, a senior intelligence manager who was also involved with the planning. “Why were we building this NSA facility? And, boy, they rolled out all the old guys—the crypto guys.” According to the official, these experts told then-director of national intelligence Dennis Blair, “You’ve got to build this thing because we just don’t have the capability of doing the code-breaking.” It was a candid admission. In the long war between the code breakers and the code makers—the tens of thousands of cryptographers in the worldwide computer security industry—the code breakers were admitting defeat.
So the agency had one major ingredient—a massive data storage facility—under way. Meanwhile, across the country in Tennessee, the government was working in utmost secrecy on the other vital element: the most powerful computer the world has ever known.
The plan was launched in 2004 as a modern-day Manhattan Project. Dubbed the High Productivity Computing Systems program, its goal was to advance computer speed a thousandfold, creating a machine that could execute a quadrillion (10^15) operations a second, known as a petaflop—the computer equivalent of breaking the land speed record. And as with the Manhattan Project, the venue chosen for the supercomputing program was the town of Oak Ridge in eastern Tennessee, a rural area where sharp ridges give way to low, scattered hills, and the southwestward-flowing Clinch River bends sharply to the southeast. About 25 miles from Knoxville, it is the “secret city” where uranium-235 was extracted for the first atomic bomb. A sign near the exit read: what you see here, what you do here, what you hear here, when you leave here, let it stay here. Today, not far from where that sign stood, Oak Ridge is home to the Department of Energy’s Oak Ridge National Laboratory, and it’s engaged in a new secret war. But this time, instead of a bomb of almost unimaginable power, the weapon is a computer of almost unimaginable speed.
In 2004, as part of the supercomputing program, the Department of Energy established its Oak Ridge Leadership Computing Facility for multiple agencies to join forces on the project. But in reality there would be two tracks, one unclassified, in which all of the scientific work would be public, and another top-secret, in which the NSA could pursue its own computer covertly. “For our purposes, they had to create a separate facility,” says a former senior NSA computer expert who worked on the project and is still associated with the agency. (He is one of three sources who described the program.) It was an expensive undertaking, but one the NSA was desperate to launch.
Known as the Multiprogram Research Facility, or Building 5300, the $41 million, five-story, 214,000-square-foot structure was built on a plot of land on the lab’s East Campus and completed in 2006. Behind the brick walls and green-tinted windows, 318 scientists, computer engineers, and other staff work in secret on the cryptanalytic applications of high-speed computing and other classified projects. The supercomputer center was named in honor of George R. Cotter, the NSA’s now-retired chief scientist and head of its information technology program. Not that you’d know it. “There’s no sign on the door,” says the ex-NSA computer expert.
At the DOE’s unclassified center at Oak Ridge, work progressed at a furious pace, although it was a one-way street when it came to cooperation with the closemouthed people in Building 5300. Nevertheless, the unclassified team had its Cray XT4 supercomputer upgraded to a warehouse-sized XT5. Named Jaguar for its speed, it clocked in at 1.75 petaflops, officially becoming the world’s fastest computer in 2009.
1 Geostationary satellites
Four satellites positioned around the globe monitor frequencies carrying everything from walkie-talkies and cell phones in Libya to radar systems in North Korea. Onboard software acts as the first filter in the collection process, targeting only key regions, countries, cities, and phone numbers or email.
2 Aerospace Data Facility, Buckley Air Force Base, Colorado
Intelligence collected from the geostationary satellites, as well as signals from other spacecraft and overseas listening posts, is relayed to this facility outside Denver. About 850 NSA employees track the satellites, transmit target information, and download the intelligence haul.
3 NSA Georgia, Fort Gordon, Augusta, Georgia
Focuses on intercepts from Europe, the Middle East, and North Africa. Codenamed Sweet Tea, the facility has been massively expanded and now consists of a 604,000-square-foot operations building for up to 4,000 intercept operators, analysts, and other specialists.
4 NSA Texas, Lackland Air Force Base, San Antonio
Focuses on intercepts from Latin America and, since 9/11, the Middle East and Europe. Some 2,000 workers staff the operation. The NSA recently completed a $100 million renovation on a mega-data center here—a backup storage facility for the Utah Data Center.
5 NSA Hawaii, Oahu
Focuses on intercepts from Asia. Built to house an aircraft assembly plant during World War II, the 250,000-square-foot bunker is nicknamed the Hole. Like the other NSA operations centers, it has since been expanded: Its 2,700 employees now do their work aboveground from a new 234,000-square-foot facility.
6 Domestic listening posts
The NSA has long been free to eavesdrop on international satellite communications. But after 9/11, it installed taps in US telecom “switches,” gaining access to domestic traffic. An ex-NSA official says there are 10 to 20 such installations.
7 Overseas listening posts
According to a knowledgeable intelligence source, the NSA has installed taps on at least a dozen of the major overseas communications links, each capable of eavesdropping on information passing by at a high data rate.
8 Utah Data Center, Bluffdale, Utah
At a million square feet, this $2 billion digital storage facility outside Salt Lake City will be the centerpiece of the NSA’s cloud-based data strategy and essential in its plans for decrypting previously uncrackable documents.
9 Multiprogram Research Facility, Oak Ridge, Tennessee
Some 300 scientists and computer engineers with top security clearance toil away here, building the world’s fastest supercomputers and working on cryptanalytic applications and other secret projects.
10 NSA headquarters, Fort Meade, Maryland
Analysts here will access material stored at Bluffdale to prepare reports and recommendations that are sent to policymakers. To handle the increased data load, the NSA is also building an $896 million supercomputer here.
Russia Today Reports:
NSA Utah ‘Data Center’: Biggest-ever domestic spying lab
Overview of Camp Williams site before the construction works began. UDC will be located on the west side of the highway, on what was previously an airfield (Image from www.publicintelligence.net)
The biggest-ever data complex, to be completed in Utah in 2013, may take American citizens into a completely new reality where their emails, phone calls, online shopping lists and virtually entire lives will be stored and reviewed.
US government agencies are growing less patient with their own country with every month. First, paying with cash, shielding your laptop screen and a whole list of other commonplace habits was proclaimed to be suspicious – and if you see something you are prompted to say something. Then, reports emerged that drones are being procured for police forces. Now, the state of Utah seems to be clearing the way to host the largest-ever cyber shield in the history of American intelligence. Or is it a cyber-pool?
Utah sprang to media attention when the Camp Williams military base near the town of Bluffdale sprouted a vast, 240-acre construction site. American outlets say that what’s hiding under the modest plate of a Utah Data Complex is a prospective intelligence facility ordered by the National Security Agency.
Cyber-security vs. Total awareness
The NSA maintains that the data center, to be completed by September 2013, is a component of the Comprehensive National Cyber-security Initiative. The facility is to provide technical assistance to the Department of Homeland Security, collect intelligence on cyber threats and carry out cyber-security objectives, reported Reuters.
But both ordinary Americans and their intelligence community were quick to dub it “a spy center.”
The Utah Data Center will be built on a 240-acre site near Camp Williams, Utah. Once completed in September 2013, it will be twice as large as the US Capitol. The center will provide 100,000 square feet of computer space, out of a total one million square feet. The project, launched in 2010, is to cost the National Security Agency up to $2 billion
The highly-classified project will be responsible for intercepting, storing and analyzing intelligence data as it zips through both domestic and international networks. The data may come in all forms: private e-mails, cell phone calls, Google searches – even parking lot tickets or shop purchases.
“This is more than just a data center,” an official source close to the project told the online magazine Wired.com. The source says the center will actually focus on deciphering the accumulated data, essentially code-breaking.
This means not only exposing Facebook activities or Wikipedia requests, but compromising “the invisible” Internet, or the “deepnet.” Legal and business deals, financial transactions, password-protected files and inter-governmental communications will all become vulnerable.
Once communication data is stored, a process known as data-mining will begin. Everything a person does – from traveling to buying groceries – is to be displayed on a graph, allowing the NSA to paint a detailed picture of any given individual’s life.
With this in mind, the agency now indeed looks to be “the most covert and potentially most intrusive intelligence agency ever,” as Wired.com puts it.
William Binney, the NSA’s former senior mathematician turned whistleblower, holds his thumb and forefinger close together and tells the online magazine:
“We are that far from a turnkey totalitarian state.”
‘Everybody is a target’
Before the data can be stored it has to be collected. This task is already a matter of the past, as the NSA created a net of secret monitoring rooms in major US telecom facilities – a practice that was exposed by people like William Binney in 2006.
The program allowed the monitoring of millions of American phone calls and emails every day. In 2008, Congress granted near-complete legal immunity to telecom companies cooperating with the government on national security issues.
By this time, the NSA network has long outgrown a single room in the AT&T building in San Francisco, says Binney:
“I think there are ten to twenty of them. This is not just San Francisco; they have them in the middle of the country and also on the East Coast.”
Binney suspects the new center in Utah will simply collect all the data there is to be collected. Virtually no one can escape the new surveillance apparatus created in the US for the War on Terror.
Some data, of course, would be crucial in the anti-terrorism battle: exposing potential adversaries. The question is how the NSA defines who is and who is not a potential adversary.
“Everybody is a target; everybody with communication is a target,” remarks another source close to the Utah project.
Breaking the unbreakable
Now, the last hurdle in the NSA’s path seems to be the Advanced Encryption Standard cipher algorithm, which guards financial transactions, corporate mail, business deals, and diplomatic exchanges globally. It is so effective that the National Security Agency even recommended it for the US government.
Here, the Utah data complex may come in handy for two reasons. First: what cannot be broken today can be stored for tomorrow. Second: a system to break the AES should consist of a super-fast computer coupled with vast storage capabilities to save as many intercepted messages for analysis as possible.
The data storage in Utah, with its 1 million square feet of enclosed space, is virtually bottomless, given that a terabyte can now be stored on a tiny flash drive. Wired.com argues that the US plan to break the AES is the sole reason behind the construction of the Utah Data Center.
The eavesdropping issue has been rocking the US since the Watergate scandal in the 1970s, when domestic spying was eventually outlawed. Nowadays, a lot of questions are still being asked about the secret activities of the US government and whether it could be using the Patriot Act and other national security legislation to justify potentially illegal actions. The NSA’s former employees, who decided to go public, wonder whether the agency – which is to spend up to $2 billion on the heavily fortified facility in Utah – will be able to restrict itself to eavesdropping only on international communications.
Source: Russia Today