The NSA’s Secret Campaign to Crack, Undermine Internet Encryption
by Jeff Larson, ProPublica, Nicole Perlroth, The New York Times, and Scott Shane, The New York Times, Sep. 5, 2013, 3:08 p.m.
Note: This story is not subject to our Creative Commons license.
Editor’s Note: Why We Published the Decryption Story
The National Security Agency is winning its long-running secret war on encryption, using supercomputers, technical trickery, court orders and behind-the-scenes persuasion to undermine the major tools protecting the privacy of everyday communications in the Internet age, according to newly disclosed documents.
The agency has circumvented or cracked much of the encryption, or digital scrambling, that guards global commerce and banking systems, protects sensitive data like trade secrets and medical records, and automatically secures the e-mails, Web searches, Internet chats and phone calls of Americans and others around the world, the documents show.
Many users assume — or have been assured by Internet companies — that their data is safe from prying eyes, including those of the government, and the N.S.A. wants to keep it that way. The agency treats its recent successes in deciphering protected information as among its most closely guarded secrets, restricted to those cleared for a highly classified program code-named Bullrun, according to the documents, provided by Edward J. Snowden, the former N.S.A. contractor.
Beginning in 2000, as encryption tools were gradually blanketing the Web, the N.S.A. invested billions of dollars in a clandestine campaign to preserve its ability to eavesdrop. Having lost a public battle in the 1990s to insert its own “back door” in all encryption, it set out to accomplish the same goal by stealth.
The agency, according to the documents and interviews with industry officials, deployed custom-built, superfast computers to break codes, and began collaborating with technology companies in the United States and abroad to build entry points into their products. The documents do not identify which companies have participated.
The N.S.A. hacked into target computers to snare messages before they were encrypted. And the agency used its influence as the world’s most experienced code maker to covertly introduce weaknesses into the encryption standards followed by hardware and software developers around the world.
“For the past decade, N.S.A. has led an aggressive, multipronged effort to break widely used Internet encryption technologies,” said a 2010 memo describing a briefing about N.S.A. accomplishments for employees of its British counterpart, Government Communications Headquarters, or GCHQ. “Cryptanalytic capabilities are now coming online. Vast amounts of encrypted Internet data which have up till now been discarded are now exploitable.”
Read More: The NSA’s Secret Campaign to Crack, Undermine Internet Encryption – ProPublica.
Spying Blind
The National Security Agency has an intelligence problem: It won’t admit how dumb it is.
AUGUST 16, 2013 BY SHANE HARRIS
The Obama administration’s claim that the NSA is not spying on Americans rests on a fundamental assertion: That the intelligence agency is so good at distinguishing between innocent people and evildoers, and is so tightly overseen by Congress and the courts, that it doesn’t routinely collect the communications of Americans en masse.
We now know that’s not true. And we shouldn’t be surprised. The question is, why won’t the NSA admit it?
On Thursday night, the Washington Post released a classified audit of NSA’s intelligence-gathering systems, showing that they are beset by human error, fooled by moving targets, and reliant on so many different servers and databases that NSA employees can’t keep tabs on all of them.
It had been previously reported that the NSA had unintentionally collected the communications of Americans, in violation of court orders, as it swept up electronic signals in foreign countries. But officials had sought to portray those mistakes as limited, swiftly corrected, and not affecting that many people.
Wrong again.
One of the reasons that the NSA has been able to gather so much power is that the agency has built a reputation over the years for super-smarts and hyper-competence. The NSA’s analysts weren’t just the brainiest guys in the room, the myth went; they were the brightest bulbs in the building. The NSA’s hackers could penetrate any network. Their mathematicians could unravel any equation. Their cryptologists could crack any cipher. That reputation has survived blown assignments and billion-dollar boondoggles. Whether it can outlast these latest revelations is an open question.
The Post found that the NSA “has broken privacy rules or overstepped its legal authorities thousands of times each year since Congress granted the agency broad new powers in 2008…” That’s the year when NSA’s global surveillance system went into hyperdrive. The agency was granted unprecedented authority to monitor communications without individual warrants and to surveil whole categories of people and communications.
Most of the violations affecting Americans’ information were the result of what the agency calls “incidental collection.” So how many Americans were caught up in the NSA’s surveillance nets as they were dragged across supposedly foreign targets? The exact number is unclear. But the short answer is: lots and lots of them.
In one instance, a programming glitch collected a “large number” of calls from Washington, D.C., instead of the intended targets in Egypt, according to the audit. Somehow, the area code 202 (for Washington) was keyed instead of 20 (the country code for Egypt). The NSA’s supposedly discriminating surveillance architecture was undone by a typo.
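As a purely hypothetical sketch (nothing here reflects actual NSA software), the failure mode is easy to reproduce: a naive prefix match on raw dialed digits, with no normalization to a single international format, cannot tell Egypt’s country code from Washington’s area code once an extra digit is keyed.

```python
def matches_selector(number: str, prefix: str) -> bool:
    """Naive tasking check: raw digit-prefix match, no E.164 normalization."""
    return number.startswith(prefix)

cairo = "20225551234"  # +20 (Egypt) 2 (Cairo) 2555-1234, digits only
dc = "2025550123"      # 202 (D.C. area code) 555-0123, national format

# The intended selector, Egypt's country code "20", matches the Cairo number...
assert matches_selector(cairo, "20")
# ...but keying one extra digit yields "202", which also matches the D.C. number.
assert matches_selector(dc, "202")
```

Normalizing every number to a full international form before matching (so the D.C. number becomes “12025550123”) would have made the two prefixes unambiguous; whether the real system lacked that step is not something the audit spells out.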
The audit reveals a recurring problem with human error in the day-to-day operations of global surveillance and shows what a messy and imprecise business it can be. In the first quarter of 2012, 123 incidents of non-compliance with the rules, or 63 percent of those examined, were attributed to human or operator error. These included typographical errors, inaccurate or overbroad search queries, and what the report calls “inaccurate or insufficient research information and/or workload issues.”
Analysts needed more “complete and consistent” information about their targets to avoid errors, the audit found. This suggests that while the NSA’s collection systems are dipping into data streams, the analysts aren’t always equipped to determine who is and isn’t a legitimate target.
The NSA’s systems also have problems knowing when a target is on the move, and possibly has entered the United States. (When he does, different regulations come into play about how the surveillance is authorized and what can be monitored without approval from the court.)
As recently as 2012, NSA was not always able to know when targets using a mobile phone had crossed a U.S. border. These so-called “roamers” accounted for the largest number of technological errors in the violations that were examined.
A problem discovered last year, which appears in the report under the heading “Significant Incidents of Non-Compliance,” helps illustrate how NSA is collecting so much information that it can actually lose track of it and store it in places where it shouldn’t be.
In February 2012, the NSA found 3,032 “files containing call detail records” on a server. A call detail record, or CDR, is analogous to a phone bill. It shows who was called, when, and for how long. This is metadata, like what’s collected today on all phone calls in the United States.
It’s not clear how many CDRs (each representing an individual) were in each of those files. But they were stored on the server for more than five years, past the cutoff point at which the information is supposed to be destroyed, pursuant to NSA rules that are meant to protect the privacy of Americans.
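To make the data concrete, here is a minimal, hypothetical model of a call detail record and the kind of retention check the rules describe. The field names and helper are invented for illustration; only the five-year window comes from the audit.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CallDetailRecord:
    caller: str        # originating number
    callee: str        # number called
    start: datetime    # when the call began
    duration_s: int    # how long it lasted; no content, only metadata

# The audit describes a five-year retention cutoff for this class of data.
RETENTION = timedelta(days=5 * 365)

def purge_expired(records, now):
    """Keep only records still inside the retention window."""
    return [r for r in records if now - r.start <= RETENTION]
```

Under such a rule, the 2012 discovery amounts to records that should have been filtered out by a sweep like this years earlier.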
How the records got there is a mystery. The report says they were “potentially collected” under business records orders, which are authorized by the Patriot Act. But that’s not certain.
What is known, however, is that the records were stored with information that shouldn’t have been anywhere near them. It came from the agency’s highly classified Stellar Wind program, which covered the warrantless interception of phone calls and emails (not just their metadata) that was secretly authorized by President George W. Bush in 2001. Joining the CDRs and the Stellar Wind records was data from yet another program that was unrelated to the two.
Mixing or “co-mingling” information obtained from different programs, and under different laws or authorizations, is a dangerous practice in the intelligence profession. Information is segregated to restrict and monitor the number of people who have access to it. An analyst cleared to look at CDRs might not be authorized to listen to phone calls intercepted under Stellar Wind. But if it’s all on the same server, he might be able to do just that.
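The segregation principle can be sketched in a few lines; the program labels, clearance table, and analyst names below are invented for illustration. The point is that the wall holds only if access is checked per record, against the authority each record was collected under, rather than per server.

```python
# Each record is tagged with the legal authority it was collected under.
records = [
    {"program": "BUSINESS_RECORDS", "data": "call detail record"},
    {"program": "STELLARWIND", "data": "intercepted call content"},
]

# Clearances are granted per program, not per server.
clearances = {"analyst_a": {"BUSINESS_RECORDS"}}

def readable(analyst, records, clearances):
    """Record-level check: filter by program tag, wherever the data sits."""
    allowed = clearances.get(analyst, set())
    return [r for r in records if r["program"] in allowed]
```

With only server-level access control, an analyst cleared for the first record would see both the moment they land on the same machine, which is exactly the risk the audit describes.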
That may have happened in 2011, according to the audit. Some personnel may have been granted access to a cache of information that was recently modified so that they were no longer allowed to look at it. But not all the employees were informed about the change.
Storing different intelligence streams in one place also increases the risk of revealing valuable sources and methods for how it was obtained, a basic violation of intelligence tradecraft. It also makes it easier to steal. (Just ask Edward Snowden.)
And segregation creates a bulwark against privacy violations. Information about Americans is generally kept clear of foreign intelligence because the rules on how the former can be used and disseminated are stricter.
But infractions and mistakes weren’t always reported to the NSA’s overseers, either in Congress or at the Foreign Intelligence Surveillance Court. Partly that’s because the NSA doesn’t view unintentional or “incidental” collection of Americans’ communications as a violation of the rules. It was an accident, the result of what the agency called in a previously declassified document “problems [that] generally involved the implementation of highly sophisticated technology in a complex and ever-changing communications environment…” Translation: Surveillance is hard. Our computers aren’t perfect. We acted in good faith.
Not that the court can verify if that’s true. In a candid admission to the Post, the chief judge, Reggie Walton, said he and his colleagues must “rely upon the accuracy of the information” the government provides, and that the court “does not have the capacity to investigate issues of noncompliance…”
In one case where the court did curtail a new kind of surveillance, it acted only months after learning that the program was in place. The court deemed the still-undisclosed activities unconstitutional, and the NSA had to make changes before it could restart them.
The NSA is also instructing its employees not to provide full information about infractions to Congress, which is supposed to oversee intelligence collection efforts and ensure they comply with the law.
The newly released documents affirm something we’ve long known: the NSA gathers up large amounts of information on foreigners and U.S. citizens and then tries to separate the proverbial wheat from the chaff, with imperfect results. That’s alarming, but from a technological standpoint, understandable.
What members of Congress and the public may find more troubling is that the NSA wasn’t honest about these shortcomings. Officials hid them from the same judges and lawmakers that President Obama recently said were engaged in a rigorous process of checks and balances that keeps electronic spying within the bounds of the law.
Perhaps that system, like the NSA’s data vacuums, could use a tune-up.