Planet Eye

Dedicated to researching the multitude of global issues affecting us all and to communicating the truth at the heart of these matters. Dedicated to empowering people to conduct their own research and form their own opinions based on factual information about critical matters affecting all humans, rather than relying on mainstream media untruths. Proponent of the American Constitution, the Bill of Rights, and keeping our liberties intact.


Friday, January 2, 2015

Building a panopticon: The evolution of the NSA’s XKeyscore

How the NSA went from off-the-shelf to a homegrown "Google for packets."



by Sean Gallagher


Like this prison in Cuba, the NSA has turned the Internet into a place where the watchmen can see all. 


The National Security Agency's (NSA) apparatus for spying on what passes over the Internet, phone lines, and airwaves has long been the stuff of legend, with the public catching only brief glimpses into its Leviathan nature. Thanks to the documents leaked by former NSA contractor Edward Snowden, we now have a much bigger picture.


When that picture is combined with federal contract data and other pieces of the public record—as well as information from other whistleblowers and investigators—it's possible to deduce a great deal about what the NSA has built and what it can do.


We've already looked at the NSA's basic capabilities of collecting, managing, and processing "big data." But the recently released XKeyscore documents provide a much more complete picture of how the NSA feeds its big data monsters and how it gets "situational awareness" of what's happening on the Internet. What follows is an analysis of how XKeyscore works and how the NSA's network surveillance capabilities have evolved over the past decade.
Boot camp


After the attacks of September 11, 2001 and the subsequent passage of the USA PATRIOT Act, the NSA and other organizations within the federal intelligence, defense, and law enforcement communities rushed to up their game in Internet surveillance. The NSA had already developed a "signals intelligence" operation that spanned the globe. But it had not had a mandate for sweeping surveillance operations—let alone permission for it—since the Foreign Intelligence Surveillance Act (FISA) was passed in 1978. (Imagine what Richard Nixon could have done with Facebook monitoring.)


The Global War On Terror, or GWOT as it was known around DC's beltway, opened up the purse strings for everything on the intelligence, surveillance, and reconnaissance (ISR) shopping list. The NSA's budget is hidden within the larger National Intelligence Program (NIP) budget. But some estimates suggest that the NSA's piece of that pie is between 17 and 20 percent—putting its cumulative budget from fiscal year 2006 through 2012, conservatively, at about $58 billion.


Early on, the NSA needed a quick fix. It got that by buying largely off-the-shelf systems for network monitoring, as evidenced by the installation of hardware from Boeing subsidiary Narus at network tap sites such as AT&T's Folsom Street facility in San Francisco. In 2003, the NSA worked with AT&T to install a collection of networking and computing gear—including Narus' Semantic Traffic Analyzer (STA) 6400—to monitor the peering links for AT&T's WorldNet Internet service. Narus' STA software, which evolved into the Intelligent Traffic Analyzer line, was also used by the FBI as a replacement for its Carnivore system during that time frame.
Catching packets like tuna (not dolphin-safe)


Narus' system is broken into two parts. The first is a computing device in-line with the network that watches the metadata in the packets passing by for ones that match "key pairs," which can be a specific IP address or a range of IP addresses, a keyword within a Web browser request, or a pattern identifying a certain type of traffic such as a VPN or Tor connection.


Packets that match those rules are thrown to the second part of Narus' system—a collection of analytic processing systems—over a separate high-speed network backbone by way of messaging middleware similar to the transaction systems used in financial systems and commodity trading floors.
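The two-part design described above can be sketched in miniature: an in-line sensor evaluates each packet against loaded rules and hands matches to a separate analysis tier over a queue standing in for the messaging middleware. Everything below—rule formats, field names—is hypothetical illustration, not Narus code.

```python
# Toy version of the two-part design described above: an in-line sensor
# matches packet metadata against loaded "key pair" rules and forwards hits
# to a separate analysis tier via a message queue (standing in for the
# messaging middleware). Rule formats and field names are hypothetical.
from ipaddress import ip_address, ip_network
from queue import Queue

RULES = [
    {"type": "ip_range", "value": ip_network("203.0.113.0/24")},
    {"type": "keyword", "value": b"example-term"},  # e.g. a Web request keyword
]

analysis_queue = Queue()  # stands in for the high-speed messaging backbone

def matches(packet):
    """True if the packet's metadata or payload matches any loaded rule."""
    for rule in RULES:
        if rule["type"] == "ip_range" and ip_address(packet["dst_ip"]) in rule["value"]:
            return True
        if rule["type"] == "keyword" and rule["value"] in packet["payload"]:
            return True
    return False

def sensor(packets):
    """In-line stage: inspect each packet, hand matches off for analysis."""
    for pkt in packets:
        if matches(pkt):
            analysis_queue.put(pkt)
```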


In the current generation of Narus' system, the processing systems run on commodity Linux servers and re-assemble network sessions as they're captured, mining them for metadata, file attachments, and other application data and then indexing and dumping that information to a searchable database.


There are a couple of trade-offs with Narus' approach. For one thing, the number of rules loaded on the network-sensing machine directly impacts how much traffic it can handle—the more rules, the more compute power burned and memory consumed per packet, and the fewer packets that can be handled simultaneously. When I interviewed Narus' director of product management for cyber analytics, Neil Harrington, last year, he said that "with everything turned on" on a two-way, 10-gigabit Ethernet connection—that is, with all of the pre-configured filters turned on—"out of the possible 20 gigabits, we see about 12. If we turn off tag pairs that we’re not interested in, we can make it more efficient."


In other words, to handle really big volumes of data and not miss anything with a traffic analyzer, you have to widen the scope of what you collect. The processing side can handle the extra data—as long as the bandwidth of the local network fabric isn't exceeded and you've added enough servers and storage. But that means that more information is collected "inadvertently" in the process. It's like catching a few dolphins so you don't miss the tuna.
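As a toy illustration of that trade-off—a linear cost model of our own, not Narus data—inspected throughput can be modeled as falling with each loaded rule. The default overhead below is tuned so that a full rule load yields 12 of 20 Gb/s, matching the figure Harrington cited.

```python
# A toy linear cost model of the trade-off (our assumption, not Narus data):
# each active rule adds per-packet work, so the throughput the sensor can
# actually inspect drops as rules are loaded. The default overhead is tuned
# so that 100 rules yield 12 of 20 Gb/s, matching the figure quoted above.
def inspected_gbps(link_gbps=20, rules=100, per_rule_overhead=1 / 150):
    """Gb/s the sensor keeps up with, under a linear per-rule cost model."""
    return link_gbps / (1 + rules * per_rule_overhead)
```

With no rules loaded, the model keeps up with the full 20 Gb/s; turning off unneeded "tag pairs" buys throughput back, which is exactly the tuning Harrington described.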


Collecting more data brings up another issue: where to put it all and how to transport it. Even when you store just the cream skimmed off the top of the 129.6 terabytes per day that can be collected from a 10-gigabit network tap, you're still faced with at least tens of terabytes of data per tap that need to be written to a database. The laws of physics prevented the NSA from moving all that digested data back over its own private networks to a central data center; getting all the raw packets collected by the taps back home was out of the question.
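The 129.6-terabyte figure is easy to sanity-check arithmetically: it is what roughly 12 gigabits per second—a plausible sustained rate for the two-way 10-gigabit tap discussed above—accumulates over a 24-hour day.

```python
# Sanity check of the 129.6 TB/day figure: 12 Gb/s sustained (a plausible
# rate for a two-way 10GbE tap, per the Narus numbers above) for one day.
gbits_per_sec = 12
seconds_per_day = 86_400
tb_per_day = gbits_per_sec * seconds_per_day / 8 / 1000  # Gb -> GB -> TB (decimal)
assert abs(tb_per_day - 129.6) < 1e-9
```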
NSA, Web startup style


All of these considerations were behind the design of XKeyscore. Based on public data (such as "clearance" job listings and other sources), the NSA used a small internal startup-like organization made up of NSA personnel and contract help from companies such as defense contractor SAIC to build and maintain XKeyscore. The XKeyscore product team used many of the principles of "agile" development and the so-called "devops" approach to running a Web operation—shipping code early and often, having support staff and developers work alongside each other, and reacting quickly to customer demands with new (and sometimes experimental) features.


Built on the same fundamental front-end principles (albeit with some significant custom code thrown in), XKeyscore solved the problem of collecting at wire speed by dumping a lot more to a local storage "cache." And it balanced the conflict between minimizing how much data got sent home to the NSA's data centers and giving analysts flexibility and depth in how they searched data by using the power of Web interfaces like Representational State Transfer (REST).


XKeyscore takes the data brought in by the packet capture systems connected to the NSA's taps (Update: This technology is code-named TURMOIL) and processes it with arrays of Linux machines. The Linux processing nodes can run a collection of "plugin" analysis engines that look for content in captured network sessions; there are specialized plugins for mining packets for phone numbers, e-mail addresses, webmail and chat activity, and the full content of users' Web browser sessions. For selected traffic, XKeyscore can also generate a full replay of a network session between two Internet addresses.
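A hypothetical sketch of that plugin model: each plugin mines a reassembled session for one kind of metadata, and the processing node merges whatever the plugins extract into a single record for indexing. Plugin names, fields, and regexes here are illustrative only, not NSA code.

```python
# Hypothetical sketch of the plugin model: each plugin mines a reassembled
# session for one kind of metadata; the node merges all extractions into one
# record for indexing. Plugin names, fields, and regexes are illustrative.
import re

def email_plugin(session):
    return {"emails": re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", session["payload"])}

def phone_plugin(session):
    return {"phones": re.findall(r"\+?\d{3}[-.\s]?\d{3}[-.\s]?\d{4}", session["payload"])}

PLUGINS = [email_plugin, phone_plugin]

def process_session(session):
    """Run every plugin over the session and merge what they extract."""
    record = {"src": session["src"], "dst": session["dst"]}
    for plugin in PLUGINS:
        record.update(plugin(session))
    return record
```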


But rather than dumping everything back to the mother ship, each XKeyscore site keeps most of the data in local caches. According to the documents leaked by Snowden, those caches can hold approximately 3 days of raw packet data—full "logs" of Internet sessions. There's also a local database at the network tap sites that can keep up to 30 days of locally processed metadata.


Only data related to a specific case file is pulled back over the network to the NSA's central database. The rest of the data is available through federated search—a search request is distributed across all of the XKeyscore tap sites, and any results are returned and aggregated.
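A minimal sketch of that federated model: the same query fans out to each site's local store, each site filters locally, and only the hits travel back to be merged. The record layout is invented for illustration; real sites would be remote endpoints, not in-memory lists.

```python
# Minimal sketch of federated search: one query fans out to every tap site's
# local store, each site filters locally, and only the hits are returned and
# merged centrally. The record layout here is invented for illustration.
def federated_search(sites, predicate):
    """Run the same predicate at each 'site'; aggregate all local hits."""
    results = []
    for site_records in sites:  # in reality, a remote call per tap site
        results.extend(r for r in site_records if predicate(r))
    return results
```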


To ask XKeyscore a question, analysts go to an internal Web front-end on the Joint Worldwide Intelligence Communications System (JWICS), the top-secret/sensitive compartmented information (TS/SCI) network shared by the intelligence community and the Department of Defense. They create a query, which is distributed across XKeyscore's approximately 150 sites. These sites include network taps at telecommunications peering points run by the NSA's Special Source Operations (SSO) division, systems tied to the network intercept sites of friendly foreign intelligence agencies, and the sites operated by "F6"—the joint CIA-NSA Special Collection Service, "black bag" operators who handle things like mid-ocean fiber taps.


The kinds of questions that can be asked of XKeyscore are limited only by the plugins that the NSA deploys and the creativity of the query. Any sort of metadata that can be extracted from a network session—the language used, IP address geolocation, the use of encryption, filenames of enclosures—can be tracked, cross-indexed, and searched. When the flow of data past a tap point is low, much of that information can be queried or monitored in near-real time. The only limiting factors are that the traffic has to pass through one of the NSA's tap points and that most of the data captured is lost after about three days.
How much is in there?


Because XKeyscore, like Narus, handles high traffic volumes best by "going shallow"—applying a small number of rules to determine what traffic gets captured and processed—the probability is fairly high that it collects information unrelated to the people the NSA is really interested in (those for whom the agency has FISA warrants and National Intelligence case files). But there have been steady improvements to the filter hardware that does the collection for XKeyscore.


For the collection points inside the US that collect data that is "one end foreign" (1EF)—that is, between an IP address in the US and one overseas—the SSO deployed a new system in 2012 that it said allows "more than 75 percent of the traffic to pass through the filter," according to information from The Guardian. That means that the large majority of traffic passing through US telecommunications peering points can be screened based on the rule sets used for packet capture. Depending on how wide the aperture of those rules is, that could either mean that the NSA is able to "go deep" on 75 percent of traffic and capture just the information it's looking for (with 25 percent of traffic slipping by untouched), or that 75 percent of traffic is getting dumped to cache to be processed—and is searchable with XKeyscore while it's sitting there.


That's a very important distinction. But since The Guardian has not released the document that quote was from, it's impossible to tell at the moment whether the NSA has improved its ability to respect the privacy of citizens or if it is just indexing even more of their daily Internet lives while hunting for terrorist suspects. 



NSA’s Xkeyscore program targeted visitors to MIT server, Tor Project for enhanced scrutiny | BetaBoston



A new report from noted security researchers — first published in conjunction with German news program Tagesschau — states that the NSA’s Xkeyscore program, which determines who is flagged for enhanced tracking and monitoring, targeted every visitor to a particular MIT server, visitors seeking information on the privacy-focused Tor Project, which is based in Cambridge, and those who simply searched for information on the privacy-enhanced TAILS operating system.


The report was co-authored by Lena Kampf, Jacob Appelbaum, and John Goetz, and included what the researchers claimed were source code and configuration files from the NSA’s secretive tracking program — reportedly leaked by a source other than Edward Snowden.


The code focusing on the MIT server, which hosts an anonymizing email tool, was particularly broad. It was unclear whether it included exceptions screening out U.S. persons and other visitors from the so-called “Five Eyes” nations (Australia, Canada, New Zealand, United Kingdom, United States) that have a joint signals intelligence partnership: some parts of the configuration file targeting other activity seemed to exclude those visitors definitively, but this exclusion code was absent from the MIT server’s parameters.


Instead, the code appears to simply look for all visitors to a certain IP address (128.31.0.34), which hosts not only Mixminion, the anonymous email tool, but also gaming libraries and privacy-focused website materials, according to the researchers.


// START_DEFINITION
appid('anonymizer/mailer/mixminion', 3.0, viewer=$ascii_viewer) =
http_host('mixminion') or
ip('128.31.0.34');
// END_DEFINITION



The segment of the code that screens for individuals interested in Tor is slightly narrower, explicitly excluding Five Eyes-originating visitors:


// START_DEFINITION
/*
The fingerprint identifies sessions visiting the Tor Project website from
non-fvey countries.
*/
fingerprint('anonymizer/tor/torpoject_visit')=http_host('www.torproject.org')
and not(xff_cc('US' OR 'GB' OR 'CA' OR 'AU' OR 'NZ'));
// END_DEFINITION


But the TAILS parameters were broader, flagging Internet users who simply searched for terms related to the privacy-focused operating system that was used by Edward Snowden:


// START_DEFINITION
/*
This fingerprint identifies users searching for the TAILs (The Amnesic
Incognito Live System) software program, viewing documents relating to TAILs,
or viewing websites that detail TAILs.
*/
fingerprint('ct_mo/TAILS')=
fingerprint('documents/comsec/tails_doc') or web_search($TAILS_terms) or
url($TAILS_websites) or html_title($TAILS_websites);
// END_DEFINITION
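The definitions quoted above are all simple boolean predicates over session metadata. As an illustration—with our own field names, not NSA code—the Tor rule's semantics (match the host, but exclude Five Eyes country codes) can be re-expressed as:

```python
# Re-expression of the quoted Tor rule's semantics (our own field names, not
# NSA code): flag a session visiting www.torproject.org unless its
# X-Forwarded-For country code is one of the Five Eyes nations.
FVEY = {"US", "GB", "CA", "AU", "NZ"}

def tor_visit_fingerprint(session):
    """True when the session matches the 'torproject_visit' fingerprint."""
    return (session.get("http_host") == "www.torproject.org"
            and session.get("xff_cc") not in FVEY)
```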


As BetaBoston reported earlier this year, the Tor browser is often used by individuals in abusive relationships to help protect their privacy and physical safety, and was originally developed by the U.S. Navy to overcome censorship and spying by other countries.


The NSA has admitted previously that some of its own analysts have used its surveillance tools to spy on romantic interests or former partners.


How much extra scrutiny these newly disclosed filters trigger was unclear from the report.


Well-known technology author Cory Doctorow wrote that he was given the materials under embargo, and was shocked by what he read:


Tor and Tails have been part of the mainstream discussion of online security, surveillance and privacy for years. It’s nothing short of bizarre to place people under suspicion for searching for these terms.


More importantly, this shows that the NSA uses “targeted surveillance” in a way that beggars common sense. It’s a dead certainty that people who heard the NSA’s reassurances about “targeting” its surveillance on people who were doing something suspicious didn’t understand that the NSA meant people who’d looked up technical details about systems that are routinely discussed on the front page of every newspaper in the world.


One expert suggested that the NSA’s intention here was to separate the sheep from the goats — to split the entire population of the Internet into “people who have the technical know-how to be private” and “people who don’t” and then capture all the communications from the first group.


He said his source indicated that the leaker of the source code was not Snowden, but a second source with access to NSA materials. Bruce Schneier, a security expert with access to the Snowden files and a fellow at Harvard’s Berkman Center, also said he believed there was a second NSA leaker.


The NSA responded to questions from BetaBoston with the following statement:


In carrying out its mission, NSA collects only what it is authorized by law to collect for valid foreign intelligence purposes – regardless of the technical means used by foreign intelligence targets. The communications of people who are not foreign intelligence targets are of no use to the agency.


In January, President Obama issued U.S. Presidential Policy Directive 28, which affirms that all persons – regardless of nationality – have legitimate privacy interests in the handling of their personal information, and that privacy and civil liberties shall be integral considerations in the planning of U.S. signals intelligence activities. The President’s directive also makes clear that the United States does not collect signals intelligence for the purpose of suppressing or burdening criticism or dissent, or for disadvantaging persons based on their ethnicity, race, gender, sexual orientation, or religion.


XKEYSCORE is an analytic tool that is used as a part of NSA’s lawful foreign signals intelligence collection system. Such tools have stringent oversight and compliance mechanisms built in at several levels. The use of XKEYSCORE allows the agency to help defend the nation and protect U.S. and allied troops abroad.


All of NSA’s operations are conducted in strict accordance with the rule of law, including the President’s new directive.


Emails to representatives at the Tor Project and MIT were not returned prior to publication.

by Gregory Ferenstein (@ferenstein)




The National Security Agency thinks we have been misled by The Guardian‘s report of a new tool, XKeyscore, that allows agents to read the content of email and private social media chatter.


“Allegations of widespread, unchecked analyst access to NSA collection data are simply not true,” reads a press release issued by the agency today. “Access to XKEYSCORE, as well as all of NSA’s analytic tools, is limited to only those personnel who require access for their assigned tasks.”


Earlier today, The Guardian released details about the previously top-secret surveillance tool, which reportedly allows authorized analysts to search the name, date, and content of Internet communications (pictured above). The Guardian argues that this power requires no warrant and was given to scores of analysts, such as its informant, Edward Snowden.


“Our tools have stringent oversight and compliance mechanisms built in at several levels,” continues the release. “Not every analyst can perform every function, and no analyst can operate freely. Every search by an NSA analyst is fully auditable, to ensure that they are proper and within the law.”



However, outspoken critic and Senate Intelligence Committee member Ron Wyden implied that the executive branch has been dishonest in its reporting. After the White House declassified the order requiring Verizon to hand over telephone metadata, Wyden issued this statement:


“The newly declassified briefing documents released today show that the executive branch repeatedly made inaccurate statements to Congress about the value and effectiveness of the bulk email records collection program that was carried out under the USA PATRIOT Act until 2011. These statements had the effect of misleading members of Congress about the usefulness of this program.”


So, should we believe the NSA? Only if you trust them.
Posted by Wrkn4thegr8rgood at 6:41 PM
Labels: NSA, Spying
