Sunday, February 9, 2014

Snowden used common web crawler tool to collect NSA files

Source: Russia Today

Whistleblower Edward Snowden used “inexpensive” and “widely available” software to gain access to at least 1.7 million secret files, The New York Times reported, quoting senior intelligence officials investigating the breach.

The collection process was “quite automated,” a senior intelligence official revealed. Snowden used “web crawler” software to “search, index and back up” files. The program just kept running, as Snowden went about his daily routine.

“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said.

Investigators concluded that Snowden’s operation was not highly sophisticated and should have been easily detected by special monitors. A web crawler can be programmed to move from website to website by following the links embedded in each document, copying everything it comes across.

The whistleblower managed to configure the web crawler’s parameters correctly, specifying which subjects to search and how far to follow the links, according to the report. In the end, Snowden was able to access 1.7 million files, including documents on internal NSA networks and internal “wiki” materials used by analysts to share information across the world.
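To illustrate the general technique the officials describe, here is a minimal sketch of a depth-limited crawler in Python. It is not the software Snowden actually used, which the report does not identify; the URL and parameters are purely hypothetical, and it simply shows how a program can follow embedded links to a set depth and back up every page it reaches.

```python
# Illustrative sketch only: a generic depth-limited web crawler.
# All names, URLs and parameters are hypothetical.
import urllib.parse
import urllib.request
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_depth=2):
    """Visit pages starting at start_url, following embedded links
    up to max_depth hops, and return the content of every page reached."""
    seen = set()
    pages = {}
    frontier = [(start_url, 0)]

    while frontier:
        url, depth = frontier.pop()
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load
        pages[url] = body  # "back up" the document

        # Queue every link found in this document for the next hop.
        parser = LinkExtractor()
        parser.feed(body)
        for link in parser.links:
            frontier.append((urllib.parse.urljoin(url, link), depth + 1))

    return pages


if __name__ == "__main__":
    # Hypothetical example: crawl an internal wiki two links deep.
    results = crawl("http://wiki.example.internal/start", max_depth=2)
    print(f"Fetched {len(results)} pages")
```

Once started, such a program keeps running on its own, which matches the officials’ description of a collection process that was “quite automated” rather than someone downloading files one by one.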

Reportedly, Snowden had full access to the NSA’s files as part of his job as a technology contractor in Hawaii, managing computer systems at a remote outpost that focused on China and North Korea.

Officials added that the files were accessible because the Hawaii outpost had not been upgraded with the latest security measures.
