William Binney, a former NSA coder behind some of the surveillance program’s algorithms, is warning that the agency’s interest in mass surveillance is coming at a grave cost in efficiency.
While the agency sees value in taking in any data it can get, "just in case," sorting through a stockpile of unrelated data is soaking up so many resources that the relevant data it does have gets less focus.
Binney’s comments mirror warnings in some of the Snowden documents, which show the NSA is also concerned about its data collection programs far outpacing its ability to process that data.
Indeed, in March some NSA analysts were asking for permission to collect less data with some of the programs, saying that they were collecting a lot of data with “relatively small intelligence value.”
3 thoughts on “NSA Can’t Make Sense of Masses of Culled Data”
Typical, can't see the trees because of the forest: so much data that they are swamped, and as such it is only useful after the fact!
After the Boston Marathon attack there was a release of actual data to the public, what the mother said to whom, etc. etc., but of course it was of NO bloody use after the fact, except for a conviction of a crime; it didn't stop the event taking place!
As for the comment that they were collecting a lot of data with little intelligence value, I would go a little further: they are collecting a HUGE amount of data with NO intelligence value.
Hopefully the servers AND 'THEIR' STUPID HEADS WILL JUST IMPLODE. THIS WILL SAVE AMERIKA BILLION$ AND STOP THE march towards total enslavement in its tracks.
Once you've 'got' all the information and things keep 'happening', people will naturally start to wonder if you let them happen. Not knowing was always a good excuse.