Is too much data a problem for big data?

MGaluzzi

In response to an EFF lawsuit and subsequent requests for specific data, the NSA says that it can’t stop deleting the data sought in the court case because, it claims, its systems are too complex. Clearly the NSA is dealing with far more data than most, if not all, companies, but are the sheer complexity and volume of big data its Achilles’ heel?

Tags: big data, EFF, NSA
Topic: Big Data

Answers

2 total
TravisT
Vote Up (4)

That’s the challenge of big data: taking a massive amount of data and applying analytics to extract useful, actionable information. Keep in mind, I’m not a data scientist, but analysts sometimes refer to the “three Vs of big data”: data volume, data velocity, and data type variety. Volume is obviously one of these and is considered a critical component of data analytics. Of course, the greater the variety and volume, the more challenging the data is to work with.
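To make the “extract useful, actionable information” part concrete, here is a minimal sketch (my own hypothetical event-log example, not anything from the thread): streaming over a large dataset one record at a time handles volume and velocity with only a small amount of running state, while tallying by record type is a crude nod to variety.

```python
from collections import Counter

def top_event_types(events, k=3):
    """Stream over events one at a time (volume/velocity) and
    tally counts by type (variety), keeping only small running state."""
    counts = Counter()
    for event in events:
        counts[event["type"]] += 1
    return counts.most_common(k)

# Hypothetical sample stream; in practice this would come from a log
# file or message queue rather than an in-memory list.
stream = ({"type": t} for t in ["login", "click", "click", "error", "click", "login"])
print(top_event_types(stream, k=2))  # → [('click', 3), ('login', 2)]
```

Because the generator is consumed lazily, the same code works whether the stream holds six events or six billion; only the counter state has to fit in memory.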


With respect to the NSA, I’m not buying it. They have a history of not being honest, even when the director is testifying to Congress. They also have a history of denying that things are possible, only to have it come out later that not only was it possible, they were actually doing it. Intercepting Google’s network traffic comes to mind, for example. I firmly suspect they are destroying the requested data because they don’t want to provide it, not because they lack the technological know-how to retain it. Keep in mind that just a week or two ago, in response to an ACLU records request to a Florida police department about cell phone spying, the US Marshals Service deputized a local police officer as a “special deputy marshal” just before the records were to be produced, then used that as a basis to claim that all the records were property of the federal government and removed them from the jurisdiction.

jimlynch
Vote Up (3)

You may find some of these TED Talks interesting:

Playlist: Making sense of too much data
https://www.ted.com/playlists/56/making_sense_of_too_much_data
