More data isn't always better, says Nate Silver

Top statistician warns that an abundance of data lets statisticians 'cherry pick' data points to get the results they want

By Sharon Machlis, Computerworld

Big data may seem to promise big insights to users, but more isn't always better, cautions statistician Nate Silver, who became one of America's best-known faces of data analysis after his FiveThirtyEight blog correctly predicted the 2012 presidential election results in all 50 states.

The more data there is, "the more people can cherry pick" data points that confirm what they want the data to show, he said.

Abundant data is a notable problem in politics, where many have an interest in the outcome. But it's also an issue in fields ranging from medicine -- where many researchers and journals would rather publish studies showing an interesting result than a null finding -- to earthquake prediction.

It turns out that along with real insight, big data can bring "a lot of spurious correlations" -- apparent relationships between things that are really just random noise, Silver said at the RMS Exceedance conference in Boston today, where RMS announced a new cloud-based RMS(one) risk-management platform.
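To see how easily an abundance of variables produces such spurious correlations, consider a rough Python sketch (an illustration of the general statistical point, not something shown at the conference; the sample size, variable count and threshold are arbitrary choices):

```python
# With enough unrelated variables, some pairs will look strongly correlated
# by pure chance -- the "cherry picking" risk Silver describes.
import numpy as np

rng = np.random.default_rng(0)
n_rows, n_cols = 100, 200                     # 100 observations of 200 independent variables
data = rng.standard_normal((n_rows, n_cols))  # pure noise, no real relationships

corr = np.corrcoef(data, rowvar=False)        # 200 x 200 correlation matrix
pairs = corr[np.triu_indices(n_cols, k=1)]    # one entry per distinct pair of variables

strong = np.abs(pairs) > 0.3                  # "interesting looking" correlations
print(f"{strong.sum()} of {pairs.size} noise-only pairs exceed |r| = 0.3")
```

Every one of those correlations is noise, yet an analyst hunting for a desired result would have no shortage of candidates to pick from.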

In addition to writing the FiveThirtyEight blog, now published by The New York Times, Silver is the author of the book The Signal and the Noise: Why So Many Predictions Fail -- but Some Don't.

In his presentation, Silver offered four tips for more effectively gaining -- and sharing -- insight from data:

1. "Think probabilistically," he urged. "Think in terms of probabilities and not in terms of absolutes."

Don't be afraid of communicating the level of uncertainty that comes with your predictions -- just as most public opinion polls include margins of error -- even if not all of your audience will understand it. Some criticized FiveThirtyEight for stating the confidence level Silver had in his election predictions, but conveying uncertainty is "important and good science," he said.
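For reference, the margin of error reported alongside a typical poll comes from a simple calculation; here is a quick sketch with made-up numbers, purely for illustration:

```python
# Approximate 95% margin of error for a simple random-sample poll --
# the kind of stated uncertainty Silver argues forecasters should embrace.
import math

n = 1000   # respondents (illustrative sample size)
p = 0.52   # share supporting one candidate in the sample (illustrative result)

margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"Poll result: {p:.0%} +/- {margin:.1%}")                  # 52% +/- 3.1%
print(f"Plausible range: {p - margin:.1%} to {p + margin:.1%}")  # 48.9% to 55.1%
```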

Not doing so can have serious consequences, he noted, such as in 1997 when the National Weather Service predicted a 49-foot flood level for the Red River in Grand Forks, ND. Many in the town were reassured by that, since the city's levees were designed to withstand a 51-foot flood.

Unfortunately, what was not communicated to Grand Forks residents was the likely margin of error based on past forecasts: plus or minus 9 feet. In fact, the river crested at 54 feet and much of the community was flooded.
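A rough back-of-the-envelope calculation shows why that omission mattered. If the forecast error is treated as roughly normal and the plus-or-minus 9 feet as covering about 95% of past outcomes (an assumption made here for illustration; the article does not say how the error was distributed), the chance of the river topping the 51-foot levees was far from negligible:

```python
# Illustrative only: the Grand Forks forecast once the stated margin of
# error is treated as a distribution rather than ignored.
from statistics import NormalDist

forecast_crest = 49.0   # NWS point forecast, in feet
levee_height = 51.0     # design height of the levees, in feet
sigma = 9.0 / 2         # assumed: +/- 9 ft covers ~95%, i.e. about 2 standard deviations

crest = NormalDist(mu=forecast_crest, sigma=sigma)
p_overtop = 1 - crest.cdf(levee_height)
print(f"Chance of topping the levees: {p_overtop:.0%}")   # roughly one in three
```

Under that assumption the flood risk was on the order of one in three, a very different message from the apparent two feet of safety margin residents heard in the point forecast.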

