Cloud computing can be expensive if you need tens of thousands of CPU hours. As an alternative, scientists are turning to low-cost resources, such as home computers, to produce the latest research.
A recently released study, which found that global temperatures may rise faster than expected (by 2.5 to 5.4 degrees Fahrenheit by 2050), was developed with the help of thousands of PCs.
As many as 50,000 PCs were used to run thousands of simulations to help researchers examine various changes to a climate model originally written for a high-performance computing (HPC) system but adapted to run on home computers.
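The experiment described above is a perturbed-parameter ensemble: each volunteer PC runs the same model with a slightly different value of an uncertain parameter, and the spread of results measures uncertainty in the projection. The following toy sketch illustrates that idea only; the model, parameter names, and numbers here are hypothetical and are not taken from the study.

```python
# Toy perturbed-parameter ensemble, the kind of experiment
# ClimatePrediction.net distributes. Each "work unit" runs the same
# simple model with a different value of an uncertain sensitivity
# parameter. All numbers here are illustrative.

def toy_warming(sensitivity, forcing=3.7):
    """A hypothetical one-line 'climate model': projected warming
    scales with an uncertain sensitivity parameter."""
    return sensitivity * forcing / 3.7

def run_ensemble(param_values):
    # One simulation per perturbation, as a volunteer's PC would run it.
    return [toy_warming(s) for s in param_values]

# Sweep the uncertain parameter across thousands of perturbed values.
perturbations = [1.5 + i * (3.0 / 4999) for i in range(5000)]
results = run_ensemble(perturbations)
low, high = min(results), max(results)  # the ensemble's uncertainty range
```

The scientific answer is not any single run but the range `low` to `high` across all of them, which is why the experiment parallelizes so naturally across independent home computers.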
Daniel Rowlands, a climate scientist at Oxford University and the study's lead author, said the research effort took about 5,000 CPU years of compute time. A CPU year is an approximate measurement indicating the amount of work a CPU can accomplish in a year. It doesn't account for differences in processing capability, but it does give an idea of the scale of the effort.
Rowlands said it could have cost more than $1 million to do the work on a public, commercial cloud. Instead, the researchers did their work on ClimatePrediction.net, which uses the Berkeley Open Infrastructure for Network Computing (BOINC) framework for distributed computing. It is the same framework used by the Search for Extraterrestrial Intelligence, or SETI@Home, project. The SETI project is the largest, with more than 3 million registered hosts, according to BoincStats.com.
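A quick back-of-the-envelope check shows how the quoted figures relate: converting 5,000 CPU years to CPU hours and dividing the $1 million cloud estimate by that total gives the implied hourly rate. The rate derived here is illustrative arithmetic, not a price quote from any provider.

```python
# Scale check on the figures quoted above.
HOURS_PER_YEAR = 365 * 24              # 8,760 hours in a year
cpu_years = 5_000
cpu_hours = cpu_years * HOURS_PER_YEAR  # 43,800,000 CPU hours

# Implied rate if that work cost $1 million on a commercial cloud.
implied_rate = 1_000_000 / cpu_hours    # a few cents per CPU hour
```

In other words, even at just a couple of cents per CPU hour, 43.8 million CPU hours adds up fast, which is what makes donated idle cycles so attractive.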
ClimatePrediction has more than 500,000 registered hosts, although the number of active hosts is about 33,000. The difference reflects the number of people who have signed up versus the number of machines running calculations at any given time.
Rowlands said ClimatePrediction is the only distributed network for climate change research, although NASA is developing its own distributed computing platform called Climate@Home.
Similar to ClimatePrediction, NASA's system will eventually enable volunteers to run climate simulations on their home computers during idle times. The NASA effort, announced more than a year ago, is in beta testing and not publicly available. The release date has not been announced.
The BOINC framework has received wide adoption among researchers. Other research projects include Rosetta@home, which investigates protein folding; PrimeGrid@home, which conducts mathematical research; and MilkyWay@home, which creates three-dimensional models of our galaxy.
These distributed efforts are called citizen science or volunteer computing. "There are certain types of problems that the public wants to participate in," said Rom Walton, BOINC release manager.
The BOINC software is written in the C and C++ programming languages, and all the components assume the back end is a SQL database server. There are no minimum PC requirements, and Walton said there are computer users running the software on Pentium III processors, which were introduced in 1999.
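The overall pattern such a framework implements can be sketched briefly: a server hands out independent work units, volunteer hosts compute during idle time and report back, and redundant copies of each unit let the server validate answers from untrusted machines. The following is a minimal illustration of that pattern only, written in Python for brevity; it is not BOINC's actual API, which, as noted above, is C/C++ backed by a SQL database.

```python
# Minimal sketch of the volunteer-computing pattern: work distribution,
# redundant computation, and validation of results from untrusted hosts.
from collections import defaultdict
from queue import Queue

REPLICATION = 2  # send each work unit to two hosts for cross-checking

def make_server(work_units):
    """Queue up redundant copies of every work unit."""
    q = Queue()
    for wu in work_units:
        for _ in range(REPLICATION):
            q.put(wu)
    return q, defaultdict(list)  # (pending queue, results keyed by unit)

def client_step(queue, results, compute):
    """One volunteer host fetches a unit, computes it, reports back."""
    wu = queue.get()
    results[wu].append(compute(wu))

def validated(results):
    """Accept a unit only when all its redundant results agree."""
    return {wu: vals[0] for wu, vals in results.items()
            if len(vals) == REPLICATION and len(set(vals)) == 1}

# Drain the queue as if volunteer hosts were picking up units.
queue, results = make_server(range(3))
while not queue.empty():
    client_step(queue, results, compute=lambda wu: wu * wu)
accepted = validated(results)
```

The redundancy step is the key design choice: because volunteer hosts are anonymous and fallible, a result counts only when independently computed copies match.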
Climate applications that run on HPC systems can produce simulations at a much higher resolution than those delivered via a PC. They may be able to model, for instance, the climate changes for a single city. But the research in Rowlands' paper, which was published this week in the journal Nature Geoscience, focused on continent-wide changes across the globe, and it relied on the PCs to run thousands more simulations.
The temperature rise in this research was compared with temperatures from 1961 through 1990. The projected increase is within the range of warming predicted by the Intergovernmental Panel on Climate Change, but the study concludes that warming may be higher than earlier estimates if nothing is done to mitigate greenhouse gases.
"We are completely indebted to our volunteers," Rowlands said of the effort.
This story, "Forecasting a warming world via thousands of PCs" was originally published by Computerworld.