The 'cheap' supercomputer in the Amazon cloud

$4,829 per hour isn't "cheap," but building a 50,000-core supercomputer outright would cost up to $25 million.

A cancer research simulation project run by New York software firm Schrodinger relied on 6,742 Amazon EC2 instances, providing 51,132 compute cores. The global effort spanned Amazon installations in Virginia, Oregon, California, Tokyo, Singapore, São Paulo, and Ireland, and the assembled "computer" used a total of 58.78TB of RAM.
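For scale, the reported totals imply some rough per-instance averages. A back-of-envelope sketch from the figures above (illustrative only; Amazon offers several instance types, so the actual machines varied):

    # Rough averages derived from the reported cluster totals.
    instances = 6742
    cores = 51132
    ram_tb = 58.78

    print(f"cores per instance: {cores / instances:.1f}")   # ~7.6
    print(f"RAM per core: {ram_tb * 1024 / cores:.2f} GB")  # ~1.18 GB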

By winning a contest for free supercomputer time run by Cycle Computing, the Morgridge Institute for Research was able to perform stem cell research. Existing supercomputer owners do sell time on their systems, but getting 50,000 cores working together takes careful planning and isn't done routinely. Until, of course, Amazon EC2 scales up some more.

Buy versus rent

It's much cheaper to just budget $100k and book a few weekends with this thing.

jeblucas on arstechnica.com

It ran for only 3 hours, at a total cost of $14,486, compared with the cost and lead time of building a "$20-25m data center". This has huge implications for the availability of supercomputing for smaller organizations and use cases.

garethsprice on news.ycombinator.com
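These points survive a rough break-even test. Assuming the quoted $4,829/hour rate for the full cluster, and ignoring the staffing, power, and depreciation a physical data center would also incur, a sketch of how much rented time the build cost would buy:

    # How many hours of rented 50,000-core time equal the capital
    # cost of building an equivalent data center? Illustrative only;
    # operating costs are ignored on both sides.
    rate_per_hour = 4829  # quoted cluster rental rate, $/hour

    for build_cost in (20e6, 25e6):
        hours = build_cost / rate_per_hour
        print(f"${build_cost / 1e6:.0f}m buys ~{hours:,.0f} rented hours "
              f"(~{hours / 24:.0f} days of continuous use)")

On those assumptions, even the low-end $20m build estimate buys more than 170 days of continuous 50,000-core rental, which is why bursty workloads favor renting.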

When people say cloud computing is just a gimmick, they really have no idea what they’re talking about. This is what cloud computing is all about.

cowboydroid on theverge.com

Our department, which currently runs a 1,000+ core machine and a 2,500+ core machine for MD simulations, is still waiting to jump on the Amazon/cloud bandwagon, mostly because it's still not cheaper than owning a cluster for a few years.

michaelgrosner on news.ycombinator.com

If the calculations were so parallel, and it took 50k cores 3hrs, wouldn't it take their home cluster of 1.5k cores ~100hrs (a holiday weekend)?

FirsbeeFreek on arstechnica.com
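The arithmetic holds under ideal, embarrassingly parallel scaling, where the total work in core-hours is fixed and runtime falls in proportion to core count. A quick check with the thread's round numbers (real jobs scale worse than this):

    # Ideal strong-scaling estimate: runtime ~ core_hours / cores.
    cloud_cores, cloud_hours = 50_000, 3
    core_hours = cloud_cores * cloud_hours  # 150,000 core-hours of work

    home_cores = 1_500
    print(f"home cluster estimate: {core_hours / home_cores:.0f} hours")  # ~100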

Long row to hoe

Having been privy to the state-of-the-art projects in a lab with a ~5,000-node cluster, it was still disappointing to see how rough the whole-cell modeling approaches were.

jboggan on news.ycombinator.com

This is still too expensive for my field.

Aenean144 on theverge.com

Body talk

What needs to be done is a lot of slow, messy "wet bench" biology. There is no electronic shortcut to understanding how living cells work.

junkDNA on news.ycombinator.com

Individual wet-bench work is good to use as prior data, but it's just a hint at a part of a large, complex system, and overly reductive approaches are going to completely miss the big picture.

epistasis on news.ycombinator.com

Run on a single CPU, the same test would have taken almost 13 years.
