Cold Fusion a year later

By Mark Gibbs, Network World |  Data Center, energy

Just over a year ago I wrote about something that was incredibly exciting: a commercial cold fusion power generation system. Assuming it worked as claimed, such a system would herald huge changes not only in the energy industry, but in pretty much every aspect of the global economy.

In IT, practical cold fusion generators could, in theory, power entire data centers for next to no cost and provide power in locations far from the grid. The existing power grid could itself be rendered obsolete by fusion power generation. As for the impact on transportation ... imagine a car that could be driven from coast to coast several times, for a few cents, without refueling! That's the promise of cold fusion power.

Note that cold fusion is often termed "Low Energy Nuclear Reaction" (LENR) these days, but I'll stick with cold fusion for this article.

Before I update you on where this story has got to, let me first explain the background for those of you who might not be up to speed. Those of you who are "au fait" may care to skip ahead.

The generation of energy by fusion is based on the theory that if you can persuade the nuclei of atoms to "fuse" together such that a new, heavier nucleus is formed, you will generate energy ... a lot of energy. The reason for this output is that some proportion of the mass involved in fusion is converted to energy.
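The payoff comes from Einstein's mass-energy relation, E = mc^2: even a tiny mass defect releases an enormous amount of energy. Here's a rough back-of-the-envelope sketch; the ~0.4% figure is the approximate mass fraction converted in deuterium-tritium fusion (standard physics, not a number from this article):

```python
# Illustrative arithmetic only: energy released when a small amount of
# mass is converted to energy via E = m * c^2.
c = 299_792_458  # speed of light in m/s (exact, by definition)

def fusion_energy_joules(mass_kg):
    """Energy in joules released by converting mass_kg of matter to energy."""
    return mass_kg * c ** 2

# Fusing one gram of fuel, with roughly 0.4% of its mass converted
# (the approximate deuterium-tritium figure), releases on the order
# of 10^11 joules -- about 100 megawatt-hours.
released = fusion_energy_joules(0.001 * 0.004)
print(f"{released:.3e} J, or about {released / 3.6e9:.0f} MWh")
```

For comparison, burning a gram of gasoline yields on the order of 10^4 joules, which is why even partial mass-to-energy conversion dwarfs any chemical fuel.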

Now, there are two ways, at least in theory, to achieve fusion. The most publicized, and the technique with vastly more research dollars attached to it, is "hot" fusion. Hot fusion attempts to emulate the conditions found in stars and so involves temperatures and pressures that are simply mind-boggling.

Hot fusion is the goal of projects such as the National Ignition Facility (NIF), which, along with the likes of the Large Hadron Collider (LHC), is a fine example of "big science." The NIF has cost, so far, in excess of $3.54 billion (the LHC is even more spendy, with a price tag of "$9bn ... as of Jun 2010").

Hot fusion is, if you're a geek, sexy. It involves enormous machines the size of houses, enough power to run a large city, and swarms of lab-coated acolytes to prepare and run the equipment, which, to date, has completely failed to achieve its goal: generating more power than is put into it (called "over unity").

Cold fusion, on the other hand, is, theoretically, fusion that can occur at "normal" temperatures (i.e., room temperature ... although I guess where you set your thermostat makes that a little vague) and "normal" pressures.
