MIT researchers produce miracle LED: puts out twice as much power as it takes in
LED saps vibration, heat from surrounding atoms to put out more light than electricity it takes in
Researchers from MIT have unveiled an LED that behaves more like an April Fools joke than a light bulb: It not only gives off more energy as light than it takes in as electricity (impossibility No. 1), it also gives off cold (impossibility No. 2).
The two impossibilities actually work together to make one another possible, though possibly not much more explicable to those of us who are not experimental physicists (I'm told watching a lot of Big Bang Theory doesn't count).
Most light-emitting diodes (LEDs) we use are tiny – alert, signal or decorative lights on computers, stereos and other electronics, small strings of lights, makeup and (more and more commonly) t-shirts.
Ginned up into full-sized bulbs they're ridiculously expensive but wickedly efficient, making a growth industry out of the research required to make their expense less ridiculous or efficiency more wicked.
LED bulbs use far less electricity per unit of light produced than either incandescent or fluorescent bulbs, but they still produce some heat which, as with most light sources, is considered to be a largely unavoidable waste product.
LEDs are semiconductors built with atomic lattices full of "electron holes" – spots in an atomic structure that would normally hold an electron but, in this case, don't.
When you switch on an LED, its electrons make a beeline for the electron holes, releasing energy as they split from their old spots and head toward the new.
That energy manifests as streams of photons, which we see as light. The color of the light depends on the size of that energy drop – the amount of energy an electron gives up as it falls into a hole.
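To put a number on that relationship: the wavelength (and therefore color) of the emitted photon follows directly from the energy the electron gives up, via the standard photon-energy formula λ = hc/E. A quick sketch (not from the paper, just textbook physics; the example energies are illustrative):

```python
# Convert the energy an electron gives up (in electron-volts)
# into the wavelength of the photon it emits: lambda = h*c / E.
PLANCK_H = 6.626e-34      # Planck's constant, J*s
LIGHT_C = 2.998e8         # speed of light, m/s
EV_TO_J = 1.602e-19       # joules per electron-volt

def wavelength_nm(energy_ev):
    """Photon wavelength in nanometers for a given energy drop in eV."""
    energy_j = energy_ev * EV_TO_J
    return PLANCK_H * LIGHT_C / energy_j * 1e9

# A drop of about 1.9 eV lands in the red part of the spectrum (~650 nm);
# a bigger drop of about 2.7 eV gives blue light (~460 nm).
print(wavelength_nm(1.9))
print(wavelength_nm(2.7))
```

Bigger energy drops mean shorter wavelengths, which is why blue LEDs were so much harder to build than red ones.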
Electrons steal heat to pay for travel to a different electron hole
Normally, if you want something to shine more brightly, you increase the energy pumped into it.
MIT researchers Parthiban Santhanam, Dodd Joseph Gray, Jr., and Rajeev J. Ram discovered that turning down the voltage applied to an LED reduced the amount of light it put out, but the light fell only half as fast as the electrical power being cut.
Cutting the input power by a factor of four, in other words, cut the light by only a factor of two.
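One way to see how the arithmetic can work out that way (a toy model for illustration, not the researchers' actual data): at these tiny biases, assume the current through the diode scales roughly linearly with the voltage. Electrical power is voltage times current, so it falls as the square of the voltage, while the stream of photons – and so the light power – falls only in proportion to the voltage itself.

```python
# Toy model of the low-bias regime described above (an illustration,
# not the paper's measurements): if current I scales linearly with
# voltage V, electrical power in scales as V*I ~ V**2, while photon
# output -- and so light power -- scales only as V.
def electrical_power(v):
    current = v          # assume I is proportional to V (arbitrary units)
    return v * current   # P = V * I, so P ~ V**2

def light_power(v):
    return v             # photons emitted ~ electron flow ~ V

v_high, v_low = 1.0, 0.5
power_drop = electrical_power(v_high) / electrical_power(v_low)
light_drop = light_power(v_high) / light_power(v_low)
print(power_drop, light_drop)  # power falls 4x while light falls only 2x
```

So the further you turn the voltage down, the better the light-out-to-power-in ratio gets – which is exactly the trend the MIT team rode past the break-even point.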
By reducing the power further and further, the MIT crew eventually flipped the LED from the norm – in which it uses more energy than it puts out as light – to the opposite.
Reducing the input power to 30 picowatts left the LED shining with 70 picowatts of light, which really seems as if it should violate the kind of laws that end up in nuclear meltdowns or giant lizards rising from the sea to stomp Tokyo.
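The arithmetic behind those figures is simple: wall-plug efficiency is just light power out divided by electrical power in, and with the numbers in the article it comes to well over 200 percent.

```python
# The efficiency claim, using the numbers reported in the article:
# 30 picowatts of electricity in, 70 picowatts of light out.
power_in_pw = 30.0    # electrical input, picowatts
light_out_pw = 70.0   # optical output, picowatts

efficiency = light_out_pw / power_in_pw
print(f"wall-plug efficiency: {efficiency:.0%}")
```

An incandescent bulb manages a few percent; a good conventional LED, a few tens of percent. This one clears 100 percent with room to spare – hence the double-take.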
Not so, apparently.
As the input power comes down, electrons looking for a new home start drawing on an alternative source of energy. The atoms around them vibrate constantly with thermal energy; the hotter the material is, the more they vibrate, and the colder it is, the less.
In the MIT LED bulb, researchers ratchet down the electrical power so that electrons wanting a new home have to use vibrations of the atoms around them to get enough energy to move.
By taking that energy they slow down the vibration of other atoms.
Slowing the vibration makes the whole thing slightly colder.
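Energy is still conserved through all of this: the books balance because the extra energy comes out of the lattice as heat. Under the article's numbers, the diode must be pulling roughly 40 picowatts of heat out of its surroundings while it runs (a back-of-the-envelope sketch, not a figure from the paper):

```python
# Conservation of energy for the LED described above:
# light out = electricity in + heat absorbed from the lattice.
electrical_in_pw = 30.0   # picowatts of electrical power in
light_out_pw = 70.0       # picowatts of optical power out

heat_absorbed_pw = light_out_pw - electrical_in_pw
print(heat_absorbed_pw)   # the cooling effect, in picowatts
```

That missing 40 picowatts is the "radiating cold" part of the story: it leaves the device as light after being scavenged from the vibration of the LED's own atoms.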
So, by reducing the power far enough, the MIT researchers discovered how to make an LED use the heat and motion of its own mass of atoms to put out more energy as light than the electricity that went into it.
LEDs that produce more light than the electricity they use, and cool the room, too
The result is still far too dim to be usable, so it may be years before the hardware store carries light bulbs that actually add energy and eliminate heat from any application in which they're installed.
Vibrations of the lattice of atoms making up the LED contribute the energy of that motion to the effort of electrons to escape, slowing slightly as a result.
Reducing the thermal vibration within and between atoms makes an object colder, whether it seems like it should or not.
The end result is a single LED that puts out far more light energy than the electricity it takes in, but doesn't violate any laws of physics by doing so. Rather than conjuring the extra energy from the nether regions, it simply borrows it from the heat of its own atoms.
Like I said, it sounds like an April Fools joke: a light bulb that gives off more energy in light than it takes in as electricity, and reduces the amount of heat in its immediate area, essentially radiating cold.
It may turn out to be a joke, but the editors at the American Physical Society's Physical Review Letters liked the math well enough to publish the paper in the Feb. 27 edition.
It's only a matter of time, now, until you can actually power your computer by shining enough lights on it and working it hard enough to produce a lot of heat.
Just don't freeze while you're doing it.