Breakthroughs bring the next two major leaps in computing power into sight

Breakthroughs might make quantum computing, a replacement for silicon, practical within a decade


One of the best things about covering technology is that you're always on the edge of a completely new generation of stuff that will make everything completely different from anything that came before, even before the last generation finished making everything different.

"Completely different" always seems pretty much the same, with a few more complications, higher costs and a couple of cool new capabilities, of course.

Unless you look back a decade or two and see that everything is completely different from the way it was then…

Must be some conceptual myopia that keeps us in happy suspense over the future, nostalgic wonder at the past and bored annoyance with the present.

The next future to get excited about is going to be really cool, though.

You know how long scientists have been working on quantum computers, which would be incomparably more powerful than the ones we have now because they don't have to be built on a "bit" that's either a 1 or a 0? They would use a piece of quantum data called a qubit (or qbit; consistent with everything in the quantum world, the spelling wants to be two things at once) that can exist in several states at the same time. That would turn the most basic function in computing from a toggle switch into a dial with many settings.

Multiply the number of pieces of data in the lowest-level function of the computer and you increase its power exponentially with every qubit you add.
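To make that concrete, here's a minimal sketch, plain Python with NumPy rather than anything from the researchers' paper, of how a qubit's "dial" is usually written down and why the bookkeeping blows up so quickly as qubits are added:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is described by two complex
# amplitudes; their squared magnitudes give the odds of measuring 0 or 1.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition
print(np.abs(qubit) ** 2)   # [0.5 0.5] -- a dial, not a toggle switch

# Writing down n qubits takes 2**n amplitudes, which is why the power of a
# quantum machine (and the difficulty of simulating one classically) grows
# exponentially with each qubit added.
for n in (1, 2, 10, 30, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```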

Making it happen has been the trick; quantum computers have been under development for 20 years and probably won't show up for another 10.

Teams of Austrian scientists may cut that time down a bit with a system they developed that, they say, can create digital models of quantum-computing systems, making it quicker and easier to test and develop both the theory and the manufacturing.

They did it the same way Lord of the Rings brought Gollum to life: putting a living example in front of cameras and taking detailed pictures they could use to recreate the image in any other digital environment.

Rather than an actor, the photo subject was a calcium atom, drastically cooled to slow its motion, then manipulated with lasers and put through a set of paces predicted by quantum-mechanical theory while the researchers recorded the results.

Abstracting those results lets the computer model predict the behavior of almost any other quantum particle or environment. That makes it possible to use the quantum equivalent of a CAD/CAM system to develop and test new approaches to the systems that will actually become quantum computers, according to a paper published in the journal Science by researchers from the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI).
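For a sense of what a "digital model" of a quantum system looks like in practice, here is a toy sketch, my own illustration rather than the Innsbruck/IQOQI method, that classically simulates a laser-driven two-level atom and predicts how its excitation probability oscillates over time:

```python
import numpy as np
from scipy.linalg import expm

# Toy digital model of a laser-driven two-level atom (ground and excited
# states): the drive couples the two levels, and the model predicts how the
# excitation probability oscillates in time (Rabi oscillations).
omega = 2 * np.pi * 1.0                     # drive strength, arbitrary units
H = 0.5 * omega * np.array([[0, 1],
                            [1, 0]], dtype=complex)   # coupling Hamiltonian

dt = 0.01
U = expm(-1j * H * dt)                      # one small time step of evolution

state = np.array([1, 0], dtype=complex)     # start in the ground state
for step in range(101):
    if step % 25 == 0:
        p_excited = abs(state[1]) ** 2
        print(f"t = {step * dt:.2f}  P(excited) = {p_excited:.3f}")
    state = U @ state
```

A real quantum-simulation toolchain has to cope with many interacting particles, noise and hardware imperfections; the point of the sketch is only to show how a classical model can predict quantum behavior that can then be checked against measurements like the ones made on the calcium atom.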
