Paint-on processors and nanotechnology

At LinuxWorld, we pride ourselves on providing articles and opinions related in practical ways to the Linux computing you do every day. But we believe open source software development goes wherever the imaginations of its developers take it. Our new column, Future Computing, stretches our tradition of timely, immediately applicable writing in a new direction. Every month, we'll report here on a news item or research initiative that you probably won't be able to apply for years yet. So join Cameron Laird, a longtime writer for LinuxWorld and SunWorld, as he gazes into the amazing world of tomorrow.

In tandem with this new column, we're launching a new LinuxWorld community discussion. The discussion will be a place where you can toss around your own futuristic notions, learn more about the discoveries presented here, and introduce the community to individuals and creations we may not already know about. You'll find a link to the discussion in the Resources section at the end of the article. --Eds.

Really big clusters

Peer-to-peer distributed computing and clusters are two recurring hot topics in the Linux world. What'll it be like, though, when those technologies truly take root, and we each have not two or ten external processors working for us, but a thousand, or a million?

That's the sort of question Harold Abelson, Gerald Sussman, and eight more junior coauthors, listed alphabetically, address in their 1999 MIT memorandum, "Amorphous Computing" (reprinted in the Communications of the ACM earlier this year). Among their conclusions: we'll need new programming models to exploit processors that are individually unreliable and communicate over unreliable channels. It'll be worth it, though, because the marginal cost of each additional processor will be under a penny, and the right kind of design and engineering will give us unprecedented computational power.

Amorphous computing is sometimes called swarm computing to emphasize that a collective result emerges from individual microlevel behaviors, with the surprising coherence of a relocating bee or ant colony. Amorphous computing builds on research into distributed computing models like Jini. It presumably will be fueled by nanotechnology research, and will ultimately provide the intelligence that controls nanotechnology products. And it might well be built with calculating biological molecules like those proposed by "Amorphous Computing" coauthor Tom Knight.

Computing molecules and models taken from biology

What will amorphous computing look like? Advanced fabrication techniques will synthesize processing elements so cheaply that they might be delivered in a paint or wrap. We'll "install" a thousand low-power processors at a time. We're unlikely to program them with the traditional, deterministic, barely-above-assembler languages we now use. Instead, we'll set up the kinds of systems that seem to work well for beehives or schools of fish: individual processors will operate with a few simple rules such as "follow your neighbors, mostly" and "jiggle around occasionally and see if you bump into a better solution." It won't be fatal if a few processors don't work to specification, or if noise in the environment degrades interprocess communications. The collective will still be able to achieve reliable answers from its unreliable parts.
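Those two rules of thumb can be made concrete in a toy simulation. The sketch below is my own illustration, not code from the memorandum: a grid of processors, some of them dead, runs a simple local averaging rule over lossy links. Each live cell nudges its value toward the mean of whichever neighbors it happens to hear from, plus a small random jiggle. The survivors still settle toward a shared answer.

```python
import random

# Toy sketch of amorphous computing (an illustration, not the authors' code):
# unreliable processors on a grid reach agreement using only local rules.

random.seed(0)
SIZE = 20            # 20 x 20 grid of processors
DEAD_RATE = 0.10     # fraction of processors that never respond
DROP_RATE = 0.20     # fraction of messages lost in transit

alive = [[random.random() > DEAD_RATE for _ in range(SIZE)] for _ in range(SIZE)]
value = [[random.uniform(0.0, 100.0) for _ in range(SIZE)] for _ in range(SIZE)]

def neighbors(r, c):
    """Yield the four grid neighbors, wrapping at the edges."""
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        yield (r + dr) % SIZE, (c + dc) % SIZE

def spread_of(grid):
    """Disagreement among live processors: max value minus min value."""
    vals = [grid[r][c] for r in range(SIZE) for c in range(SIZE) if alive[r][c]]
    return max(vals) - min(vals)

init_spread = spread_of(value)

for step in range(200):
    nxt = [row[:] for row in value]
    for r in range(SIZE):
        for c in range(SIZE):
            if not alive[r][c]:
                continue
            heard = [value[nr][nc] for nr, nc in neighbors(r, c)
                     if alive[nr][nc] and random.random() > DROP_RATE]
            if heard:
                # "follow your neighbors, mostly" ...
                nxt[r][c] = 0.5 * value[r][c] + 0.5 * sum(heard) / len(heard)
                # ... and "jiggle around occasionally"
                nxt[r][c] += random.uniform(-0.1, 0.1)
    value = nxt

spread = spread_of(value)
print(f"disagreement: {init_spread:.1f} initially, {spread:.2f} after 200 steps")
```

No processor knows the global state, no master coordinates the rounds, and a tenth of the machines never answer at all, yet the disagreement across the grid shrinks dramatically. That robustness-from-redundancy is the property the memorandum's authors want new programming models to deliver.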

Research into animal physiology also suggests that this approach can work. Insects seem to coordinate their six legs and four wings not with a sophisticated master algorithm, but rather with simple, largely autonomous local programs that move each part. The teamwork that leads to efficient movement emerges from the interaction of simple elements.

Abelson and Sussman are famous for their classic text, Structure and Interpretation of Computer Programs, as well as their related work on the Scheme language, program verification, and other computational methods and books. Does it make sense for people who've invested so much in theories of formal correctness now to focus on inherently indeterminate calculations? Several columns planned for the coming months will show that, in fact, there are deep connections between these areas.

All the elements necessary for amorphous computing seem within grasp. It's exciting to imagine what it'll be like to paint more efficient fuel injector controls onto an automobile engine, or put enough processor power into an ear-bud so that a radio receiver can learn on its own to scan for music likely to please its owner.

To learn more

Amorphous computing has so seduced Boston-area engineer Will Ware that he's prepared a masterful collection of Webpages on the subject. It's an ideal place to start your own research into the topic. (And if you find something interesting, come back and share it with us in the discussion forum!) Among the highlights: several working models of amorphous computing that Ware makes available for free download.

