Unintentionally, AMD, Intel Build Great Cloud Chips

Adding security, graphics and wireless functions instead of more cores meets a real need for virtualized apps.

For the last few years, faced with physical limits on how much thinner they could make the wires inside their increasingly complex processors, AMD, Intel and smaller manufacturers stopped building single-core processors that grew more complex with every generation. Instead they started, essentially, cramming more than one processor core onto each chip.

Multicore chips don't scale linearly -- there's some loss of efficiency from the overhead of keeping data flowing among two, four or eight cores -- but the difference is relatively small. Multicore chips are far more powerful than single-core chips and almost certainly much cheaper than single-core designs of comparable performance.
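The sublinear scaling described above is commonly modeled with Amdahl's law, which the article does not name; the sketch below is an illustration of that model, and the 10 percent serial fraction is an assumed figure, not a measurement from either chipmaker:

```python
def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    """Amdahl's law: overall speedup when only the non-serial
    portion of a workload can be spread across `cores` cores."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Assume 10% of the work is serial coordination overhead (hypothetical figure).
for cores in (1, 2, 4, 8):
    print(f"{cores} cores: {amdahl_speedup(cores, 0.10):.2f}x speedup")
```

Even with this modest serial fraction, eight cores deliver well under an 8x speedup, which is the "loss of efficiency from overhead" the paragraph refers to.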

Now the CTO of No. 2 chipmaker AMD -- who spent 16 years at Intel, so he knows both sides of this story -- expects the two companies will stop competing to cram more cores onto the same chip and will instead build in the ability to handle graphics, cryptography, networking and other functions that currently live elsewhere on the motherboard.

AMD and Intel are both already building GPU capabilities into their next-gen chips, as well as the intelligence to shift processing resources between graphics and data-processing work as the workload demands.

AMD's "Zacate" Fusion Accelerated Processing Unit (APU) is designed as a low-power chip for netbooks and other small devices, but it includes enough graphics power for full HD video without the discrete GPUs and other components required on existing machines.

Intel's equivalent, code-named Sandy Bridge, shares cache and memory between the processing cores and GPU circuitry and, according to Intel, uses less energy than earlier chips as well.

Intel recently bought security vendor McAfee and specialty chipmaker Infineon's wireless division to help it add security and wireless-access functions to the processors themselves, potentially making handhelds, tablets and other small-form-factor devices much more powerful and secure.

The main intent of both companies was to capture more of a PC's total cost in the processor itself rather than ceding it to GPU or security vendors, just as Intel did when it first began building wireless LAN capabilities into its chipsets for laptops.

The ultimate impact on corporate IT, however, may be to create relatively inexpensive systems that are well suited not only to playing movies, but to acting as front ends for SaaS, cloud-computing and virtual-desktop applications.

Except for configurations that create locked, encrypted VMs on the laptop itself, all three of those fast-growing models run most of their code on back-end servers and use the PC, iPad or smartphone essentially as a screen on which to play a movie of the application, plus a set of buttons with which to control it.

Neither AMD nor Intel intended it when they began R&D on this particular generation of chips -- which will appear in PCs and laptops during the first quarter of next year -- but the combined functions will make all kinds of devices much better clients for IT architectures that take more and more of the work off the endpoint and put it back on the server.
