Intel's Sandy Bridge platform is probably still going to force a major change in the microprocessor market, but the disruption may be put off a bit following Intel's revelation of a major flaw in the supporting chipset that could cause five percent of computers running it to fail within three years.
The big deal about Sandy Bridge was that it incorporated GPU functions that would go a long way toward eliminating the need for add-on graphics cards, mostly from Nvidia, at least for mid-range machines.
Despite the newness of that function, and the likelihood that in a product combining new and old technology it's the new tech that will fail, the flaw lies in neither the PCI Express lanes nor the Core processor.
The flaw is a weakness in the 3Gbit/sec SATA ports on the companion "Cougar Point" Intel 6-Series Platform Controller Hub (PCH), the chip that handles connections between the main processor and the rest of the devices on the motherboard.
It showed up during testing not at Intel, but at OEMs that were stress-testing the new chipset with heat and high voltage before building it into their own products.
Under that kind of stress, and possibly under a heavy flow of data across the SATA connections, the ports can degrade and fail at the hardware level, meaning all the motherboards built with the P67 or H67 versions of the PCH have to be replaced.
Intel promises to have replacements on the way out the door in late February and be up to full-volume shipping by April.
It has shipped just short of 8 million Sandy Bridge chipsets, and will have to recall more than 100,000 of them, at a cost of about $700 million.
That's bad financial news, but not horrible for a company the size of Intel.
The problem doesn't affect the Core processor at the heart of the platform: the Sandy Bridge processors that are shipping, or soon will, as the Core i3, Core i5 and Core i7 chips.
The chipset is, however, a core part of Intel's announced intention to expand well beyond its traditional role as a CPU vendor, adding not only graphics but security and a range of low-power and mobility capabilities to its chipsets. The goal, presumably, is the kind of market expansion and competitive advantage Intel got when it added WiFi to its Centrino platform.
In the process it could cut out competitors like Nvidia -- which is adding CPUs to its GPUs to take on Intel more head-to-head -- and ARM, which already owns the tablet and smartphone market that Intel is trying to catch.
No one doubts Intel can become a bigger force in those markets, but it's not a sure thing, especially given the Sandy Bridge collapse as its latest example of how to introduce a major new product.
In general, even if you're not in precision manufacturing, you don't want your big customers to find a big flaw before you do, especially after you've shipped 8 million units. It's just not good customer relations.
Worse, it hurts the out-the-door credibility of what is a major new product, not just a routine next-generation upgrade.
And it casts a shadow on Intel's reliability at a time when its major software partner is expanding onto the chip architecture of a direct competitor for the first time, while pressing for far more powerful versions of Intel's Atom chip to compete against ARM, and Intel itself is working on an operating system designed for tablets and smartphones.
It raises the question of whether Intel is rushing off in too many directions at once and losing control of some of the fundamentals without which even a sophisticated, innovative precision manufacturer can find itself in big, big trouble with both its OEM customers and end users.
Things like checking to make sure the connection leaving Piece A actually reaches Piece B, and isn't likely to break the first time a customer uses it a little intensely in exactly the way it's supposed to be used.
Despite all the market niches and form factors and buzzwords, the upshot of all the tablets and smartphones and virtual servers and cloud computing and SaaS and Web 2.0 developments obsessing the industry right now is that end users don't want to have to worry too much about the actual hardware.
All those things work only if the client hardware is reliable enough that IT architects can effectively forget about it.
That's the part of the deal Intel just goofed up.