Notebook graphics have been in a state of flux over the last few years. For a while, the GPU was part of the motherboard chipset that included the front side bus and memory controller. That chipset went away as all three components moved onto the CPU die, but discrete GPUs for notebooks also emerged. These were roughly the same chips used in desktop add-in boards, offered for laptop users who wanted the horsepower for gaming.
Fast forward a few years and discrete GPUs seem to be disappearing. ExtremeTech has a fine piece on the case of the rapidly disappearing GPU in notebooks. Fewer and fewer mainstream OEMs are bothering anymore.
Some of what happened is obvious. Low-end laptops simply went with Intel and AMD offerings, since their CPUs have on-die GPUs with acceptable gaming performance. ET pointed out that laptops with discrete GPUs usually also came with a more powerful CPU and a high-resolution display, so there was no real mid-range; it was all high-end.
What's happening, I think, is that gamers tried gaming laptops and didn't like them. They were very expensive compared to desktops with the same specs, and they certainly weren't fun to take on the road or use in your local Starbucks. They tended to be very heavy and ran hot.
Also, gaming changed. In the 1990s, I used to go to LAN parties where 30 people brought their own PCs to the office of a tech consultancy (which would later go on to launch GameSpy). Lugging a tower PC and a 19" monitor out of my apartment, down to Orange County, and up a flight of stairs to a second-floor office was not fun.
But the time of players gathering to game together has passed. Most everyone has broadband and uses headsets and chat software like TeamSpeak, so we don't need to assemble in one room to play League of Legends, Team Fortress or Dota 2.
If people have the money and interest in a gaming PC, they will take the plunge on a high-end machine. That's why Dell's Alienware unit, Origin and Falcon Northwest continue to operate on the Ferrari model: they don't sell a lot, but they sell high-priced, high-margin products.
One step below the top-of-the-line vendors are the home brewers, people like me who buy their components from NewEgg, Fry's Electronics or Micro Center (or all three) and build their own. Those people tend to go as high-end as they can.
The average age of a PC gamer is 35. People at that age have the income for consoles and/or high-end PCs, and they won't bother with a laptop. The trend in laptops is light and power-efficient, two things you don't get with a discrete GPU and a powerful CPU. The trend in tower desktops is performance.
Budget gamers will be happy with what a CPU with a built-in GPU offers. Or they will stick to the iPad for games. But for serious gaming, the trend is swinging back from consoles to the PC.
The one thing I could not find was benchmarks showing the difference. There are plenty of benchmarks showing that discrete graphics beat integrated, which is no surprise. But no one seems to have done a power comparison. Clearly it's an issue. Why else would Nvidia introduce Optimus and AMD offer Dynamic Switchable Graphics, which let you switch between discrete and integrated GPUs?
In the end, it looks like discrete GPUs in laptops, while not a bad idea, never caught on because they weren't needed. Gamers who would need a discrete GPU for the games they like used desktops, not laptops.