August 18, 2012, 7:30 AM — Modern graphics cards are intimidating, hulking beasts in a world of increasingly tiny PC components. Most of them are double-wide, occupying two expansion-slot spaces, even though they use only a single physical slot. Many require two power connectors and beefier-than-average power supplies. Their primary audience appears to be serious PC gamers, who use an arcane jargon of their own: frame rates, VSync, antialiasing.
Graphics processing units, or GPUs, are at the heart of these cards, and their sheer physical size and transistor count--some models have in excess of 4 billion transistors--help explain why they consume so much power and require sophisticated cooling systems. The transistor count also suggests why these new graphics chips aren't just graphics accelerators: They improve performance across a broad range of applications.
What's in a Card
Graphics cards are impressive pieces of hardware. Let's take a look at what one of the latest cards, an Nvidia GeForce GTX 680, looks like when it's stripped bare.
The latest-generation midrange ($200 to $350) and high-end ($400 and up) graphics cards usually ship with at least 2GB, and sometimes 3GB or more, of very fast GDDR5 memory, which runs at a 1500MHz clock but achieves effective data rates of up to 6000MHz, since the memory can move four data items per clock cycle.
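The arithmetic behind that effective speed is simple quad-data-rate multiplication. As a sketch, here is the same math carried one step further to peak memory bandwidth; the 256-bit bus width is an assumption (it matches a high-end card like the GTX 680, but other cards differ):

```python
# Back-of-the-envelope GDDR5 bandwidth, using the figures from the text.
# The 256-bit bus width is an assumption, not something stated above.

base_clock_mhz = 1500        # GDDR5 command clock
transfers_per_clock = 4      # quad data rate: four data items per cycle
bus_width_bits = 256         # assumed bus width

effective_rate_mhz = base_clock_mhz * transfers_per_clock   # 6000 "MHz"
bandwidth_gbps = effective_rate_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(f"Effective data rate: {effective_rate_mhz} MHz")
print(f"Peak memory bandwidth: {bandwidth_gbps:.0f} GB/s")  # ~192 GB/s
```

Under those assumptions the peak works out to roughly 192 GB/s, which is why memory speed matters as much as GPU clock speed at high resolutions.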
Modern monitor connections are present, too. You'll rarely find a VGA connector, though most cards still ship with a DVI-to-VGA adapter if you really need it. Almost all cards include an HDMI port, and the latest products also offer some flavor of DisplayPort, the latest technology in display connections.
Questions to Ask
Now that you've seen what a card looks like when you rip off its cooling shroud, let's consider the crucial questions you should answer before buying a graphics card. Here are five important ones:
- What types of games do you play?
- What other applications do you run?
- What is your budget for a graphics card?
- What is your monitor's display resolution?
- What is your PC's performance level?
What Types of Games Do You Play?
Some types of games demand more of a graphics card than others do. Here are rules of thumb that generally (but by no means always) hold true.
First- and third-person shooter games, like Spec Ops: The Line, are probably the most aggressive in pushing a graphics card to its performance limits. Newer shooters often take advantage of the latest hardware and software technologies in attempting to deliver the most immersive 3D graphics experience possible.
Strategy games tend to be somewhat less demanding graphically. Even strategy titles that can hammer a graphics card, like Total War: Shogun 2, are much more flexible about their settings and can run well on lower-end hardware. Games such as Civilization V may support the latest graphics standards while running on fairly modest hardware.
Roleplaying games vary in their graphics requirements, but most are playable on midrange graphics cards priced between $200 and $300. One exception to this rule is The Witcher 2, an extremely demanding game in part because it tries to do so much with DirectX 9, an older graphics interface. Still, even most action RPGs like Diablo III and hybrid shooter RPGs like Mass Effect 3 run acceptably on modestly priced hardware.
Massively multiplayer online games--particularly free-to-play titles--usually avoid taxing GPUs beyond what most current systems can handle, because they're trying to attract as large an audience as possible.
Indie games of all stripes tend to trail the graphics leading edge, and in many instances you can get by with a lower-cost card when playing them.
Finally, if you mostly play older games, your GPU needs are probably relatively modest.
What Other Applications Do You Run?
GPU-accelerated apps are becoming more common. The earliest use of GPU acceleration in consumer applications was video transcoding. Applications such as CyberLink's MediaEspresso have added support for additional graphics hardware and application programming interfaces (APIs) over time.
Photo- and video-editing applications followed. The latest version of Photoshop, CS6, uses OpenGL for most of its rendering and GPU compute to accelerate the Blur Gallery filters. Musemage, developed in China, is a completely GPU-accelerated photo-editing application.
Windows 8 will use GPU acceleration for all 2D rendering, and Microsoft Office 2010 already supports graphics acceleration for Excel and PowerPoint charts. Even games use the GPU for more than just graphics--to accelerate physics, fluid dynamics, and special-effects calculations.
Web browsers are taking advantage of graphics cards, too. Google Chrome and Firefox use WebGL for 3D acceleration, and Chrome, Internet Explorer 9, and Firefox all use the GPU to accelerate 2D page rendering.

If you use your GPU for nothing more than accelerating Windows and Web browsers, modern integrated graphics such as Intel's HD 4000 (included with all mobile Ivy Bridge processors) and the Radeon GPU integrated into all of AMD's A-series processors are good enough. But if you do more-demanding work, you may want a discrete graphics card. Even then, for most normal desktop use you don't need to spend a bundle; a card priced between $150 and $250 is definitely good enough.

The biggest problem with GPU-accelerated apps is that the concept is still fairly young, and therefore fragmented. Some applications use Nvidia's proprietary CUDA framework exclusively. Others use only Microsoft's DirectCompute programming interface, which supports all current hardware, but only on Windows machines. A few applications now use OpenCL, an open interface that works with Windows, Mac OS X, and Linux--though not all cards have current OpenCL drivers. So you have to confirm that your graphics card and your application will work with each other. Of course, the CPU can always serve as a fallback in case GPU acceleration doesn't work for your particular application.
What Is Your Budget for a Graphics Card?
Economic reality may constrain even the most discriminating hardcore gamer. Unless you have unfathomably deep pockets, you'll need to balance your desires against your wallet. Look for discounts and sales, both online and at retail. And bear in mind that your monitor may strongly affect your graphics card choices.
What Is Your Monitor's Display Resolution?
If you have an older, 1680-by-1050-pixel monitor, a good $250 to $300 card is the most you'll need, even if you enable performance-sucking features such as antialiasing.
On the other hand, if you want to run three Full HD monitors in stereoscopic 3D mode, a single high-end card may not be enough.
For most of us, the happy medium lies somewhere between. Today's typical display is 1920 by 1080 (aka Full HD), and a $300 graphics card usually does fine with such a monitor, though you may have to forgo some of the "ultra" detail settings.
That said, higher-resolution displays are likely to become increasingly common in the future. Even today, prices for 2560-by-1440 monitors are gradually sinking toward affordable levels.
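A quick way to compare these display options is raw pixel count per frame--a rough proxy for GPU load, since real workloads don't scale perfectly with pixel count. This sketch tallies the resolutions discussed above:

```python
# Pixels per frame for the resolutions discussed in the text.
# Pixel count is only a rough proxy for how hard a display pushes a GPU.

resolutions = {
    "1680 x 1050":             1680 * 1050,
    "1920 x 1080 (Full HD)":   1920 * 1080,
    "2560 x 1440":             2560 * 1440,
    "3 x 1080p (triple-head)": 3 * 1920 * 1080,
}

full_hd = resolutions["1920 x 1080 (Full HD)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.2f}x Full HD)")
```

By this yardstick, a 2560-by-1440 panel asks for nearly 1.8 times the pixels of Full HD, and a triple-monitor 1080p setup asks for three times as many--which is why a single card can struggle there.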
What Is Your PC's Performance Level?
If you're running on a PC with a Celeron or Athlon II CPU, you can get some extra graphics oomph by upgrading to a better graphics card, but the improvement will be limited. Your goal should be a balanced system. If you add a GTX 680 to a PC that runs on a 3.1GHz Athlon II processor, the GPU will spend far too much time waiting for the CPU to finish its tasks. And an idle GPU is a sign of wasted money.
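A toy model makes the imbalance concrete: if the CPU must prepare each frame before the GPU can render it, the slower stage sets the frame rate. The millisecond figures below are illustrative assumptions, not measurements of any real system:

```python
# Toy model of a CPU-bound game: the slower pipeline stage limits frame
# rate, and the GPU idles while waiting. Timings here are made up for
# illustration, not measured.

def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)   # serial worst case: slower stage wins
    fps = 1000.0 / frame_ms
    gpu_busy = gpu_ms / frame_ms     # fraction of each frame the GPU works
    return fps, gpu_busy

# A fast GPU (8 ms/frame) paired with a slow CPU (25 ms/frame):
fps, busy = frame_stats(cpu_ms=25.0, gpu_ms=8.0)
print(f"{fps:.0f} fps, GPU busy {busy:.0%}")   # 40 fps, GPU busy 32%
```

In this sketch the GPU sits idle roughly two-thirds of the time--money spent on rendering power the CPU can't feed.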
You should also check your PC's age. If you're still rocking an original Core 2 Duo E6400 and a motherboard with PCI Express 1.0, spending more than about $200 on a new graphics card is a bad investment. You'll be much better off saving your money and putting it toward a new system, or at least a motherboard-and-CPU upgrade, in the near future.