As gamers, unless you're a Minesweeper junkie with an epic Peggle jones and no further aspirations, you're going to need a dedicated video card in your rig to get any kind of decent graphics performance.
The short answer for why that's the case is: "Math is hard; let's go shopping." However, it's not us saying that, it's the CPU.
I know what you're thinking. "BI, the central processing unit... all it does is math!" Or perhaps, "You cheap bastard, you still owe me five bucks for your share of the beer last week." You're right, of course (about the CPU, anyway...). In terms of raw computational horsepower, a modern processor is a potent customer. However, of necessity, it's also a generalist.
In order to obtain truly prodigious performance, we need to bring in a specialist. That specialist is your video card. Think of it like this -- an Olympic decathlon competitor is in phenomenally good shape, and a world-class performer in ten different events, but in any single one of those, their lunch will be summarily eaten by someone who makes that their sole pursuit.
By employing a dedicated graphics processing unit (GPU), the work can be off-loaded to a device engineered expressly for the purpose of turning ones and zeros into polygons, lighting effects, and all the rest of the visual wizardry that separates Age of Conan from Pong. However, merely taking up this burden is not for the faint of heart (nVidia's latest generation of cards sport upwards of 750 million transistors per GPU, ATI's about 666 million; an Intel Core 2 Quad has about 580 million), nor are all video solutions created equal.
Let's get this out of the way first, and then never speak of it again. Onboard video sucks. There's no delicate way to put it. Oh, sure, you can soften the blow by saying, "at least it's there, and it's free!" but that's like being happy you had your shoe on when you stepped in some dog poop, instead of being barefoot. At least it sucks a lot less than it used to, and it's poised to do some very good things, but it doesn't do them yet.
The "good things" to which I refer are the energy-saving behaviors of AMD's "Spider" platform, built around their new 790FX chipset. This combination will, once it's been fully implemented by motherboard and video card manufacturers, cooperate with a compatible discrete video card to let you operate in low-power mode when you're doing undemanding tasks like surfing, and only fire up the big GPU when it comes time to bring on the horsepower. It doesn't yet team up to add the onboard video's processing power to your video card, which is a strange omission, but that's where we stand. Until that comes to pass, onboard video is nobody's friend here, since, even at its best, it's stealing system memory from applications that need it in order to act as the video buffer.
Rather than wallpaper you with a mind-numbing array of specifications (no, seriously - that's a page only an engineer or a marketing dude could love), the story of ATI's current flagship 3800 series (3850, 3870, and 3870 X2) is that, despite a promising feature set at launch, they're getting handily trounced by nVidia these days on both price and performance. The video card wars tend to go back and forth, with each of the big two trading punches for supremacy; for a long time, the Radeon 9800 Pro was the card to beat. However, resting on one's laurels is the easiest path to second place, and it's incumbent upon AMD/ATI to make a strong showing with their next-gen hardware to knock nVidia out of the catbird seat.
That said, ATI is still making pretty good cards, and if you have one, you'll generally be in decent shape in the majority of situations, though you won't be able to fire up as many of the special effects and post-processing features, or push the mind-bending resolutions. If you game on a 17" LCD at the 1280x1024 resolution it supports, however, you should be well set for a while. And if you've got a motherboard that supports CrossFireX, their dual-card implementation, adding a second card might stave off a further round of upgrades.
In the "sanity in naming conventions" department, having run out of name space after the 9800 and stumbled around in the Roman Numeral Wilderness briefly (X1x00), they've found their legs once again, with the 2900 series being topped by the 3800. Naturally, this doesn't quite square with the naming system for the VPU chips themselves, since the upcoming one is code-named Z460 according to their latest press release. I'm just happy it's got a "4" in it, frankly.
Currently offering both the biggest & baddest cards on the market (9800GX2 and 8800GTX) as well as two stellar price-for-performance gaming cards (8800GT and 9600GT), nVidia is sitting pretty right now. With the latter two both able to keep up with, if not surpass, pretty much all of ATI's offerings for under $200, it's small wonder. Right now, the highest-rez, bleedingest-edge, language-abusingest performance is to be found by strapping something (or a couple of somethings) from nVidia into your rig.
That said, there are still games and effects that defy the current generation's ability to turn every setting up to maximum, thanks partly to the demands of DX10/DX10.1, and partly to nVidia's own questionable design choices. Specifically, the move to a 256-bit memory bus with the current gen (8800GT/9600GT/9800) and the attendant 512MB of memory per GPU has hamstrung high-end performance in highly memory-intensive tasks, like high-sample anti-aliasing. That's a step back from the 384-bit/768MB architecture of the last-gen 8800GTX, which is why that card remains highly competitive in both single-card and SLI configurations. Hopefully, nVidia (and ATI) have learned from this misstep, and will broaden the memory pipeline and include more video memory in upcoming offerings.
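To put rough numbers on why that narrower bus hurts: memory bandwidth is just bus width times effective memory clock. Here's a back-of-the-envelope sketch (the ~1.8 GT/s GDDR3 clock is an approximate reference figure; board partners ship all sorts of overclocked variants, so treat these as illustrative, not gospel):

```python
# Back-of-the-envelope GPU memory bandwidth:
#   bandwidth = (bus width in bytes) x (effective memory transfer rate)
# Clocks below are approximate reference values, for illustration only.
def bandwidth_gb_s(bus_width_bits, effective_clock_mts):
    bytes_per_transfer = bus_width_bits / 8       # bits -> bytes
    # transfers/sec (MT/s * 1e6) times bytes each, scaled to GB/s
    return bytes_per_transfer * effective_clock_mts * 1e6 / 1e9

gtx = bandwidth_gb_s(384, 1800)  # 8800GTX: 384-bit bus, ~1.8 GT/s GDDR3
gt = bandwidth_gb_s(256, 1800)   # 8800GT:  256-bit bus, ~1.8 GT/s GDDR3

print(f"8800GTX: {gtx:.1f} GB/s")  # ~86.4 GB/s
print(f"8800GT:  {gt:.1f} GB/s")   # ~57.6 GB/s
```

At the same memory clock, the 256-bit cards give up a third of the old GTX's bandwidth, which is exactly the resource high-sample anti-aliasing chews through.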
|Rafe Brox spends an inordinate amount of time annoying people who think they know more than he does. When not causing friends and enemies alike to /facepalm electronically, he can be found extolling the virtues of the weird peripherals in his life, from kettlebells to the Trackman Marble. If you, too, would like to tell Rafe exactly how wrong he is doing it, the target coordinates are rafe.brox AT weblogsinc DOT com.|