RECEIVED WITH near-unanimous acclaim, the GeForce GTX460 is bringing Nvidia back into the spotlight, only this time for the right reasons.
Let’s face it. There are a bunch of you reading this article who’ve owned, and probably still own, Nvidia graphics cards. Generally you’ve been happy with what you’ve got, but then Nvidia started recycling its silicon generation after generation and you felt downright cheated. The brand became that awkward friend you couldn’t really support, but neither would you give in to peer pressure and turn your back on it.
This has created an interesting situation. Some sensible people will shift between AMD and Nvidia on sheer price versus performance. Others will outright refuse to move to the 'rival' architecture out of pure animosity. Then there is a last group: a silent lot of hardcore Nvidia fans who simply shrugged and waited out the storm for Nvidia’s next best thing. These guys and gals were perhaps right to wait.
As we mentioned earlier, Nvidia has launched its mainstream GTX460 to a quasi-unanimous nod of approval from reviewers worldwide, both in print and online. So, what’s Nvidia done to get all this love from the media?
Somewhere, somehow, Nvidia decided to take on AMD exactly where it should: with a properly priced card that strikes a great balance between power and performance, yet is still cheap enough to make to rake in some serious dosh for the Green Goblin. The trick is that Nvidia gets much better yields out of these smaller dies, so it can carve a lot more chips from the same wafer. The price can drop by quite a lot while performance remains more than enough for some serious gaming.
We can’t emphasise enough what has happened on the specifications front. The GF104 chip is about half a Fermi, which means Nvidia has solved Fermi problem #2: power consumption (problem #1 was yields). The result is a steep drop in TDP, down to 150W-160W depending on the model, without a severe drop in performance. Which brings us to the card in the box and on the retail shelf.
SKU-wise, there are two GTX460 cards right now, both built on the same GF104 GPU, and the differences aren’t as shallow as the boxes might suggest.
The punchy GTX460 1GB runs at 675MHz, with 1GB of GDDR5 at 900MHz on a 256-bit bus, and packs 336 CUDA cores, 56 texture units and 32 ROPs. It sports 512KB of L2 cache and a slightly higher TDP, 160W, than its sibling.
The other GTX460 768MB is carved from the same silicon but packs just 768MB of GDDR5 at 900MHz on a slimmer 192-bit bus, with a reduced count of 24 ROPs to work out those renders, and the L2 cache has been shaved down to 384KB. It’s like having three-quarters of the other card for a $20 saving.
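That "three-quarters" claim holds up on paper: 768MB is 75 per cent of 1GB, 24 ROPs is 75 per cent of 32, and the 192-bit bus gives 75 per cent of the 256-bit card's memory bandwidth. A quick back-of-the-envelope sketch, assuming GDDR5's quad-pumped signalling (a 900MHz memory clock means 3600 MT/s effective):

```python
# Rough peak memory bandwidth for the two GTX460 SKUs.
# GDDR5 transfers data four times per base clock cycle,
# so 900MHz works out to 3600 MT/s effective.
def bandwidth_gbps(bus_width_bits, effective_mtps):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * effective_mtps / 1000

gtx460_1gb = bandwidth_gbps(256, 3600)    # 115.2 GB/s on the 256-bit bus
gtx460_768 = bandwidth_gbps(192, 3600)    # 86.4 GB/s on the 192-bit bus

print(gtx460_768 / gtx460_1gb)            # exactly 0.75
```

Real-world performance won't track that ratio exactly, since both cards share the same core clock and CUDA core count, but it shows why the cheaper card trails its sibling by a consistent margin.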
Both cards are dual-slot, so be mindful of your box’s real estate.
So what does this mean? Well, you can game on both these cards at fairly high resolutions, and if you can afford a 2560x1600 screen you can surely afford a GTX480 or GTX470 anyway. You also get a card with almost linear scaling in SLI: two of these will perform well beyond the level of a GTX480. Yes, for $399 you can get something that tops the GTX480 with about the same power consumption.
We haven’t seen SLI numbers for the 1GB GTX460 but we presume the can of whoopass will be out of the closet.
It might be tempting to run out and buy a 768MB version due to its lower pricing, but we’ll go out on a limb and say that the 1GB version is what you really want. The performance difference will more than justify the $20 premium on the more powerful card. Anything Nvidia makes below the GTX460 is likely to be derived from its GF104 chip, which doesn’t sound too bad right now, but only the final implementation will tell.
It’s been a long crossing of the desert for the Green Goblin, but it does seem that it has finally gone out and done something that both deep-pocket gamers and cost-conscious geeks will agree upon: a sweet-spot card that gives you performance and doesn’t break the bank. Sure, it’s half a GF100, but is that wrong? µ