On January 6th, Nvidia was claiming that it would be out February 14th at a price of "less than $200". Call me overly skeptical, but it has missed that date twice now, didn't come even close to the promised G92 prices, and then lied about the R680 price. Let's just say we don't buy the sub-$200 part. Reviewers, we would make sure you see it for that price before you pimp the NV line.... again. In any case they claim a street price of $169-189. But it's on sale here for €188, which, at today's exchange rate, is about $273 (US).
The raw numbers are 64 stream processors, a 650MHz core, 1625MHz shaders and 512MB of 900MHz GDDR3 on a 256-bit bus. There will be a six-pin power connector, and the board will consume 95W. In the slides, Nvidia claims "Industry's best performance-per-watt", but as far as we know, there is not a single ATI 36xx card with a supplementary power connector, and the one sitting beside me definitely does not have one. We sense a corporate-sanctioned fib here.
Then they go on to new PureVideo HD features, the main one of which is called Dual Stream Decode acceleration, basically a software hack to allow the use of extra cycles. YAWN. There are various enhancements to the colour schemes, which we hope for NV's sake work out better than the last few 'enhancements' that didn't do much for any disc other than the HQV benchmark. They still don't claim to do VC1 acceleration, so is there a point to the hand waving?
The funniest part is that they have a slide that says "116% Faster!!!", and immediately follow it with "Single Largest Generational Performance Increase in our History", in green no less. Does anyone else read this as them admitting that the 8600 sucked? Hint: it did.
The raw numbers, taken from a graph seen by The INQUIRER without a ruler handy, show that in FEAR, the 8600GTS hit 30FPS and the 9600GT hits 75FPS. The same cards hit 9/33, 15/32, 8/34, 18/37 and 21/45 in World in Conflict, Lost Planet, Company of Heroes OF, Crysis and Bioshock respectively. The settings were all different and very high, so it looks like they picked cases where the old card ran out of memory or hit a wall that the new one does not. These seem to be very cooked numbers, but considering how bad the 8600 was, they might be real.
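For the curious, here is a quick sketch (our own back-of-the-envelope calculation in Python, using the approximate FPS pairs read off Nvidia's graph above, not any official Nvidia figures) showing the percent speedups those numbers imply, for comparison against the "116% Faster" slide:

```python
# Approximate FPS figures (8600GTS, 9600GT) as read off Nvidia's graph
# without a ruler handy, so treat these as rough.
fps = {
    "FEAR": (30, 75),
    "World in Conflict": (9, 33),
    "Lost Planet": (15, 32),
    "Company of Heroes OF": (8, 34),
    "Crysis": (18, 37),
    "Bioshock": (21, 45),
}

for game, (old, new) in fps.items():
    # Percent faster than the 8600GTS, e.g. 30 -> 75 FPS is 150% faster
    speedup = (new - old) / old * 100
    print(f"{game}: {speedup:.0f}% faster")
```

Every one of those pairs works out to well over 116%, which is consistent with the settings being hand-picked to make the old card fall over.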
They compare the same games to a 3850, and the 9600 wins every one, but then again, they are exploiting the AA weakness in the ATI cards. This page is quite cherry-picked; if you ran the same tests with AA off, the NV card would likely get spanked up and down. Then again, NV won't give us hardware any more because we actually question things like this instead of parroting back company quotes like almost everyone else.
On to SLI, where they show that in FEAR, the 8600GT hit about half of the 8800GTX's score, 50 vs 89FPS, while the 9600GT hits about 92. Yay. I wonder why they only showed a single DX9 game there. They really tried to rip ATI for its DX10 Crossfire scaling a few weeks ago, so could they be such weasels as to criticise ATI for something they don't have either? Quite possibly. We will know in about two weeks.
In any case, it looks like they have more or less fixed the problems of the 8600. There is still no VC1 acceleration, so don't look for this in many OEM boxes. Before you run out and buy it, we suggest waiting for two things: the first is third-party benchmarks, the second is where pricing ends up. In both cases, do not take the official word. µ