IT SOUNDS LIKE Nvidia's G92, the next-gen high-end part, is having heat problems. Several people told us that a few weeks ago they got an urgent letter from NV asking them to send back the computers that the new G92 would go into, for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?
More interestingly, several OEMs told the same story: they were given about a week to comply, slap it in a box and FedEx that sucker, ASAP. Other than 'thermal analysis' and 'do it now', no explanation was given. That really made us wonder.
It sounds like a cooling problem, not a die problem. The die itself is far smaller than the ~480mm^2 of the G80. Those seen by our moles are just over 17 x 17mm, or 289mm^2, on a 65nm process. If you do the math, (.65 * .65)/(.90 * .90) * 480mm^2 gives you about what you would expect for a more or less simple shrink with a few tweaks. (Correction: G80 was 90nm, not 80nm, which is where those numbers come from.)
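For the curious, that back-of-the-envelope shrink math works out as below. It is a first-order estimate only: area scales with the square of the feature-size ratio, and real shrinks never scale pads and I/O quite this cleanly.

```python
# Ideal die-area scaling for a process shrink: area scales with
# the square of the (new node / old node) ratio. First-order
# estimate only; pads, I/O and analog blocks shrink less well.
def shrunk_area(old_area_mm2, old_node_nm, new_node_nm):
    ratio = new_node_nm / old_node_nm
    return old_area_mm2 * ratio ** 2

# G80 (~480 mm^2 at 90nm) moved to 65nm:
ideal = shrunk_area(480, 90, 65)
print(round(ideal, 1))  # ~250.4 mm^2
```

A perfect shrink would land around 250mm^2, so the observed ~289mm^2 is in the right ballpark for a shrink plus a few tweaks.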
This means the chip will have approximately the power density of a modern CPU, assuming they didn't up the wattage by a lot. That is quite manageable; if ATI could cool the HD 2900 XT, the G92 should not pose much of a problem.
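To put a rough number on "CPU-like power density": the sketch below divides a board power figure by the die area above. The ~130W figure is purely an illustrative assumption on our part, not a leaked spec.

```python
# Power density in W/mm^2: TDP divided by die area.
# The 130W TDP here is a hypothetical, illustrative value,
# not an Nvidia-confirmed number; 289 mm^2 is the die size
# reported above.
def power_density(tdp_w, area_mm2):
    return tdp_w / area_mm2

print(round(power_density(130, 289), 2))  # ~0.45 W/mm^2
```

Anything in that sub-0.5 W/mm^2 range is well within what desktop CPU coolers handle every day, which is why a die problem looks unlikely.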
So, where does that leave us? I am guessing, and this is only a guess, that the cooler they ordered isn't exactly cutting it on production silicon in a real case. I can't think of another reason why they would have to jump through so many hoops so late in the process.
In any case, word should be leaking soon enough, and we will then know if we have another 5800 or another 8800 on our hands. One thing is for sure, you won't be seeing them in laptops, especially Montevina ones. µ