If you have been following the site, you will know that Graphzilla had problems to fix, but now the products with the first shipping revision of the GPU are out. A second revision will arrive in a couple of weeks, but today we will be talking about cards you can go out and buy.
Nvidia G84-300 and G84-400
As you can see for yourself, the GeForce 8600GT features the G84-300 chip...
Truth be told, the GeForce 8600GT would fare far better under the name 8600GS, but Nvidia wanted to keep the GS marking for OEM products, so readers will have to swim in a sea of naming conventions all over again. The 8600GT is based on the G84-300 GPU and features 32 scalar shaders (which Nvidia loves to call "stream processors"), 16 TMUs (Texture Mapping Units) and eight ROPs (Raster Operation Pipelines). This means the 32 shader units should be enough to fill eight complete pixels in each and every cycle. To make sure the 540MHz ROPs can shove out eight pixels per clock, each shader unit ticks at 1.19GHz.
... while the top-of-the-line GeForce 8600GTS features the G84-400 chip.
The G84-400 is a bit of a different beast. The chip boasts the same technical specs as the G84-300, but at different clocks. The scalar shaders run at 1.45GHz when the GPU is at its reference 675MHz, and the 128-bit memory controller talks only to GDDR-3 memory at 1GHz DDR or above. In theoretical numbers, the 8600GTS will churn out 5.4 billion pixels and 10.8 billion texels per second, while the 8600GT cannot push more than 4.3 billion pixels and 8.6 billion texels.
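Those theoretical figures fall straight out of the clocks and unit counts. A quick sketch (assuming the usual rule of thumb: pixel fill = ROPs × core clock, texel fill = TMUs × core clock) reproduces them:

```python
# Theoretical fill rates from core clock and unit counts.
# Assumption: pixel fill = ROPs x core clock, texel fill = TMUs x core clock.

def fill_rates(core_mhz, rops=8, tmus=16):
    """Return (pixel, texel) fill rates in billions per second."""
    pixels = rops * core_mhz / 1000.0   # Gpixels/s
    texels = tmus * core_mhz / 1000.0   # Gtexels/s
    return pixels, texels

for name, clock_mhz in (("8600GT", 540), ("8600GTS", 675)):
    px, tx = fill_rates(clock_mhz)
    print(f"{name}: {px:.1f} Gpixels/s, {tx:.1f} Gtexels/s")
# 8600GT:  4.3 Gpixels/s,  8.6 Gtexels/s
# 8600GTS: 5.4 Gpixels/s, 10.8 Gtexels/s
```

Both G84 parts share the 8-ROP/16-TMU layout, so the GTS's entire fill-rate lead comes from its 675MHz core clock.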
The Cards themselves
Sparkle's 8600GT comes on a blue PCB
Our sample was provided by Sparkle, a company that predominantly manufactures reference boards for other brands. For its own product, it opted for a custom design. Since all 8600 products are pin-to-pin compatible with the 6600 and 7600 (will those rumours about G84 having a 256-bit controller ever end?), manufacturers like Sparkle have gained experience and optimised the reference PCB design, so this product should come with a decent amount of overclocking headroom. Real overclocking, however, will be possible with the company's Calibre version, due in a couple of weeks.
This board comes with 256MB of 1.4ns GDDR-3 memory clocked at 700MHz DDR (1.4GHz), while the GPU works at 540MHz, as you already know. The packaging includes one DVI-to-D-SUB adapter, an S-VHS cable, and an HD-Out cable.
Magnificent trio - Gainward Golden Sample, eVGA e-GeForce and Sparkle 8600GTS boards
However, the GeForce 8600GTS is the star of this review. The new board for the $200 bracket succeeds the 7600GT and, as far as we know right now, AMD will not have an answer for it. This is the runaway graphics part until the HD 2900XL or 2900Pro comes out.
The reference PCB enabled companies to clock the product 50MHz+ higher on the GPU and 200MHz higher on the memory, nicely raising performance for a slightly higher price. Gainward moved furthest from the reference design, and seeing a red PCB once again reminds us of the great era of the GeForce3 and GeForce4 Ti4200, when Golden Sample boards were the ones to have for great overclocking scores. This is the only card that features dual-slot cooling, with a big and silent 72mm fan in the middle of the board. It goes without saying that all three boards featured silent fans. We hope we will not need to write about silent fans in the future.
EVGA and Sparkle had similar green boards, but clocks were different: eVGA clocked its 8600GTS to a default 675/2000 MHz, Gainward Golden Sample was clocked at 725/2200 MHz and Sparkle was clocked all the way to 729/2213 MHz, taking the pole position of GTS clocks.
The memory used on every GTS card was none other than Samsung's own 1.0ns GDDR-3 chips.
Since these are mainstream parts that support the DirectX 10 API, we originally wanted to bring you DirectX 10 testing as well. Sadly, due to driver issues with the Age of Conan beta and a lack of time to run Nv's DX10 demos, we decided to postpone the DX10 part of the testing for a second INQpression. We will bring you a second look at these cards by the end of the week, using either the 32-bit or the 64-bit version of Vista. We will test DirectX 10 performance in at least one DX10 app, so that we can show you how the GeForce 8500GT/8600GT/8600GTS will perform in the API they will be using in a lot of apps over the next couple of months and years.
We used the same driver as other sites out there, ForceWare 158.16 for the Windows XP operating system. This driver was stable in many applications, but we could not test several games that we usually run. World of Warcraft performs abnormally slowly, especially in Vista with SLI enabled. Age of Conan was crashing after the last patch was issued, but we hope this will get sorted out soon.
The testing configuration was INQtest #2, featuring the following components:
Intel Core 2 Extreme QX6800 at 2.93 GHz
Asetek Vapochill Micro
GeIL PC2-8500/9600 MultiSpec 2x1GB Kit
EVGA nForce 680i
EVGA e-GeForce 8800GTX ACS3
EVGA e-GeForce 8600GTS
Gainward Bliss 8600GTS PCX Golden Sample
Seagate Barracuda 7200.8 250GB x2
OCZ ProXstream 1000W
INQtest #2 ran Windows XP Professional SP2 with all updates installed as of April 15th. Office 2007 Professional was installed as well. On the driver front, we relied on ForceWare 158.16 for the graphics cards and 6.53 for the motherboard. As for utility software, we installed nTune and ATITool. Games and applications were patched to their latest versions.
Is this really worth it? This was our biggest consideration when it comes to the 8600GTS boards... or should you hold out for an 8800?
For our first INQpressions, we decided to offer you a comparison with the 8800GTX, the current performance leader. We wanted to see whether it would be worthwhile to plan an upgrade to SLI in the future, thereby limiting yourself to Nvidia's nForce chipsets, or whether a single 8800GTS or GTX card on an alternative chipset would fit your needs.
The calculation is very simple: a 680i board costs around $100 more on average than a brilliant DFI Infinity P965, and a second graphics card is an additional $200+, so that is $300 you save today, possibly at the cost of your computer's performance in six months' time. If two 8600GTS boards perform equal to or better than a single 8800GTX or GTS, our calculation would not hold water. With the G84 chip having 32 shaders, we had serious doubts that this would happen, so read on.
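As a sanity check, the upgrade math can be written out explicitly (the dollar figures are the article's ballpark estimates, not quoted prices):

```python
# Rough upgrade-path arithmetic; prices are the article's estimates.
sli_board_premium = 100  # nForce 680i over a P965 board such as the DFI Infinity
second_card_cost = 200   # an additional 8600GTS bought later for SLI

# Skipping the SLI route avoids both costs up front.
savings_today = sli_board_premium + second_card_cost
print(f"Skipping the SLI route saves about ${savings_today} today")
# -> Skipping the SLI route saves about $300 today
```

That $300 saved only makes sense if a single card keeps up, which is exactly what the benchmarks below set out to check.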
3DMark shows no surprises there...
Oddly enough, we saw that the CPU score was actually lower on the 8800GTX than with the 8600GT/GTS/GTS-SLI. The rest of the tests, though, show that Nvidia did a great job with the G84 GPU. Seeing 6600 3DMarks on a mainstream part is really something.
3DMark06 - FillRate Single and MultiTexturing
Very neat performance gains when two GPUs are put together
The fill-rate test clearly shows that the 8600GTS gives out the maximum amount of texels - clock speed wins here.
3DMark06 - Pixel Shader test
The Pixel Shader 3.0 test in 3DMark puts every scalar shader unit inside these GPUs to work as a pixel shader.
The 8800GTX's 128 shaders at 1.35GHz clearly show that a single high-end GPU will serve you better in pixel-shader-intensive games.
3DMark06 - Vertex Shader test
...however, in the world of vertex shaders, two GPUs with 32 "vertex" shaders each at 1.45GHz wiped the floor with the single high-end GPU. This is especially true for complex shaders. Graphs don't lie (though we're not responsible if 3DMark or the ForceWare drivers fudged things up).
Company of Heroes
We hope that DX10 patch will be available soon, so that we can finally compare DX9 vs. DX10...
Company of Heroes just got a patch that fixed performance in SLI, so you can see the expected scores. Last Friday, that was not the case - the SLI configuration was slower than a single-GPU one. This only goes to show that in some cases it takes time for SLI support to kick in. Once that SLI patch is issued, the boards just start flying...
How to test F.E.A.R. with mainstream cards? Everything maxed out, of course. Great performance by $200 parts.
You can see what happens here - if you want to crank the resolution up a bit, the 8800GTX will not slow down until the highest resolutions, while the 8600GTS, even in SLI configuration, just loses performance.
We hope that the new patch for Stalker will not erase savegames while fixing the sluggish performance of SLI and the 8800GTX. It turns out that even the 8800GTS posts higher performance than the GTX here. We checked the forums, and the devs say a patch is currently under development.
We were pleasantly surprised by the performance offered by a GPU with a 128-bit memory controller. It only goes to show that today, the most important part of the performance formula is the output of pixels per clock, and the 8600GTS shows exactly that. Thirty-two scalar shaders keep the ROPs well and nicely fed, and this board easily outclasses many DX9 GPUs, including products with double the theoretical number of pixels per clock (boards with 12 or 16 ROPs are simply no match for this beast).
When it comes to overclocking with default components (no voltmods, no aftermarket cooling), the boards range between 760 and 780MHz on the GPU and 1.11-1.15GHz DDR on the GDDR-3 memory. The cards do not heat up as much as we expected when overclocked, so the overclocking potential is not limited by cooling. This only speaks to how well Nvidia executed the transition to TSMC's 80nm process.
Not everything is great - the 8600GTS has 32 scalar shaders, and even in SLI mode the boards cannot reach the performance of a single 8800GTX; even the 8800GTS proves faster than G84 SLI. This is the deciding factor against the fully-fledged, 96 and 128 scalar shader monsters that are the 8800GTS and GTX. Money-wise, spending extra on an SLI-capable board and two of these GPUs does not make sense.
But even if the 8600GTS cannot compare to the 8800GTS/GTX, these GPUs achieve their primary task: bringing DirectX 10 API support to the mainstream market, a market that is not interested in shelling out huge amounts of money on upgrades. Taking a close look at the results of these first three 8600GTS boards, we can conclude that Gainward offered a unique product but fell short, posting the very same overclocking results as the green, reference-PCB Sparkle.
In short, Nvidia did a brilliant job and replaced its 7600GT and 7600GS with rock-solid performing, DX10-compliant parts. These parts sit further from the high end than their predecessors did, but give surprisingly good value for money. The 8600GT falls a bit short of expectations, so we would advise you to bite the bullet and opt for the higher performing part. If you are considering a multi-GPU configuration, just go straight for glory and get a high-end part.
+ DirectX 10 for the $149/$199 price brackets
+ 8600GTS: Performance in games
+ 8600GTS: Beats 7800/7900/X1800/X1950Pro cards easily
- 8600GT: visible performance gap compared to the GTS
- 8600GTS: SLI is not worth it
- 8600GTS: difference between high-end and mainstream is now greater than in the past
When it comes to the 8600GT, the price difference of $50 or so between the GT and the GTS should not be a reason to skip the GTS. For future games, the 8600GT will be the bare minimum, so do not go below the minimum right away.
Bartender's Report - Sparkle 8600GT
Bartender's Report - eVGA 8600GTS
Bartender's Report - Gainward 8600GTS
Bartender's Report - Sparkle 8600GTS
Reviewed and tested by Davor Baksa and Theo Valich