
The INQ grills the 8800GT with DX10 hit titles

First INQpressions Gainward Geforce 8800GT 512MB
Mon Oct 29 2007, 14:13

Product: Gainward Bliss 8800GT PCX 512MB
Price: EUR 249, GBP 134.99, USD 249.99
Web: www.gainward.net


THERE HAS BEEN more said about the G92 than about any mainstream part from Nvidia since the days of the Geforce4 Ti4200, and the battle between it and ATI's upcoming RV670, or Radeon 3800 series, is warming up.

These two parts are the first GPUs to come from a die-shrink of the previous high-end generation, with on-die bug fixes, a more power-efficient design and, of course, higher clocks. It is instructive to compare the newcomer's 3DMark06 scores against those of the Geforce 8800GTX and 8800Ultra.

Chip
G92 in all of its glory.

The chip is known as the G92-200, and features 112 shaders, a 256-bit memory controller and VP2 (PureVideo HD, the same logic as in the 8600 series). After all the fuss with the NVIO chip, Nvidia has fixed the video part of the GPU, and the chip can now do 2560x1600 with HDCP enabled.

The chip is manufactured in a 65nm process at TSMC and, in conjunction with 512MB of GDDR3 memory, it will eat no more than 110W. It comes at a default clock of 600MHz for the GPU and 900MHz (1.8 GT/s) for the memory. The 112 SPs (Shader Processors, as everybody likes to call them) work at 1.5GHz - faster than the 8800GTS and GTX, and level with the Ultra. At default clock settings the 8800GTS should have its shaders running at 1.2GHz, the 8800GTX comes in at 1.35GHz, while the 8800Ultra sports 128 shaders at the same 1.5GHz (we have seen some Ultras working at 1.48GHz, some at 1.61GHz).
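For the number-crunchers, here is a quick back-of-the-envelope sketch (a minimal Python illustration, assuming only the figures quoted above and Nvidia's usual MADD+MUL flop counting - not anything from Gainward's spec sheet) of what those clocks translate to:

```python
# Theoretical numbers implied by the quoted specs (assumptions, not
# measurements): 256-bit bus, 1.8 GT/s GDDR3, 112 SPs at 1.5GHz,
# with each SP counted Nvidia-style as MADD+MUL = 3 flops per clock.

bus_bits = 256
data_rate_gts = 1.8                      # effective transfer rate (900MHz DDR)
bandwidth_gbs = bus_bits / 8 * data_rate_gts
print(f"Memory bandwidth: {bandwidth_gbs:.1f} GB/s")   # -> 57.6 GB/s

sp_count, sp_clock_ghz, flops_per_clock = 112, 1.5, 3
gflops = sp_count * sp_clock_ghz * flops_per_clock
print(f"Shader throughput: {gflops:.0f} GFLOPS")       # -> 504 GFLOPS
```

In other words, the GT keeps nearly all of the GTX's shader grunt (128 SPs at 1.35GHz work out to about 518 GFLOPS by the same counting) while giving up a chunk of memory bandwidth to the narrower 256-bit bus.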

The Board
Magnificent four - the smallest board is one deadly piranha.

The board takes its design cues from two others on the market: the Geforce 7950GT and the 8800GTS. Power consumption comes down significantly with the 65nm chip, so Nvidia was able to produce a single-slot unit which it calls a "more elegant 8800GTS".

The board comes with two dual-link DVI-I connectors and a classic S-Video/HDTV out. Just behind the connectors you can see an empty spot for a DisplayPort chip, meaning this board will exist in a DisplayPort variant as soon as market conditions allow - that is, when an OEM orders the first batch of cards to pair with its next-gen DisplayPort displays. Can you guess which of Dell/HP/Acer?

The successor is much more elegant and far lighter - the difference between the 90nm and 65nm GPUs is obvious.

The size of the board is identical to the 8800GTS - the nine-inch calibre (22.8cm) - but it is equipped with a single-slot cooler. The memory is cooled through an aluminium part, while the GPU gets a copper one - a classic aluminium/copper combination, no great surprises there. In fact, when you compare this board to its current competitors, the Geforce 8800GTS/GTX series and the Radeon HD 2900XT, this is a featherweight-versus-heavyweight fight, but the winner is not so clear cut.

In the middle sits the G92 chip, surrounded by eight memory chips in a configuration we first saw on the Matrox Parhelia, followed by the Geforce 5900 and practically every other card with a 256-bit interface.

The inner edge of the board carries a 6-pin PEG power connector and, traditionally for Nvidia, a ton of analogue circuitry. We're disappointed with this call, since ATI made the switch to digital Volterra parts - with the new software apps coming from the guys at Markham, overclocking and overvolting on the fly is now a reality.

We were fans of DVRM (as Iwill used to call it), or digital PWM. The number of adjustments you can make in real time is insane, and it is easier to provide almost perfect power parameters. We hope Nvidia will have learned the lesson by the time the NV60 or G100 is out. The NV50/55 generation of products is, sadly, all analogue.

Even though we have a couple of cards, we haven't tested SLI: there have been some problems, and Nvidia told other reviewers that SLI support will come either with the retail versions of the games or with patches after the titles have been released. The same applies to driver support.

Gainward's box comes with a bundled Tomb Raider game, but somehow we feel that every owner of this card will jump straight for Crysis, Hellgate, UT3 or World in Conflict, and the rest of the upcoming Yuletide game frenzy.

Our setup was our old, trustworthy INQtest #1, consisting of the following components:

Intel Core 2 Extreme QX6800 @2.93 GHz
Asetek VapoChill Micro cooler
EVGA 680i motherboard
2GB Corsair Dominator PC2-9136C5D
Nvidia GeForce 8800GT 512MB/Zotac 8800GTX AMP!/XFX 8800Ultra/ATI Radeon HD2900XT
250GB Seagate Barracuda 7200.10 16MB cache
Sony BWU-100A Blu-ray burner
Hiper 880W Type-R Power Supply
Toshiba's external HD-DVD box (Xbox 360 HD-DVD drive)
Dell 2407WFP-HC
Logitech G15 Keyboard, MX-518 rat

Software-wise, we're running a dual-boot between 32-bit Windows XP and 64-bit Windows Vista. This is an ideal setup for the next year to year and a half: rely on trustworthy WinXP until Microsoft fixes all the things that went to hell in the development of Me II, known as Vista. Whoever thought of putting links to websites (IE7 history) in Windows Explorer while you're browsing files made a moronic call, to say the least.

We experienced some serious instability with the card on Nvidia's recommended drivers, so we defaulted to the older 167.27 set. This cost us some performance in Crysis, but when it comes to testing, we can only accept benchmark sessions with no BSODs flying all over the place. A product should not come to market running on beta software - would you buy a car with a beta version of tyres and a tag saying "Please come back next month for the real ones"? This whole fragile state of the PC industry has left PC gaming where it is right now. But enough ranting - time to see some real numbers.

Performance: Synthetic
3DMark shows that the GeForce 8800GT clearly has more shader power than the GTS, even touching the 8800GTX.

As we stated in the title, we're going to grill the 8800GT in DirectX 10 applications that are going to be huge hits, or already are. But for starters, we took two synthetic benchmarks - you need both in order to see what kind of performance can be achieved. When we look at the performance achieved in 3DMark06, we can only wonder what owners of the 8800GTX and Ultra will think. In this synthetic benchmark the 8800GT scored 12768 3DMarks, higher than a default-clocked 8800GTX. Of course, XFX, EVGA and Zotac have brought higher-clocked parts to the market, but looking at default clocks, it is a small miracle to see a score this high from a mid-priced $249 part.

While the 8800GTS has the lead in single texturing, the 8800GT stomps it in multi-texturing.

The fill-rate scores reveal the reason for a score this high - when it comes to single-texturing this card scored really well, and the same applies to multi-texturing.
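To see why the graph splits the way it does, here is a rough sketch of the theoretical fill rates, assuming the unit counts commonly quoted for these chips (they are not stated anywhere in this article, so treat them as our assumption):

```python
# Theoretical fill rates from core clock x unit count (assumed figures):
# 8800GT (G92): 16 ROPs, 56 texture units at 600MHz
# 8800GTS (G80): 20 ROPs, 24 texture address units at 513MHz
cards = {
    "8800GT":  {"rops": 16, "tex": 56, "mhz": 600},
    "8800GTS": {"rops": 20, "tex": 24, "mhz": 513},
}
for name, c in cards.items():
    gpix = c["rops"] * c["mhz"] / 1000   # single-texturing limit, GPixels/s
    gtex = c["tex"] * c["mhz"] / 1000    # multi-texturing limit, GTexels/s
    print(f"{name}: {gpix:.1f} GPixels/s single, {gtex:.1f} GTexels/s multi")
```

If those counts hold, the GTS's 20 ROPs edge out the GT's 16 in pure pixel pushing (10.3 versus 9.6 GPixels/s), while the GT's 56 texture units bury the GTS's 24 in multi-texturing (33.6 versus 12.3 GTexels/s) - exactly the pattern in the chart above.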

We included video results from PCMark Vantage, since the authors of this benchmark were in cahoots with Remedy - PCMark sports several tests based on the unreleased DX10 title Alan Wake and, of course, an updated Forest demo from 3DMark06.

Performance: Real World, DirectX 10 and top titles
Here comes the real meat. Originally we planned to run this review last week, around Wednesday, but the drivers we had were not doing their job without BSODs (we wonder how many sites will mention issues with the 167.xx and 169.01 drivers), so it took us some time to find the combo that works best. For the DirectX 10 portion, we tried to offer a good balance between known DX10 benchmarks such as Call of Juarez (we used the retail game plus the DX10 Enhancement Pack) and PT-Boats as "DX10 first-timers", and upcoming mega-titles such as Crysis, Hellgate: London and UT3, in their respective demo forms, available for download from the Interweb.

Call of Juarez (w/DX10 Enhancement Pack)
Whatever happened to DX10 and this game, one thing is certain: the benchmark is slow.

Nvidia has openly criticised this benchmark on numerous occasions, and the war between the Polish developer and the company continued after Nvidia told members of the press and the channel that the game uses tricks to lower Nvidia scores.

If we look at it in a different light, we might say the hardware is not optimised for this game. We won't go into the politics of this sorry affair, but we cannot omit the fact that this title is one of the very rare TWIMTBP titles to leave the Nvidia camp and become one of Daamit's GITG (Get In The Game) ones. If you own the game, or if you buy it, download the 850+ MB DX10-1.1.1.0 Enhancement Pack and the game will flourish with DirectX 10 visual goodness. We used the in-game benchmark here.

If you turn on AA and AF, the end result will be a slideshow.

Call of Juarez will be hard pressed to deliver a decent amount of performance unless you own a computer from 2009. We're not sure what happened here, but all the cards tested today had problems with this game and ran the fly-through really slowly. That includes both Nvidia and ATI cards, so Nvidia's cries of foul over the scores seem unjust.

Company of Heroes... and crashes
Company of Heroes has been a benchmark from hell. The patching system was a mess but, luckily, Relic managed to sort things out with two updates, so you no longer need to download and/or install five, six or seven different patches - just the 100MB-or-so 1.0.0-to-1.4.0 patch and the 1.8GB 1.4.0-to-2.0.1 one.

We used the in-game runtime to see the difference between the four tested cards in DX10 mode. This benchmark shows all the deficiencies of fly-through benchmarking, since the game was never meant to be viewed this close up. Performance goes to the basement with all cards, but DX9 mode shows that the 8800GT is a more-than-worthy competitor. The reason we're omitting this title is that the benchmark constantly froze with every driver we had in our possession. The in-game timedemo worked flawlessly in Windows XP but fell apart in Vista: the runtime crashed all the time in different parts of CoH, and we could not get it to work. We even took the advice to cut down on World Detail, since the game displayed a warning that a crash might occur, but to no avail - our luck with CoH in DX10 mode was not meant to be. In DX9 mode, however, we got some interesting scores. We tried reinstalling the game, but the same scenario occurred. Oddly enough, the game itself works like a charm.

Crysis (64-bit, DirectX 9 and 10)
Crysis with High details takes no prisoners - the game looks awesome, but it requires a ninja PC at these settings.

First of all, Crysis knows the difference between two and four cores, so if AMD does not deliver Phenom by the time this game ships, it can wave its "Customer Centric" mantra good-bye. The amount of physics and foliage is just insane, and walking through the jungle or swimming into the sun is just incredible. This is Far Cry gone crazy, and things will get even crazier when everything starts freezing over.

The game comes with 32-bit and 64-bit executables, and will be able to use more than 4GB of memory with no major problems. If you think having 4GB is excessive right now, you probably will not enjoy playing Crysis. We will run a special article detailing the performance of mainstream and high-end cards on dual-core and quad-core CPUs, with 2GB vs. 4GB and the 32-bit vs. 64-bit versions, but only when we receive the final version of the game. This benchmark is based on the fly-by mode in the SP demo, so some performance differences between it and the final version may occur. A patch is also in the works, since SLI support has been announced for the first patch and a newer driver revision.

DX10 mode brings the eye candy, but also kills the framerate.

If you select "Very High" you have selected the DirectX 10 version of the game, with all the bells and whistles - most notably motion blur during sudden movements. On High you're running DirectX 9 settings, but it will still stress your system like no other app.

When it comes to the 8800GT vs. the 8800GTS (both 320MB and 640MB), we were shocked to see that Crysis recommends the High (DX9) setting for the GTS but Very High for the 8800GT, and starting the benchmark gave us the reason: the 96 shaders inside the 8800GTS at 1.16GHz are a sitting duck for 112 units at 1.5GHz.

Hellgate: London
This game brings really high framerates.

Hellgate in its demo version comes without a DX10 mode, but we still wanted to include it in this review. Since Hellgate does not come with an integrated timedemo or walkthrough, we did manual runs which tortured the GPUs with fire and haze effects, a large number of models on screen, and so on. This was the only game that ran nicely on the 5800 and 5900 cards, which lack serious horsepower.

Framerates stay very high even with AA and AF pushed on.

This game works at the highest details and highest resolutions with no problem. You can run Hellgate at 1920x1200 with 4xAA and 16xAF and framerates will not dip below 33.27fps - at least not during our testing. The average was around 48fps on the 8800GT.

PT-Boats: Knights of The Sea
Open sea it may be, but something goes seriously wrong if you enable AA/AF...

We also tried benchmarking PT-Boats, a benchmark based on the upcoming Knights of the Sea naval strategy game. It features a large open environment, and it seems this was the Achilles' heel of many graphics cards. Seeing PT-Boats killing performance on high-end cards only makes us wonder what the developers were thinking.

Enabling 16x anisotropic filtering slays performance like there's no tomorrow. However, we don't think this is a GPU issue, but rather the fault of un-optimised drivers.

Unreal Tournament 3
This game looks the best, and runs the best. Epic designed it with Nvidia hardware in mind.

A lot has been said and bad blood spilled over UT vs. Quake III Arena, over UT2k3 not being worth a dime, and over UT2k4's return to the top of the FPS roost, but this game is here, it looks awesome and, yes, it runs fantastically. Of all the games compared, it gives the most detail and runs the best. Whatever Epic did, the guys worked their magic and this game runs like a charm at all tested resolutions. We could not use anti-aliasing (and have no plans to use nHancer and run a hack with the 0045: Oblivion profile), since it is disabled in the demo (just like the demorec and demoplay commands). This is the only game here that will run smooth as silk on mainstream graphics cards, so we feel sales might go through the roof. Tim Sweeney is a master of engine optimisation, whether we are talking about the original Unreal's visuals on 3dfx Voodoo hardware, the performance of the first UT on slower cards, or now, with UT3 running with no problems at 1920x1200 at well over 60fps with 16xAF enabled. An awesome result for a truly awesome game.

Looking at the visuals, it is a bit sad that the game is so frenetic. But let's move on to performance. We used the Shangri La map with eight bots, and all in-game details were set to maximum, of course. Seeing results well over 100fps at 1280x1024 on all tested cards was quite impressive, but the 8800GT did not run out of steam until 1920x1200, and even there the game was highly playable. There was no stuttering inside the game, although the menus slowed to a crawl for some unusual reason.

In short
We have expected a game-changing product from the G92-200 since day one, and that is exactly what we got. Performance was quite surprising, with AA both on and off. Seeing a 250-dollar card at default clocks competing against parts that cost double brings us back to two legendary products, the Radeon 9500 Pro and the Geforce4 Ti4200. Nvidia did a bloody good job here and, when it comes to the silicon, there is no major cause for complaint (its PR schemes are another matter). Nvidia has fixed various problems from the previous generation, and you no longer need to own a 2600/2900 to run HD video at 2560x1600.

When you compare it to the previous $199 card, the Geforce 8600GTS, you will almost feel robbed, since this product runs rings around it and is definitely worth shelling out those 50 dollars more. Both Nvidia and ATI failed badly with the performance of their mainstream parts (the previous generation destroys the 2600/8600), and this product is finally one you can call "that's it". The price is such that buying an 8800GTS has just become quite pointless - the extra outlay over the 8800GT will not justify itself. If you own an 8800GTX or an 8800Ultra, you're safe and sound. It just might be that the whole three-way SLI thing was invented for owners of 8800GTX/Ultra rigs, so that they don't feel ashamed when a possible three-way 8800GT setup beats their much more expensive kit.

We feel that opting for a card with 1GB of on-board memory will be the better choice, since both Crysis and UT3 are extremely dependent on the amount of system and on-board memory. Longer shaders require more and more memory, and with high-res textures and normal mapping becoming the norm, boards with 256, 320, 512 and 640MB are in for a rough time.

We don't know who works in Graphzilla's marketing department, but not calling this part the 8850GT, and instead confusing the market with a GT part that outperforms the GTS and even surpasses the GTX in a test or two, just makes our heads spin. Luckily, AMD did away with suffixes for the 3800 series, so no more GT/GTS/GTX/Ultra (or Pro/GT/XL/XT/XTX/X2X) problems for those guys.

When it comes to the 3800 series, that is where the real battle lies. Nvidia has the lead right now, but will AMD surpass Graphzilla? Bear in mind just one thing: DX10.1 is just a checkbox feature right now - the real meat lies in performance in under-optimised and far-too-slow DirectX 10 titles.

Should you go and buy this product from Gainward today? At the end of the day, we would wholeheartedly recommend that you take the plunge and get this card if Nvidia is dear to your heart. The G80 marchitecture has got a new lease of life with this one, and seeing such high scores at default clocks only makes us wonder what will happen with the factory-overclocked Golden Sample boards.

The Good
This is a fixed G80, 'nuff said. It beats the 2900XT, the 8800GTS 320MB and 640MB, and even the 8800GTX in some cases. This card has enough horsepower for games, but do not expect FullHD performance from a single card in every game. A brilliant piece for Crysis, Hellgate and UT3.

The Bad
Drivers are green at this time - install the ones shipped on the CD, since the rushed Crysis-demo beta driver does not guarantee stability. The 256-bit bus could become a bottleneck in the next batch of games, and the VP processor could have been the newer VP3 rather than VP2.

And the Ugly
Single-slot cooling may be cool, but our board heated up significantly. The temperature was always in the high-60s (Celsius), and overclocking headroom was limited by the single-slot cooler.

Bartender's Verdict
Get-the-beers-in

 
