
AMD's R600 has something of the flawed diamond about it

First INQpression: Did anyone say Airbus?
Mon May 14 2007, 04:17
WELCOME TO THE FIRST instalment of a five-part article review, to be followed by at least two technical articles.

In this first review we cover our experience and the card's performance under Windows XP only, with a Windows Vista INQpression (DirectX 10 tests included) to follow very soon. We will also compare Crossfire in both Windows XP and Vista against the GeForce 8800GTX and GTS in SLI.

If you do a price comparison between Nvidia's and ATI's top-end offerings, you end up with two versus one - two HD 2900XTs versus a single 8800 Ultra. Yes, you read that correctly. The ATI Radeon HD 2900XT is priced at $399, though European pricing will probably differ somewhat. Still, the fact remains that you can buy two cards for the price of one, and we have to give kudos to DAAMIT for bringing this product in at such a price.

What happened to the Perfect 10?
We have been following the development of this product for over a year now, and we can say that R600 is both a witness to and a victim of internal struggles. Over the past couple of months, AMD's takeover manoeuvring left AMD's own employees acting much as AOL's did during the Time Warner takeover. With some very recent rumours floating around of the final death of ATI as a brand, our argument from yesteryear could well become fact with the arrival of the next-gen chip, the R700.

AMD has a lot of things to solve, and this is just one of the reasons why the R600 is late. Other reasons include the development of different PCBs that may never see the light of day, fiddling around with different coolers and, most importantly, the troubled birth of one very complex and problematic chip.

ATI lost time to changing orders coming from AMD. AMD was not aware of the challenge involved in bringing out a product with more transistors than the upcoming quad-core and dual-core K10s combined - and most of the R600 chip is core logic, not cache. We also wonder what happened to Henri Richard's statement at the Q1 conference call (summary available here), since he promised a "hard" launch of 10 products in May and said that AMD does not do soft launches.

Out of those 10 promised products, only one DirectX 10 product is being hard-launched today, and that product is the Radeon HD 2900XT. The other available product is a DirectX 9 part, the Mobility Radeon HD 2300.

The Board(s)
[Image: amd_r600_airbus]
Is it a great coincidence, or did AMD and
Airbus share the same software in developing these products?

We got two boards from AMD. Both are engineering samples, and they differ a little from the retail cards; most notably, the thermals are a bit different on final boards. We were surprised to see that, after all the different concepts ATI wasted time on, R600 boards are very similar to Nvidia's. In fact, the turbine-style fans on all three cards are identical: 75mm turbines that take air from inside the case and blow it out through the rear bracket. We can only commend ATi and Nvidia for this, since tossing the hot air out helps to lower the case temperature.

[Image: amd_r600_02]
Size-wise, AMD is literally positioned between two Nvidia boards...

The board itself is only slightly longer than the 8800GTS: the 8800GTS PCB (Printed Circuit Board) is exactly 228mm long, the HD2900XT is 241mm, while the 8800GTX measures 267mm.

[Image: amd_r600_04]
The back of the board reveals a plastic stress reliever; this backplate helps the structural integrity of the board, since this cooler is much heavier than Nvidia's

When we removed the coolers, we saw no fewer than 16 32MB chips. It seems that AMD used the cheapest chips out there (Hynix HY5AS573225A FP-11 GDDR3, the very same chips used on a ton of 8800GTS 320MB cards), but they proved to pack some serious punch, reaching an effective clock of almost 2GHz. With all the initial boards being manufactured and sold to partners by AMD itself, we did not see any partner going rogue and releasing a card with 1GB of GDDR3 memory - a darn shame, if you ask us. You would only need to fit 16 64MB chips and you would be home safe with 1GB on the sticker, and somehow yours truly feels this would raise performance in 16xAA/16xAF modes.
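
For the number-crunchers, here is a quick back-of-the-envelope sketch of the capacity maths - our own illustration, not anything AMD has confirmed about a 1GB SKU, and the per-chip 32-bit width is our assumption:

# Rough memory maths for the HD 2900XT board layout (illustrative sketch only)
chips = 16  # GDDR3 packages counted under the cooler

print(f"16 x 32MB chips = {chips * 32} MB")               # 512MB, as shipped
print(f"16 x 64MB chips = {chips * 64} MB")               # the hypothetical 1GB configuration
print(f"16 chips x 32 bits each = {chips * 32}-bit bus")  # assuming 32-bit-wide GDDR3 packages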

[Image: amd_r600_01]
Whoever assembled the board put too little thermal paste on it and chipped one side with the cooler

The chip itself is positioned at a 45-degree angle and, sadly, the metal shim that protects the die sits a bit higher than the die itself. What this will mean for the enthusiast community, we're not sure. But at least you have direct access to the surface of the chip, with no need for the dangerous IHS removal procedure required on 8800 cards.

The biggest innovation on this card is the arrival of digital power regulation, i.e. waving goodbye to all those "caps" and "bombs" still present on Nvidia cards. The real virtue of digital PWM (Pulse-Width Modulation) is ultra-high efficiency and a cleaner board design, let alone increased lifetime. Of course, the biggest advantage of digital power regulation is resilience to higher temperatures, and trust us, this baby can pack some serious heat. ATi opted for digital three-phase variable regulation: the largest chip is the Pulse PA1314NL, followed by the PA1312NL and the smallest, the PA0511.

Above the digital PWM, a controversial 8-pin PEG connector has found its place, with a classical 6-pin PEG connector next to it. All things considered, this board could pull a combined total of 250 Watts. And here comes the difference between retail cards and our engineering samples: the retail ones will have no problems running in a 6+6-pin configuration, and even overclocking is possible. Our engineering samples were quite picky when it came to booting with 6+6 pins. One of our boards was a happy overclocker on 6+6 pins; the other did not want to boot at all, treating us to a light show of red LEDs on the front and back of the board while the system refused to start.
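
For context, here is a rough sketch of the nominal power budget each connector combination allows under the PCI Express spec (75W from the slot, 75W per 6-pin, 150W per 8-pin - spec figures, not AMD's numbers):

# Nominal PCI Express power budget per source, in Watts (spec values)
SLOT = 75        # PCIe x16 slot
SIX_PIN = 75     # 6-pin PEG connector
EIGHT_PIN = 150  # 8-pin PEG connector

print(f"6+6-pin budget: {SLOT + 2 * SIX_PIN} W")          # 225W - tight against a ~250W draw
print(f"6+8-pin budget: {SLOT + SIX_PIN + EIGHT_PIN} W")  # 300W - the headroom the 8-pin buys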

When it comes to the cooler itself, ATI packed a 75mm turbine into an acrylic casing, with red-coloured hard plastic at the front and black glossy plastic at the back. These two plastic components bear the weight of the dual-heatpipe copper cooler that spans the board. So, no special cooling such as the previously mentioned vapour chamber or water-cooling, although bear in mind that some partners will bring those special products to the retail market, depending on volume.

Having said this, the idea of a new connector is just plain awful. We know that the card eats 250W and, until 65 nanometre chips pop along, this will be known as the power hog of the 3D era. While we can see a lot of our colleagues criticising DAAMIT for doing this, we can only tell you that final judgement on the 8-pin connector can only be passed with the arrival of the R700 and G90. If next-gen Nvidia and ATI cards continue to use it, buying an 8-pin compliant PSU is a good idea. If this is a one-off, then it is a disastrous one.

Alternatively, we have learned that Corsair, OCZ and BeQuiet have either shipped rewired PSUs or are sending new 8-pin connectors to owners of existing power supplies. We know that AMD built up an ecosystem of PSU manufacturers for its Athlon and Opteron product lines, and it will run the same partnering programme for the Radeon line. Thus, you can expect at least 20-30 power supplies to be introduced to the market with the official announcement of the R600.

Problems? What problems?
Let's look at the facts. R600 is no less than nine months late. The chip was originally scheduled for launch on September 8th, as our sources implied at Computex. Whether they were misleading us or not, we'll leave to history to decide. That launch did not happen, and here we are on a Monday in mid-May with a product that had some compatibility issues with our testing systems.

Why are we saying this? Because of all the problems we had in the past two weeks. To us, it is unacceptable that the drivers are often faster with 4xAA and 16xAF than without. It is extremely weird to witness higher framerates at 1920x1200 and 2560x1600 than at 1280x1024. The drivers require some work, but even now you can enjoy an extremely good, soft 16xAA mode that carries a major performance penalty yet still ends up at an enjoyable, playable framerate. If you don't like the soft feel of the scene when ATI's FSAA is running at 16x (wide-tent), you can always opt for a mod that puts the scene in 24x or 12x mode, and those two bring the sharp AA.

The reason we are not bringing you multi-GPU scores now is that in a lot of tests the single-GPU configuration was faster than the CrossFire one - this happened both on the Asus i975X board and on the Asus 580X (formerly known as Radeon Xpress 3200). Thus, we are waiting for a CrossFire driver. In comparison, Nvidia is bringing DirectX 10 SLI support, and you can expect that our DX10 SLI vs. CrossFire shootout later this week will bring some fireworks.

Testing
In this first look at the product, we opted to review it on a platform independent of both vendors, pitting MSI's GeForce 8800GTS 640MB and EVGA's e-GeForce 8800GTX ACS3 against, of course, ATi's Radeon HD 2900XT 512MB.

AMD and some Nvidia partners might cry foul over our choice of 8800GTX card. For the past couple of months we have been using EVGA's ACS3 board, which comes with a 626MHz GPU clock, 128 scalar shaders clocked at 1.45GHz and 768MB of GDDR3 memory happily working at 1.0GHz in DDR mode (2GHz effective), yielding a bandwidth of 96 GB/s (default GTX: 86.4 GB/s). So, when you see GTX scores in our tests, bear in mind that you are looking at a product that beat the reference 8800Ultra in quite a lot of tests.
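
If you want to see where those bandwidth figures come from, here is a minimal sketch; the 384-bit bus width is the GTX's public spec rather than something stated above:

# Memory bandwidth = (bus width in bytes) x (effective data rate), expressed here in GB/s
BUS_BITS = 384  # GeForce 8800GTX memory interface width

def bandwidth_gbs(effective_mhz):
    return (BUS_BITS / 8) * effective_mhz / 1000

print(f"EVGA ACS3 @ 2000MHz effective: {bandwidth_gbs(2000):.1f} GB/s")      # 96.0 GB/s
print(f"Reference GTX @ 1800MHz effective: {bandwidth_gbs(1800):.1f} GB/s")  # 86.4 GB/s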

However, the Radeon HD 2900XT is currently the best thing AMD has come up with, so we wanted to compare it with the best current offering from Nvidia's camp. If any other AMD or Nvidia partner feels that this statement is invalid, they are more than welcome to send us a product they consider better than EVGA's.

We have used the following components for our test:
Intel Core 2 Extreme QX6800 2.93 GHz
Asetek VapoChill Micro air-cooler
ASUS P5W DH Deluxe i975X-based motherboard
2x 1GB GeIL PC2-8500/9600 MultiSpec Kit at 1.06 GHz, 4-4-4-12
ATI Radeon HD 2900XT @ 742/1658 MHz
MSI GeForce 8800GTS 640MB @ 513/1584 MHz
EVGA e-GeForce 8800GTX ACS3 768MB @ 626/2000 MHz
Seagate Barracuda 7200 ES 250GB
Sony BWU-100A
Enermax Galaxy 850W

The software side consisted of Windows XP Professional SP2, Microsoft Office 2007, .NET Framework 2.0, the DirectX 9.0c April 2007 update, FRAPS 2.8.2 and the latest motherboard drivers. For the graphics cards we used ATi Catalyst 8.37v4, and for Nvidia's boards the latest available official drivers, ForceWare 158.22.

The results of the test are as follows:

3DMark06
[Image: amd_r600_3dm_01]
Had we used a regular 8800GTX, ATi's R600 would have won here... sorry!

[Image: amd_r600_3dm_02]
The fillrate test, however, shows that ATI has 16 ROPs and 16 TMUs, while nV has 20/24 ROPs and 32 texture units

[Image: amd_r600_3dm_03]
No contest in the Pixel Shader test either - 128 scalar units at 1.45GHz win

[Image: amd_r600_3dm_04]
But in the Vertex Shader test, the situation changes dramatically

Company of Heroes - No AA/ No AF
[Image: amd_r600_company-of-heroes]
CoH shows that ATi has some work to do... but only against the 8800GTX - and the difference grows smaller as resolution increases

Company of Heroes - 4xAA/ 16xAF
[Image: amd_r600_company-of-heroes_]
Turning on the graphical goodies has an interesting effect on the 8800GTS and 8800GTX. ATi only loses out at 2560x1600

Company of Heroes - 16xAA/ 16xAF
[Image: amd_r600_company-of-heroes1]
At 16x all, ATI loses out

F.E.A.R. - No AA/ No AF
[Image: amd_r600_fear]
In F.E.A.R., it is obvious that something is going on with AMD on this setup...too many shadows?

F.E.A.R. - 4xAA/ 16xAF
[Image: amd_r600_fear_4aa]
Interesting results are getting more interesting...

F.E.A.R. - 16xAA/ 16xAF
[Image: amd_r600_fear_16]
No contest here. R600 is blown out of the water

S.T.A.L.K.E.R. - No AA/ No AF
[Image: amd_r600_stalker]
The new patch saved Nvidia's bacon in Stalker, but seeing such scores still makes us wonder what could have been... and what could happen in a Crossfire setup.

S.T.A.L.K.E.R. - 4xAA/ 16xAF
[Image: amd_r600_stalker-4aa]
What the heck happened at the highest resolution?

S.T.A.L.K.E.R. - 16xAA/ 16xAF
[Image: amd_r600_stalker-16]
R600 is one weird cookie...

Heating - A problem discovered
At the last minute we obtained an IR thermometer with laser guidance, and we will be showing you these readings in our future graphics card reviews. R600 vs. GTS vs. GTX thermal graphs (both single-GPU and SLI) will also be posted during the week.

Yes, the HD2900XT puts out a lot of heat. Does it heat up more than Nvidia's cards? Well, brace yourselves for impact, dear readers: the answer might be quite surprising. In our in-case testing, the 8800GTX had a 100degC spot on the card. It is the very same spot that heats up like crazy on every nV card since the 7800GTX (including the 7800GTX): on the back of the board, right behind the two 6-pin power connectors, we would see a 99-102degC spot. The 102degC figure was reached inside a closed case after an hour-long 3DMark06 loop. The MSI 8800GTS was similar, with one spot behind the power regulation recorded at 92degC. Thus, seeing an 88degC hotspot on the back of the ATI card was actually surprising - we expected 100 or so degrees Celsius.

Truth be told, Nvidia's boards run far cooler and consume less power in 2D mode than AMD's power hog.

The R600 is a power hog and heats up a lot, but during our testing in a single-GPU configuration the board did not exceed 90degC on any part of the backside. Truth be told, the 2D clock (513/1016 MHz) keeps the board at 66degC, while 3D pushes the card into the high 70s. Not as bad as we previously thought - I still remember my unpleasant feeling watching the temperature of a 6800Ultra climb to the high 80s and even low 90s. Please bear in mind that I own an air conditioner, and the temperature in my flat was kept at a comfy 24 degrees Celsius.

Overclocking
The two boards we had showed some differences in overclocking: one board had been tested to go up to 850/2080MHz, but in practice we could not reach more than 840/1910. The second board did not come from any cherry-picked batch, yet it worked with zero issues at 850/1966, and even 866/1966 was doable if we aided the card with an additional fan blowing straight into the front of the turbine, creating a forced-air effect (the airflow out of the rear bracket was noticeably higher). However, we will focus on the 850/1966 results, as we consider these achievable without additional aids.

This clock was stable even without forced cooling, so we settled on this boost of 108MHz for the GPU and 308MHz for the memory. The extra GPU clock raised the fill rate by a hefty 1.7 billion pixels per second, and memory bandwidth grew from 106.11 GB/s to a massive 125.82 GB/s. Yet again, the biggest performance boost came at the highest resolutions.
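
For those wondering where the 1.7 Gpixel/s and 125.82 GB/s come from, here is a quick sketch of the maths, assuming the R600's 16 ROPs and 512-bit memory interface:

# HD 2900XT back-of-the-envelope: pixel fill rate and memory bandwidth, stock vs overclocked
ROPS = 16        # render output units on R600
BUS_BITS = 512   # memory interface width

def fillrate_gpix(core_mhz):
    return ROPS * core_mhz / 1000                     # Gpixels/s

def bandwidth_gbs(mem_effective_mhz):
    return (BUS_BITS / 8) * mem_effective_mhz / 1000  # GB/s

print(f"Fill rate: {fillrate_gpix(742):.2f} -> {fillrate_gpix(850):.2f} Gpixel/s, "
      f"a gain of {fillrate_gpix(850) - fillrate_gpix(742):.2f}")                  # roughly +1.7
print(f"Bandwidth: {bandwidth_gbs(1658):.2f} -> {bandwidth_gbs(1966):.2f} GB/s")   # 106.11 -> 125.82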

Anyway, the largest performance boost comes if you are not using AA or AF. We have seen a 10-15% performance boost, but somehow we feel that both the 8800GTX and HD2900XT are being held back. We will do some testing on Intel's V8 system later in the week, so that we can tell Mac Pro owners what kind of performance they can expect from one of these two babies - once they ditch Mac OS X and play games on a decent gaming OS like Windows XP or the newcomer 32/64-bit Vista, of course. Without AA/AF, World of Warcraft jumped from 73.71fps to 97.76fps, with frames never dropping below 49.55fps - we are talking about performance at 2560x1600, of course. At 1920x1200, the average jumped from 84.94fps to 113.31fps, never dropping below 94fps, a minimum that is actually higher than the average at default clocks. For comparison, EVGA's 8800GTX scores 112.96fps at 2560x1600 and 118.83fps at 1920x1200. What is worrying for nV here is that ATi's lowest fps at 2560x1600 and 1920x1200 was higher in both cases than the 8800GTX's.

In Company of Heroes, even with overclocking, the HD2900XT will not surpass the 8800GTX at any resolution, while ATi has some obvious issues with F.E.A.R. - performance there is just abysmal.

Stalker is a game we don't need to comment on much any more - patch 1.0000000003 finally solved the problems Nvidia had, and the HD2900XT is no longer 50% faster than the 8800GTX (as it was with the 1.00000000000000001 patch). The result is now in favour of the 8800GTX, and the 8800GTS now also surpasses the HD2900XT - until the visual goodness is applied.

Now, you will have noticed that we're predominantly comparing the HD2900XT to the 8800GTX. The fact of the matter is that ATI delivers a smoother framerate (its lowest frames were higher than on the 8800GTX), and our gaming experience was more comfortable and more comparable between those two than on the 8800GTS, which would completely fall apart once 16xAA was applied, while ATI remained more playable in WoW and Stalker. We are not certain why the 8800GTS 640MB falls apart in Stalker, World of Warcraft and Company of Heroes as soon as AA/AF is applied. Nvidia told us that new drivers will bring tons of performance improvements, just as ATI did. The Catch-22 is that these drivers are for Vista, so we kindly ask for your patience until the follow-up article. We will be testing with ForceWare 158.42, and we are expecting the 8.38 drivers as well.

Anybody in their right mind has to ask: if ATI created this product for the world of XHD and Full HD resolutions, why in the world is there no option for a 1GB card? We wonder how performance will look in DX10 games, where on-board memory does not make such a difference - virtual memory addressing should sort out and boost FullHD/XHD performance not just for the HD2900XT, but for the 8800GTS 320MB as well.

End of Part One
We have to give this one to ATI over the GeForce 8800GTS. We were shocked to see that the 8800GTS starts trailing behind this less expensive card, and that the HD2900XT even leads the trio, as soon as the visual goodness is unlocked. Playing World of Warcraft with 16xCFAA and 16xAF at 2560x1600 is doable with a single card, and yours truly found himself lost in the world of Outland at 40 or so frames per second. In comparison, its price competitor cannot even come near. Bear in mind that our EVGA 8800GTX beat the 8800Ultra in a lot of benchmarks, so compared against a default 8800GTX, which has 9.6 GB/s less bandwidth and a 51MHz lower GPU clock (100MHz lower shader clock), you would see a far greater number of tests in which the HD2900XT threatens the 8800GTX.

But ATI has a lot of driver issues to solve, because we cannot accept that the card was often faster at higher resolutions than at lower ones. Some AA/AF scores were even higher than the no-AA/no-AF ones - just plain weird.

Bear in mind that all of these INQpressions and the personal experiences of two testers came from a Windows XP environment. We have learned to appreciate the maturity of nV's drivers for 64-bit Vista (yes, that looked like an oxymoron up to around a month ago), and we are looking forward to digging into the performance of these boards under DirectX 10, featuring, of course, some new DirectX 10 apps.

But still, this is a product that is months late; it should have been polished more than a new Ferrari in a Ferrari store (not the T-shirt one).

If you're wondering whether you should go for the HD2900XT or the 8800GTS 640MB, or even plunge for the 8800GTX, wait until the end of this week, when you will see performance results in many different configurations.

Bartender's Initial Verdict
Windows XP and DirectX 9.0 applications on an Intel-based setup

ATI RADEON HD2900XT 512MB
[Image: beer08]

EVGA e-GeForce 8800GTX ACS3
[Image: beer09]

MSI GeForce 8800GTS 640MB
[Image: beer07]

Reviewed and tested by Davor Baksa and Theo Valich

 
