
SLI beyond HD quality - at what cost?

First INQpressions: SLI and the Family Cross
Thu Dec 28 2006, 16:12
SINCE ITS appearance, Nvidia's SLI - and, similarly, ATI's CrossFire - has offered two key benefits: increasing performance at a given resolution, and keeping performance (read: frame rate) enjoyably constant as the resolution - or, alternatively, the anti-aliasing level - increases.

The most recent GeForce 8800GTX cards should, on their own, be fast enough for almost any current game. But what happens when we push the display resolution very high? Will SLI show more of a benefit, and make the (expensive) difference between choppy and smooth 3-D play (let's say 30 fps and above counts as true 'smoothness')?

After all, Dell's 3007 monitor, with 2560x1600 across 30 inches, now sells online for less than US$1,300. At a similar price in Singapore, even I finally got hold of one to add to the local test lab, so it was time to see how far SLI goes at this level. And, while Dell's warranty doesn't cover you for fewer than five dead pixels, my unit arrived without any.

The configuration was simple, fresh from yesterday's chipset and CPU shoot-out: an Intel Core 2 Duo E6700 running at 3.5 GHz with a 1400 MHz FSB at 1.31 volts, GEIL dual-channel DDR2-700 memory with 3-3-3-5 latencies at 1.875 volts, all on an ECS Nforce 680i board with an 8800GTX GPU. For the SLI configuration I added a second, Sparkle 8800GTX (both GPUs are of course identical, apart from the sticker colour) and kept the same 500W MGE PSU - it sustained SLI with a quad-core CPU before, and should handle the cooler dual-core setup too. As in the previous review, the test was done in a non-air-conditioned lab at 32 C room temperature without any room fans, clearly exposing any heat concentration in the system.

3DMark06 taxes the frame rate of even the fastest GPUs, and, for the first time, I tested it at three new resolution settings: 1920x1200 (WUXGA, a notch above 1080-line HDTV), 2560x1600, and 2560x1600 with 4x anti-aliasing plus anisotropic texture filtering. All test runs, done without switching off or resetting the system, also included the CPU benchmark, to ensure that the processor and memory can withstand this tough all-round burn-in in the heat - if they can, they can most probably run anything else well, too.

Here are the results. At the UXGA level I've added the quad-core Kentsfield results as an indication of the CPU's influence on the score; everything was run on the same board and the same WinXP SP2 installation:

GeForce 8800GTX                        Single GPU   Dual GPU   Dual vs single (%)

3DMark06 overall score
  UXGA, Kentsfield                        10579       15018        142
  UXGA, Conroe                             9894       14752        149
  WUXGA 1920x1200, Conroe                  9256       14196        153
  WQXGA 2560x1600, Conroe                  7125       11599        163
  WQXGA 2560x1600 4xAA/AF, Conroe          4567        7851        172

3DMark06 details - GT1 (fps)
  UXGA, Kentsfield                          34.4        50.1        146
  UXGA, Conroe                              34.9        56.9        163
  WUXGA 1920x1200, Conroe                   31.9        55.5        174
  WQXGA 2560x1600, Conroe                   23.4        42.7        182
  WQXGA 2560x1600 4xAA/AF, Conroe           13.6        25.9        190

3DMark06 details - GT2 (fps)
  UXGA, Kentsfield                          35.1        50.6        144
  UXGA, Conroe                              35.8        57.9        162
  WUXGA 1920x1200, Conroe                   33.6        56.3        168
  WQXGA 2560x1600, Conroe                   26.3        46.9        178
  WQXGA 2560x1600 4xAA/AF, Conroe           16.9        30.6        181

3DMark06 details - HDR1 (fps)
  UXGA, Kentsfield                          39.3        69.5        177
  UXGA, Conroe                              39.4        69.6        177
  WUXGA 1920x1200, Conroe                   36.0        64.1        178
  WQXGA 2560x1600, Conroe                   25.9        47.1        182
  WQXGA 2560x1600 4xAA/AF, Conroe           17.5        31.9        183

3DMark06 details - HDR2 (fps)
  UXGA, Kentsfield                          43.1        63.4        147
  UXGA, Conroe                              43.6        74.2        170
  WUXGA 1920x1200, Conroe                   39.1        69.7        178
  WQXGA 2560x1600, Conroe                   26.9        49.6        184
  WQXGA 2560x1600 4xAA/AF, Conroe           14.1        26.3        187
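For those who want to sanity-check the "Dual vs single (%)" column, here is a minimal sketch of the arithmetic behind it - my own illustration, not part of the benchmark suite: the dual-GPU score expressed as a percentage of the single-GPU score, shown here for the overall 3DMark06 results.

RESULTS = {
    # label: (single-GPU score, dual-GPU score) taken from the table above
    "UXGA Conroe":                    (9894, 14752),
    "WUXGA 1920x1200 Conroe":         (9256, 14196),
    "WQXGA 2560x1600 Conroe":         (7125, 11599),
    "WQXGA 2560x1600 4xAA/AF Conroe": (4567,  7851),
}

for label, (single, dual) in RESULTS.items():
    scaling = dual / single * 100          # dual-GPU score as a percentage of single
    print(f"{label:34s} {scaling:5.0f}%")  # e.g. "UXGA Conroe" -> ~149%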

As you can see, performance doesn't drop that dramatically with resolution, but the SLI premium grows as the resolution rises: the second GPU makes a noticeable difference in frame rate (i.e. smoother visuals), especially at 2560x1600. In other words, in SLI mode the frame rate falls off less as the resolution climbs. Also notice that the Deep Freeze (HDR2) test runs somewhat slower with Kentsfield than with Conroe - could it be some FSB-related bottleneck?

Looking at the 3-D tests on a 30-incher at its 2560x1600, four-megapixel resolution gives a 'cinema feel' very different from a regular 20-inch UXGA LCD (still high end for most), and it also makes anti-aliasing look almost redundant. Such a display is normally viewed from about a yard away in regular desk use (compared with roughly two feet for a 19- or 20-incher), and at that distance any jagged edges on this large screen are very hard to notice. With 4x AA enabled I did see somewhat better overall quality and slightly smoother lines; however, the difference was minimal compared with the benefit the same setting brings at, say, 1280x1024 SXGA. Once we move to the next level, 3840x2400, I believe anti-aliasing will become a rarely, if ever, used option.
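To put a rough number on the "jaggies are hard to see from a yard" argument, here is a back-of-the-envelope sketch. The pixel pitch works out to roughly 0.25 mm for both a 30-inch 2560x1600 panel and a 20-inch UXGA one; the specific viewing distances and the one-arc-minute figure for normal visual acuity are my assumptions, not measurements from this test.

import math

def arcmin_per_pixel(diag_in, res_x, res_y, view_dist_mm):
    """Angular size of one pixel (in arc-minutes) at a given viewing distance."""
    aspect = res_x / res_y
    width_in = diag_in * aspect / math.hypot(aspect, 1.0)   # panel width from diagonal
    pitch_mm = width_in * 25.4 / res_x                      # pixel pitch in mm
    return math.degrees(math.atan(pitch_mm / view_dist_mm)) * 60

# 30" 2560x1600 viewed from roughly a yard (~900 mm)
print(round(arcmin_per_pixel(30, 2560, 1600, 900), 2))   # ~0.96 arcmin per pixel
# 20" UXGA 1600x1200 viewed from roughly two feet (~610 mm)
print(round(arcmin_per_pixel(20, 1600, 1200, 610), 2))   # ~1.43 arcmin per pixel

At about one arc-minute per pixel, the 30-incher sits right around the commonly quoted limit of 20/20 vision, which is why the benefit of 4x AA looks so marginal here.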

On the dual-core system, heat concentration was lowest on the CPU and memory, higher on the north bridge, and by far highest on the GPUs, whether in single or SLI mode. In both cases the GPU temperature hovered around 75-76 C according to nTune, compared with 40-42 C for the CPU and 55-58 C for the Nforce chipset. On the power front, I recorded 387 W peak consumption with SLI versus 266 W with a single GPU - still around 50 W less than the similar configuration with the quad-core setup tested previously.
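A quick bit of arithmetic on those readings, purely as an illustration: the 500 W figure below is the PSU's label rating, and the wall-socket readings include PSU conversion losses, so the actual DC load on the supply is lower still.

# Peak wall-socket readings reported above, in watts
single_gpu_peak_w = 266
sli_peak_w        = 387
psu_rating_w      = 500   # nameplate rating of the MGE PSU, not a measured figure

second_gpu_delta = sli_peak_w - single_gpu_peak_w   # ~121 W extra for the SLI twin under load
psu_headroom     = psu_rating_w - sli_peak_w        # ~113 W nominal headroom at peak, on paper

print(second_gpu_delta, psu_headroom)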

Keep in mind that the two GeForce 8800GTX cards in SLI together cost about as much as the Dell 3007 monitor itself. To my mind, for anything up to 1920x1200 it is hard to justify spending on such a dual-card configuration - but if you're running the beast at 2560x1600, by all means go for SLI: you'll see good, tangible frame rate benefits in most cases. For the sake of heat, though, I'd recommend better cooling for that 8800GTX SLI: either the Peltier junction on the Sparkle Calibre or the BFG water-cooled version - both seem to offer similar cooling benefits, but the BFG card is slimmer and somewhat more expensive.

Also, most of this year's mid-range and high-end cards in both the Nvidia and ATI camps have at least one dual-link DVI port, enabling all of them to drive this resolution directly - which is fine for multimedia or desktop publishing. For gaming, however, even the 8800GTX benefits from an SLI twin to achieve smooth 3-D frame rates at 2560x1600 - so how will the lower-end cards fare in fast-paced games at that resolution?

And, oh yes, as I write, the rumour mill says the Dell 3007's price is expected to come down to near the US$1,000 level next month... umm, 2560x1600 in every home? Yes, if the graphics card can feed it. The Taiwanese monitor brands will not sit by and let Dell take the cream, which should push high-end monitor prices down further over the next six months, plus bring 3840x2400 back into the game at better value. Good for all of us. µ

 
