
The Doom 3 myth exposed

Column To anticipate, perchance to have a nightmare
Mon May 19 2003, 19:47
THE UPCOMING release of Doom 3 is easily one of the most anticipated events in the entire history of the personal computer, potentially equaled in recent years only by AMD's forthcoming Athlon 64 launch and, when compared to the sum total of human history, by the Second Coming. For almost two years every video card that's been launched has been evaluated against the standard the game is expected to set, first all but invisibly, and then, as the appointed day drew nearer, with increasing fervor. John Carmack may have done more damage to NVIDIA's low-end budget video lineup than the entire tech recession of the past three years combined when he uttered the (paraphrased) sentence: "Do not buy a GeForce4 MX to play Doom 3." He might as well have said: "Do not buy, consider, or look sideways at a GeForce4 MX, period, for any reason."

For better or worse, Carmack is one of the rare individuals in the industry whose words are both heeded and respected. When Gates, Ellison, McNealy, or Jobs open their mouths people take notice, but the insults and detractions tend to fly just as quickly as the compliments and support. The closest comparison I can think of is Linus Torvalds, who, like Carmack, stays out of the political battles fought throughout the IT world and seems dedicated to building a better kernel to the exclusion of much else. Perhaps the reason people listen to Carmack is precisely because he doesn't seem politically vested in every sentence that comes out of his mouth.

This, however, is a bit off-topic, even for the INQ. The point is, Carmack said: "Don't buy" and people didn't. Now, with the launch of the GeForce FX 5900 Ultra last week, we've finally seen our most modern video cards run the game, and the results are a bit sobering. It looks like Doom 3 is going to be the game that finally ends the years-long supremacy of hardware over 3D engines: not even the highest-end cards available may be able to run it at 1600x1200x32 with all the eye candy turned on and AA/AF tossed in to boot. That right there, if you stop and think about it, is a reversal of what we're used to seeing. Making matters worse, the game prefers NVIDIA video cards over ATI cards so heavily at this point it's not even funny. ATI had best hope that the reason the GF FX series cleans the Radeon's clock so brutally is a driver bug, because if it isn't, NVIDIA is going to reap a massive benefit from gamers looking to upgrade for the game.

But even the mighty FX 5900 Ultra starts to slip once you crank up the detail and AA/AF settings, which implies that come launch day there may be a lot of very unhappy gamers yammering away in various forums as their once-mighty video cards choke and die on D3's new engine. Based on what we've seen, I'm not even sure a GeForce4 Ti 4600 will pack much of a punch when push comes to shove. In fact, you might not want to own any video card at all that isn't fully DX9 compatible. Is this bad? Not really, in the sense that people want to see something truly spectacular from Doom 3, but it does show the danger of future-casting. Reviewers have tilted their video card recommendations toward Doom 3 for years, it's been breathlessly discussed at every new product launch, and even Carmack himself said not to buy a GF4 MX to play it. Based on the available information, however, you might not have wanted to buy a Radeon 8500 either. Or a GeForce4 Ti card. Or anything below a Radeon 9800 Pro.

When Quake 3 launched, gamers could get away with old video cards (potentially even aging Voodoo2s) because Quake 3 was about speed and fragging, not looks. Doom 3 doesn't have that luxury, and neither do the players who want to experience it. If you're a gamer to whom atmosphere and ambiance are important words, you'd best start saving your pennies now. And if I were you, ATI, I'd be putting out a press release about how new drivers offer massively improved Doom 3 performance soon. Real soon. µ
