
Creative cuts life support for 3DLabs

The Last Mohican of OpenGL
Tue Feb 28 2006, 09:30
IT REALLY seems that, in this industry, a good product is hexed to disappear, one way or another, while the opposite happens to the utter crap - the Alpha CPU and the Windows "quasi-OS" are good examples of each category. Who'd have thought the same would now happen to British-born 3DLabs, the "Queen of the OpenGL Empire"?

OK, let's go back a bit - what actually happened?

Realizm - the (unloved) champion
After its acquisition by Singapore-based Creative Technology, 3DLabs was expected to get a fresh infusion of resources to propel it to the forefront of the workstation 3-D graphics fight, ahead of ATI and Nvidia. After all, 3DLabs - itself a merger of Intergraph's Wildcat graphics group and the 'original' 3DLabs - did have the engineering strength, intellectual property and an impressive track record of achieving excellent OpenGL 3-D performance in both engineering and multimedia apps. It was a regular winner of SPEC's ViewPerf OpenGL benchmark runs.

Widely respected for its industry leadership in high-end desktop 3-D, second only to the golden age of Silicon Graphics, 3DLabs pioneered many OpenGL firsts, including the development of the OpenGL Shading Language, and even released an open source version of the OpenGL Shading Language compiler front-end to stimulate creativity in this niche market. It was also among the first to exploit a parallel-GPU approach to OpenGL on the PC, long ago, with its GLINT processors - it seems the British do have a particular fondness for parallelism, from Inmos Transputers in CPUs, to Quadrics QsNet in interconnects, to 3DLabs OpenGL chips in graphics... technically they were the leaders in all three. Oh, if only the business acumen and PR had been on the same level.
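
For those who never touched it, here is roughly what that shading language looks like from the application side - a minimal sketch using the standard OpenGL 2.0 entry points that grew out of the 3DLabs proposal, with the GLSL source embedded as a C string. Nothing in it is 3DLabs-specific.

    /* Minimal sketch: compiling a trivial GLSL fragment shader through the
       standard OpenGL 2.0 API. Illustrative only - generic OpenGL, not
       3DLabs-specific code. Assumes the GL 2.0 entry points are available. */
    #include <GL/gl.h>
    #include <stdio.h>

    static const char *frag_src =
        "varying vec3 normal;\n"
        "void main(void)\n"
        "{\n"
        "    float light = max(dot(normalize(normal), vec3(0.0, 0.0, 1.0)), 0.0);\n"
        "    gl_FragColor = vec4(vec3(light), 1.0);\n"
        "}\n";

    GLuint build_fragment_shader(void)
    {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        GLint ok = GL_FALSE;

        glShaderSource(shader, 1, &frag_src, NULL); /* hand the source to the driver */
        glCompileShader(shader);                    /* the compiler front-end 3DLabs open-sourced lives here */
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof log, NULL, log);
            fprintf(stderr, "GLSL compile failed:\n%s\n", log);
        }
        return shader;
    }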

The Realizm series of cards, announced some 18 months ago, was the pinnacle of the 3DLabs effort over the years. The Realizm VPU (Visual Processing Unit) graphics processors included quite a few features unique for the time, like hierarchical Z-buffer depth culling and the direct display of 16-bit floating-point values, in the first 3D graphics pipeline with consistent true floating point from the input vertices to the final displayed pixels - which, according to 3DLabs at the time, gave users new levels of image quality and realism, pun intended.
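
As a rough illustration of what "floating point end to end" means at the API level, the snippet below allocates a 16-bit floating-point texture through the ARB_texture_float and ARB_half_float_pixel extensions of the era - plain OpenGL again, not a Realizm-specific path.

    /* Allocating an fp16 (16-bit floating-point) texture via the ARB extensions
       of that era. Generic OpenGL, shown only to illustrate the idea - nothing
       here is specific to the Realizm hardware. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    GLuint make_fp16_texture(int width, int height)
    {
        GLuint tex;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        /* GL_RGBA16F_ARB requests four 16-bit float channels; GL_HALF_FLOAT_ARB
           describes the (here empty) source data. */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, width, height, 0,
                     GL_RGBA, GL_HALF_FLOAT_ARB, NULL);
        return tex;
    }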

Being the first to natively support 3840x2400 displays like the IBM T221 (the Nvidia Quadro of the time needed quite a few driver fixes for this), coupled with an isochronous command channel with fast context switching and automatic hardware scheduling, meant glitch-free effects with real-time video even at that display's IMAX-like, 4K-class resolution - again, a feature ahead of its time. Optional HD genlock and framelock were there too for professional video.

But the most important Realizm feature was something that makes today's "updated" SLI and CrossFire look positively puny - its smoothly scalable parallel-GPU approach, similar to the big graphics engines from SGI and Evans & Sutherland.

In short, besides the Visual Processing Units (VPUs) - which by themselves were full-fledged, highly pipelined GPUs with programmable vertex, fragment and pixel shaders, huge instruction counts (up to 256K instructions for fragment shaders, not counting any loops!), two dual-link DVIs and 512 MB of GDDR3 per VPU (plus 16 GB of virtual addressing from system RAM) - there was another type of chip. These were the Vertex/Scalability Units - VSUs - which tied multiple VPUs together in a scalable parallel configuration, even supporting broadcast of commands and data without having to, for instance, split a single PCI-E x16 bus into two x8 buses.
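
That instruction-count figure can be put in context on any card of that generation with the ARB_fragment_program limit queries - once more generic OpenGL, included only to show where such numbers come from.

    /* Querying a card's fragment-program instruction limits through the
       ARB_fragment_program extension - generic OpenGL, shown only to put the
       quoted 256K-instruction figure in context. */
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <stdio.h>

    void print_fragment_program_limits(void)
    {
        GLint max_instr = 0, max_native = 0;

        glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                          GL_MAX_PROGRAM_INSTRUCTIONS_ARB, &max_instr);
        glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                          GL_MAX_PROGRAM_NATIVE_INSTRUCTIONS_ARB, &max_native);

        printf("fragment program instruction limit: %d (native: %d)\n",
               (int)max_instr, (int)max_native);
    }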

VSUs were even more than that - each VSU had two vertex shaders which took over that portion of the job from the VPUs, improving polygon processing tremendously. With its own 128 MB of GDDR3 "DirectBurst" display list memory, the VSU could keep all of an application's polygon data on the card instead of in system memory - something still found wanting on the "incredible" Nvidia Quadro to this day. Besides more consistent performance through local, low-latency memory access, this also frees up system memory and PCI-E bandwidth for other uses. The VSU also used a "checkerboard" load-balancing approach across the two VPUs, now seen in CrossFire.
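
The "display list memory" in question is simply the storage behind the classic OpenGL display-list mechanism, sketched below in plain OpenGL 1.x - the point being that geometry recorded this way could stay resident in the VSU's local RAM rather than being re-sent over the bus every frame.

    /* Classic OpenGL display lists - the kind of static geometry the VSU's
       DirectBurst memory was meant to keep resident on the card. Plain
       OpenGL 1.x, nothing vendor-specific. */
    #include <GL/gl.h>

    GLuint build_triangle_list(void)
    {
        GLuint list = glGenLists(1);

        glNewList(list, GL_COMPILE);       /* record the geometry once... */
        glBegin(GL_TRIANGLES);
        glVertex3f(-1.0f, -1.0f, 0.0f);
        glVertex3f( 1.0f, -1.0f, 0.0f);
        glVertex3f( 0.0f,  1.0f, 0.0f);
        glEnd();
        glEndList();

        return list;                       /* ...replay it each frame with glCallList(list) */
    }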


One Realizm speciality was the Ratelock feature, where the application tells the hardware graphics subsystems the minimum frame-swap period it will tolerate. If one of the graphics subsystems detects that it cannot complete the rendering of a frame in the required time, it simply discards that frame and moves on to the next one. This lets the system as a whole maintain the required frame rate.
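
Ratelock itself lives in the hardware and its exact programming interface was never made public, but the behaviour maps onto a simple software analogue: give the renderer a frame budget and discard any frame that misses it. The helper functions below are hypothetical stand-ins, not a 3DLabs API.

    /* Rough software analogue of the Ratelock idea: a maximum frame-swap period
       is set up front, and a frame that misses its budget is discarded instead
       of being presented late. now_ms(), draw_scene() and present_frame() are
       hypothetical helpers, not part of any 3DLabs API. */
    extern double now_ms(void);        /* hypothetical monotonic clock, in milliseconds */
    extern void   draw_scene(void);    /* hypothetical: issue this frame's drawing */
    extern void   present_frame(void); /* hypothetical: swap buffers */

    void run_with_ratelock(double max_swap_period_ms)
    {
        for (;;) {
            double start = now_ms();

            draw_scene();

            /* If the frame blew its budget, drop it and move on to the next one,
               so the overall swap rate stays at the requested period. */
            if (now_ms() - start <= max_swap_period_ms) {
                present_frame();
            }
        }
    }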

Now, the card that did all that was the Realizm 800 - a full-length PCI-E x16 device covered almost completely by a black cooler - with one VSU backed by 128 MB of RAM, running two VPUs with 512 MB of GDDR3 each. When it finally shipped a year ago, it broke quite a few OpenGL application performance records. You can still find it in many brand-name vendors' highest-end workstation configurations.


This approach could easily have evolved into an elegant, single-card quad-GPU solution, well ahead of what SLI or CrossFire offer right now - no clunky cables needed, multi-screen capability not sacrificed, and a far more balanced overall architecture. Even with multiple cards, the interconnect would not have to carry final pixels for compositing, but could parallelise the work at the vertex level from the start.

Poor execution leads to - your execution!
So, why did it fail? Of course, one obvious reason to bring up is that, with its niche focus, 3DLabs, even with Creative's deep pockets, couldn't keep up with the quick, almost quarterly, 3-D GPU updates driven by the perennial ATI vs Nvidia high-volume battle - in a way, 3DLabs is collateral damage of the GPU giants' war.

On the other hand, even after the takeover, the business effort to push the product and platform was, to put it mildly, lacking - in a way it was worse than Digital's "stealth marketing" of the Alpha, and we all know what happened to both Alpha and Digital (at least the bosses got their money, as usually happens). Even the press in this field would rarely, if ever, receive a call from Creative-3DLabs offering updates or test platforms.

The chance was there, with someone as deep-pocketed and well-branded as Creative, for 3DLabs to hit back at ATI and Nvidia by going from the high-end "top" to the mainstream "bottom", bringing its excellent features into the volume market. Unfortunately, it didn't execute this well. The management - and I know the Singaporean mentality well - was probably impatient to make quicker money peddling Zens against Apple rather than looking at the longer-term win, and the rest is last week's history: 3DLabs workstation graphics really hit the bottom, and the business itself was executed instead.

It didn't help that Creative is way too close to Microsoft - the chief hangman of OpenGL in favour of its disastrous DirectX - in its fight against Apple. It's hard to guess what happened in the boardrooms, but I don't believe Micro$oft went down on its knees to Creative to preserve the flagship OpenGL workstation card brand, when its own Windows Vista is expected to dispense with direct OpenGL drivers altogether (and run them through a potentially horrible DirectX layer instead).

Now that the remaining few bones of 3DLabs will only handle graphics for mobile phones and PDAs - and I doubt it has any better chance there against entrenched Nvidia and the gang - could it be all over for high-end PC OpenGL 3-D graphics platforms?

After all, the old Evans & Sutherland platforms, with their superb anti-aliasing, have been gone for years, and SGI, the skeletal remains of the once-mighty Silicon Graphics - the creator of OpenGL - seems to be counting its final days. On desktop and workstation platforms, among x86 operating systems, only Linux and Solaris will have native OpenGL after Windows Vista rears its Medusa-like multi-head (eight different heads up to now, I guess?). Are the engineers, designers and multimedia producers going to be stuck with imprecise, proprietary and overall messy M$ Direct3D? I don't know, but I hope the industry will see this through and either force the "Chief Illuminati Reptilian" from Redmond to reinstate native OpenGL, or give greater backing to Linux.

I guess it will be Nvidia and ATI, the potential buyers for this 3DLabs technology, pushing for this - otherwise, what will happen to the sales of their expensive, OpenGL-specific Quadro and FireGL cards? µ

 
