NVIDIA HAS REVEALED its first suite of Turing GPUs capable of powering ray-tracing, the next big step in realistic graphics.
For those of you who aren't savvy with the world of graphics, ray-tracing is a rendering technique that, surprise surprise, traces the path of light rays that illuminate objects or fill a scene in a video, image or virtual setting. The technique allows for cinema-level virtual illumination that delivers more accurate lighting, reflections and shadows, making the thing being rendered look a heck of a lot more realistic.
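At its simplest, tracing a single ray comes down to geometry: fire a ray from the camera and test which object it hits first. Here's a minimal, illustrative Python sketch of the classic ray-sphere intersection test that sits at the heart of the technique (the function name and scene values are made up for the example, not from any Nvidia SDK):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None."""
    # Vector from the ray's origin to the sphere's centre
    oc = tuple(o - c for o, c in zip(origin, center))
    # Substituting the ray equation into the sphere equation gives a quadratic in t
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only hits in front of the ray count

# A ray fired straight down the z-axis at a unit sphere 5 units away
t = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(t)  # 4.0 - the ray strikes the sphere's near surface
```

A real renderer runs billions of these tests per frame, bouncing each ray onwards for reflections and shadows, which is exactly the workload Turing's dedicated hardware is built to accelerate.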
The problem is, ray-tracing takes a hell of a lot of graphical grunt to do; far more than any single current consumer graphics card can really handle.
But on Monday, Nvidia's founder and CEO Jensen Huang finally announced the company's next-gen Turing architecture, and revealed the first cards making use of the new GPU architecture, which he claims will offer the grunt to power ray-tracing rendering.
But unfortunately, they won't be coming to consumer PCs just yet, unless you have very deep pockets, as Nvidia's first Turing cards are the professional-grade Quadro RTX 8000, Quadro RTX 6000 and Quadro RTX 5000 GPUs.
With Samsung 16Gb GDDR6 video memory on board, running up to 48GB in the RTX 8000 to push data around faster than before, Nvidia noted that Turing GPUs are some 26 times more capable of handling ray-tracing than the Pascal generation of GPUs. This is thanks to dedicated ray-tracing cores that figure out how light and sound travel through 3D environments.
Designed for data centres and workstations, the Turing Quadro GPUs also come with Tensor cores to accelerate deep learning neural networks for running artificial intelligence systems, something Nvidia is big into alongside pushing polished pixels.
With the RTX 5000 starting at $2,300 and the RTX 8000 hitting an estimated retail price of $10,000, don't go expecting to find them in any affordable workstations any time soon.
But these pro-grade cards do point towards a promising next wave of powerful GeForce graphics cards, which will likely use GDDR6 and perhaps support ray-tracing in some form; just don't expect them to be the very highest-end example of the slick rendering technique just yet. µ