NVIDIA HAS FINALLY taken the covers off its next-gen Turing GeForce graphics cards, which come promising the power to deliver ray-tracing rendering without costing the price of a small moon.
Top of the crop of new cards is the GeForce RTX 2080 Ti which leaked earlier today. With 4,352 CUDA cores and 11GB of GDDR6 video memory with a speed of 14Gbps and bandwidth of 616 GB/s, the new graphics card has some serious pixel-pushing power. Top clockspeed for the RTX 2080 Ti hits 1,545MHz in its vanilla guise, but a "Founders Edition" will hit a nippy 1,635MHz.
Such power comes at a hefty price, with the Founders Edition RTX 2080 Ti set to fetch $1,199 (around £940) when it makes its debut on 20 September.
Following the flagship card is the GeForce RTX 2080, which comes with 2,944 CUDA cores and 8GB of GDDR6 with a 14Gbps memory speed and 448GB/s bandwidth. A Founders Edition of the card runs at a top speed of 1,800MHz, while the vanilla card hits 1,710MHz. The former version of the card weighs in at $799; a pretty steep price but not beyond the wallets of some PC gaming enthusiasts.
At $599, the Founders Edition of the GeForce RTX 2070 is a little more affordable, though Nvidia hasn't said when it'll release the card. The RTX 2070 comes with the same memory configuration as the RTX 2080, only sporting a few hundred fewer CUDA cores: 2,304 to be exact. Clockspeed for the Founders Edition sits at 1,710MHz while the standard version runs at 1,620MHz.
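For the curious, those bandwidth figures fall straight out of the memory speed and the width of the memory bus. A quick sanity check in Python (note: the 352-bit and 256-bit bus widths are assumed from the cards' 11GB and 8GB GDDR6 capacities, as Nvidia doesn't quote them directly here):

```python
def gddr6_bandwidth_gbs(memory_speed_gbps, bus_width_bits):
    # Bandwidth (GB/s) = per-pin data rate (Gb/s) x bus width (bits) / 8 bits per byte
    return memory_speed_gbps * bus_width_bits / 8

# RTX 2080 Ti: 14Gbps GDDR6 on an assumed 352-bit bus (11 x 32-bit channels)
print(gddr6_bandwidth_gbs(14, 352))  # 616.0 GB/s, matching Nvidia's quoted figure

# RTX 2080 / 2070: 14Gbps GDDR6 on an assumed 256-bit bus (8 x 32-bit channels)
print(gddr6_bandwidth_gbs(14, 256))  # 448.0 GB/s
```

Both results line up with the quoted 616GB/s and 448GB/s figures, which is reassuring.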
Asus has also revealed its take on the GeForce RTX cards with prices for Blighty, with the RTX 2080 starting at £889.30 and topping out with a cranked up RTX 2080 Ti costing £1,344.
So those are the cards, which usher in what looks to be quite a hike in performance over the 10-series GeForce cards.
But you may have noticed that Nvidia is using RTX rather than the GTX prefix. That new nomenclature is a nod to the ray-tracing capabilities of the new Turing architecture.
We already knew Turing was able to handle ray-tracing thanks to the latest Quadro professional-grade graphics cards Nvidia showed off last week. But those graphics cards cost thousands of pounds and are way out of the reach of the average PC enthusiast.
That led us to speculate how well the next-gen GeForce cards will be able to render ray-tracing. Well, if Nvidia's Gamescom 2018 presentation is to be believed, they can handle the slick rendering technique pretty well. And in real-time no less.
But before we go on, ray-tracing, for the uninitiated, is a rendering technique that traces the paths of the light rays that illuminate an entire scene; that includes the reflections of reflections in reflections. The end result is that computer-generated images and graphics look fabulously realistic and as shiny as the polished bonce of Chris Merriman.
Normally, ray-tracing takes a serious amount of graphical power to run; we're talking Nvidia DGX supercomputer level.
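To give a flavour of why that's so demanding, here's a minimal sketch (ours, not Nvidia's) of the core ray-sphere intersection test a ray tracer has to perform millions upon millions of times per frame, once per ray per object:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray fired straight down the z-axis at a unit sphere 5 units away
# hits its near surface 4 units out.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # 4.0
```

Every hit then spawns further rays for reflections and shadows, which is how the cost balloons so quickly, and why dedicated hardware for it is a big deal.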
But Nvidia's founder and boss Jensen Huang has touted the Turing architecture as the largest development in Nvidia's history, calling it the "greatest leap since we created CUDA", and as such it has the power and architecture to handle real-time ray-tracing in graphics cards the average Joe or Jill can buy.
Huang described the generational leap from the very capable Pascal architecture to Turing as a "shocking contrast", which certainly cranks the hype machine for the new GeForce graphics.
And they look to be ready to deliver the goods, as a series of upcoming games, such as Battlefield V and Shadow of the Tomb Raider, were shown off making use of real-time ray-tracing, which not only makes the games' graphics look a whole lot more realistic but also picks out a hell of a lot more detail than when no ray-tracing is applied.
We don't want to fall victim to the hype, and we'd need to see how these Turing cards perform in new desktop PCs. But if they do deliver the performance Nvidia is promising, and Huang doesn't tend to exaggerate too much, then we could be looking at a serious generational jump in the world of graphics.
And given how good many games look on base PlayStation 4 consoles - looking at you God of War - Turing GeForce cards could make future games look almost disturbingly real. µ