SAN JOSE, CA: NVIDIA HAS kicked off its annual GTC developer conference with the traditional keynote from its leather-clad CEO Jensen Huang.
The ninth GTC is predicted to see up to 9,000 delegates descend on San Jose, California for the latest innovations from the world's biggest GPU manufacturer.
It is also the first to break the 10 per cent barrier for women delegates as the company works towards diversity targets.
Although originally best known for its gaming prowess, the rise of the GPU as an alternative to standard CPU processing has seen the conference become more and more about Artificial Intelligence, autonomous vehicles and cinema.
The show kicked off with a skit based on the most recent Star Wars film, but rendered in real-time using ray tracing.
Despite being indistinguishable from the real thing, the newly announced RTX technology, developed alongside ILMxLAB and Unreal Engine, ran on a single terminal with four Volta GPUs instead of a supercomputer.
"People can actually use it," enthused Huang, explaining that ray tracing is now within reach of all.
A new Quadro GV100 was announced, finished in shiny gold, as the first Volta-powered workstation GPU.
A new NVLink technology enables two of the GPUs to work seamlessly as one, forming a single 10,000-core processor. The first workstations to include it, from HP, Dell and Lenovo, will be announced later today.
The theme this year was an improved software stack aimed at efficiency, and with the new bonding techniques for multiple GPUs the new catchphrase is, "The more you buy, the more you save".
The Volta V100 has been maxed out - the same device now comes with 32GB of HBM2 memory available both physically and via a virtualised rig in the cloud.
The big news was the "World's Largest GPU" - the Nvidia DGX-2, with 16 Tesla V100 32GB GPUs connected by NVSwitch, offering 81,920 CUDA cores and 2 petaflops of Tensor Core performance.
The 512GB of HBM2 memory offers 14.4TB/sec aggregate bandwidth - in real terms, a transfer rate of 1,440 HD movies per second.
It's not really set up for the bedroom though, weighing in at 350lbs (almost 160kg in new money) and requiring 10,000 watts to run its 2 petaflop brain. In short, it's 10 times faster than the six-month-old DGX-1. Screw you, Moore's Law.
The whole shebang is yours in the second half of 2018 at $399,000.
Notably missing was any mention of Bitcoin or cryptocurrency. It had been rumoured that a specialist GPU for mining was to be announced, but in a post-keynote statement, Huang said: "Nvidia is not involved in Bitcoin. At all."
A range of new libraries for the Nvidia AI inference engine was announced, starting with TensorRT 4, TensorFlow integration from Google, Kaldi optimisation and a new ONNX backend.
Kubernetes has been GPU-accelerated for the first time, bringing Docker containers into the fold. A demonstration of scaling out a container across a Kubernetes cluster was staggering to watch, with speeds that were, for want of a better term, insanely fast.
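For the curious, GPU scheduling in Kubernetes is exposed through Nvidia's device plugin as a `nvidia.com/gpu` resource; a minimal pod spec requesting one GPU looks something like this (the pod and container names here are illustrative placeholders):

```yaml
# Minimal sketch: a pod that asks the scheduler for one Nvidia GPU.
apiVersion: v1
kind: Pod
metadata:
  name: cuda-inference-demo
spec:
  containers:
  - name: inference
    image: nvidia/cuda:9.0-base   # any CUDA-enabled image will do
    resources:
      limits:
        nvidia.com/gpu: 1         # lands the pod on a node with a free GPU
```

Scaling out is then a matter of wrapping this in a Deployment and raising the replica count, with each replica claiming its own GPU.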
In talking about autonomous vehicles, Huang said that all vehicles will soon be autonomous, and with each vehicle creating petabytes of data every day, he believes that "data is the new source code".
The final big announcement - Orin - puts two Pegasus systems into a single chip, but there's no word on when to expect it, as Pegasus itself is still to roll out fully; it's an interesting insight into just how far ahead the tech companies have to think.
The Nvidia Drive Sim is a virtual reality AV simulator which uses the same chipsets as Nvidia-powered cars, but gives its 370 partner motor companies the chance to use those petabytes of data to create realistic simulations, bridging the gap between real-world testing and the blocky simulations of the past.
The new Isaac SDK brings these simulations to other areas such as robotic factories and healthcare, allowing developers to test their software without the robot arms, or whatever. The Clara system brings 3D mapping to the human body.
The demo video showed a tiny WALL-E esque robot running on a Jetson developer kit.
Finally, there was an update on the 'Holodeck' VR testing environment. Although not a new product - we heard about it last year - we got to see it being used for "teleportation" (his word, not ours): a driver in the room with us, wearing a headset and driving a virtual car, which controlled a real driverless car at a remote location.
In a keynote that was perhaps a little underwhelming for new announcements, it's worth remembering that's often because Nvidia is so far ahead of the curve. µ