SAN JOSE: DAY TWO of the Nvidia GPU Technology Conference (GTC) saw the keynote stage handed over to Rob High, IBM Watson's chief technology officer.
High, with his flowing professor locks and Velcro shoes, was an ideal candidate to take the slot, given Nvidia's emphasis on artificial intelligence in the opening keynote yesterday, as well as the concurrent running of the OpenPower Summit.
High used the slot to distil the theory of cognitive computing in ways that show exactly why Watson has become a watchword for the topic.
The supercomputer, which won $1m on the US game show Jeopardy in 2011, has grown considerably in sophistication since then. High explained that it has now come into its own: rather than simply recalling 'factoids', it actually understands context and nuance.
The difference between cognitive computing and artificial intelligence, he explained, is that the former seeks not to recreate, but to augment, the human mind.
"I don't want my computer to do my thinking for me, I want it to do my research for me," he said, citing the medical profession where doctors are held back by the need to keep up with the latest theories, but often have only a few hours a month to digest weeks' worth of reading.
Handing the research over to Watson, and combining it with the cognitive understanding of diagnosing symptoms, puts the knowledge at doctors' fingertips without them having to do all the reading. Or as High put it: "Bringing the breadth of all human knowledge to the tip of our tongue."
This leads on to the fact that the way we interact with computers needs to change. High believes that cognitive computers need four skills: to learn, to express themselves with human-style interaction, to provide expertise, and to continue to evolve - all at scale.
People who claim not to be tech savvy, he explained, tend to be intimidated by the way we currently interact with computers, pushing the need for a further 'humanising' of the process.
This has led to initiatives such as Cognitoy, a learning robot that uses Watson to identify whether an infant at play has any special learning or development needs before they manifest to parents and teachers.
At this point, we started to see where science fiction is becoming science fact: an Aldebaran robot with Watson aboard was seen conversing with an unseen human. The conversation somehow turned to the subject of Taylor Swift, leading to a spirited rendition of Shake It Off complete with robotic dad-dancing.
"I thought you said you were a good dancer?" said the unseen human.

"Oh, yeah? Watch this!" replied the robot, sensing the mocking tone and bursting into a full-on rendition of Gangnam Style.
These social robots are, according to High, part of the pivotal change as we move towards true cognitive computing. They're fun, but they also demonstrate subtle nuances in the way humans interact with them, such as tone, intonation and physical gesture. For the first time, computers can learn to speak the 90 percent of human language that is unwritten, right down to the way our pupils dilate.
Getting the plug out of the way, we're told that IBM's OpenPower platform has increased inference throughput by a factor of 40, while GPUs have sped up the training of a robot by 8.5 times.
If all this sounds familiar, perhaps you watched 2001: A Space Odyssey in your hotel room yesterday too, because the comparisons were downright eerie.
But what separates the world of Watson from the world of HAL 9000 is that Watson doesn't seek to be human. It seeks to complement humans by being the best artificial servant it can be. The end game isn't singularity, but parity. µ