"SCIENCE AND TECHNOLOGY are still in their infancy, no matter how many times somebody comes along and says it's the end of science," says Stan Williams. "There is far more out there than we have yet found."
Williams was talking to The INQUIRER in May 2008 after several weeks of sudden media attention and more than 10 years of effort since founding the lab he directs at HP, the Information and Quantum Systems Lab. The reason for the attention: Williams and his team had found the missing fourth element of circuit design, the memristor, first predicted in a 1971 paper by Leon Chua.
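Chua's argument was one of symmetry: resistors, capacitors and inductors each relate a pair of the four basic circuit variables, and one pairing, flux and charge, had no element. In rough outline (notation here is a sketch of the standard textbook form, not drawn from Chua's paper directly):

```latex
% The four circuit variables: voltage v, current i, charge q, flux \varphi.
% Resistor: dv = R\,di;  capacitor: dq = C\,dv;  inductor: d\varphi = L\,di.
% The missing fourth element relates flux and charge:
d\varphi = M(q)\,dq
% Since v = d\varphi/dt and i = dq/dt, this behaves as a resistance
% whose value depends on the history of the charge that has flowed:
v(t) = M(q(t))\,i(t)
```

Because $M$ depends on the accumulated charge $q$, the element "remembers" the current that has passed through it even when the power is removed, hence memristor, a contraction of memory resistor.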
Fast forward six years, and the memristor has been unveiled as the lynchpin of HP's plans for The Machine, a new computing platform the firm claims will revolutionise the technology world.
The story started, said Williams, with a year or more of thinking when they founded the lab: what would computing look like in 2010? Back then, they thought that transistors would keep shrinking, to the point where the size of individual atoms would make a difference.
"That got our attention, and we started thinking very carefully about what that means. What would be the impact of electronic devices so small that one atom more or less could make a difference in the properties of the device? That pushed us out of the box in terms of being open to very different things and thinking about very different issues."
As they were investigating molecular electronics, they started seeing hints of an unexpected effect in their experiments. It was, said Williams, a staffer named Greg Snider who rediscovered, read, and understood Chua's paper. Once Williams had understood it – "Leon Chua is a very modest man, but it's quite heavy mathematically and a challenge to get through" – he made the connection between Chua's work and what they were seeing in the lab. From there, it took them about a year to understand the physics.
"Once we got it, we saw that in fact so much of what we were seeing and so much of what other people had reported in the literature for years and years was actually memristance, but without the physics model they didn’t understand it. The main thing we did is we figured out where it's coming from, why it's important, and why it's becoming more important." The effect, he said, gets stronger as the devices get smaller. "Memristance is not a quantum effect, but it's another effect that becomes more important as things get smaller."
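The scaling Williams describes falls out of the model HP published: in the linear ion-drift picture, the device's internal state changes at a rate proportional to one over the square of its thickness, so thinner devices show a much stronger memory effect under the same drive. The sketch below is a minimal simulation of that model; all the numbers (resistances, mobility, device sizes) are illustrative placeholders, not HP's measured values.

```python
# Sketch of the linear ion-drift memristor model, illustrating why
# memristance grows as the device shrinks: the state equation carries
# a 1/D^2 factor, so halving the thickness D makes the internal state
# move four times as fast for the same drive current.
import math

def simulate(D, steps=1000, dt=1e-3):
    """Drive a memristor with one sinusoidal current cycle.

    D -- device thickness in metres (illustrative value).
    Returns the total excursion of the internal state variable x,
    the normalised width of the doped region (0 <= x <= 1).
    """
    R_on = 100.0       # low-resistance state (ohms), illustrative
    mu_v = 1e-14       # ion mobility (m^2 s^-1 V^-1), illustrative
    x = 0.5            # start with the doped region at half width
    x_min = x_max = x
    for n in range(steps):
        i = 1e-4 * math.sin(2 * math.pi * n / steps)  # drive current (A)
        dx = mu_v * R_on / D**2 * i * dt              # linear-drift update
        x = min(1.0, max(0.0, x + dx))                # state stays in [0, 1]
        x_min, x_max = min(x_min, x), max(x_max, x)
    return x_max - x_min

swing_thick = simulate(D=20e-9)  # 20 nm device
swing_thin = simulate(D=10e-9)   # 10 nm device, same drive
print(swing_thin / swing_thick)  # roughly 4: the effect scales as 1/D^2
```

The same current that barely moves the state of the 20 nm device swings the 10 nm device four times as far, which is the sense in which the effect "becomes more important as things get smaller".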
At the 2008 ETech conference, Williams talked about computer science as a series of roads not taken. The path we followed for the last 50 years, he said, derives from Claude Shannon's observation that switches wired in series and in parallel could implement Boolean logic. Go back further, to Whitehead and Russell's 1910 Principia Mathematica, and you find other forms of logic that could be implemented.
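Shannon's observation is simple enough to state in a few lines: current flows through two switches in series only if both are closed (AND), through two in parallel if either is closed (OR), and a normally-closed relay inverts its input (NOT). A toy sketch, with illustrative names of our own choosing:

```python
# Shannon's mapping from relay circuits to Boolean logic:
# series wiring computes AND, parallel wiring computes OR,
# and a normally-closed contact computes NOT.

def series(a, b):
    # Current flows only if both switches are closed.
    return a and b

def parallel(a, b):
    # Current flows if either switch is closed.
    return a or b

def normally_closed(a):
    # A contact that opens when its coil is energised.
    return not a

# Every Boolean function follows by composition, e.g. XOR:
def xor(a, b):
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

print(xor(True, False))  # True
print(xor(True, True))   # False
```

Since these three primitives are universal, any Boolean function can be built from relays alone, which is the foundation the last half-century of digital design rests on.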
"Memristor essentially enables some of those other tracks," says Williams. "And to me it's the example that there's plenty more room at the bottom."
Besides implementing other forms of logic, Williams believed at the time of the discovery that the key characteristics of memristors – that they retain their memory even when powered off, like a hard drive or ROM, but can be rewritten dynamically, like RAM – would enable far more energy-efficient designs and continue the functional progression of Moore's Law. He imagined a future of hybrid circuits, but also thought that memristors would function not just as digital switches but as electronic synapses far more like their biological counterparts than those built with traditional semiconductors – and far smaller and less power-hungry.
A "thinking brain", he said, is "very, very far out". But the analogue computers to be built with these devices would actually learn from their environment and be more competent at human-style pattern recognition.
"The age of computing has not yet begun," he said in 2008. "What we have now makes the computers that existed 50 years ago look like toys – and not very good ones. My view is that what we'll have in 50 years will make what we have now look very quaint and toylike." But, he added, "Even after 50 years we won't have anything that looks remotely like a human brain." µ