IBM HAS REVEALED an update to its brain-inspired Systems of Neuromorphic Adaptive Plastic Scalable Electronics (Synapse) project in the form of a production-ready chip, which it said - in a 'crazy scientist' voice, we are sure - could transform the world.
Unveiled in 2011, the technology - which mimics the structure and power efficiency of humanity's most complex organ - aims to solve the problems that traditional computing models face when handling vast amounts of high-speed data.
IBM has scaled up the preceding single-core prototype to one million programmable neurons, making it the first neurosynaptic chip to achieve 256 million programmable synapses and 46 billion synaptic operations per second per watt.
"A neurosynaptic supercomputer the size of a postage stamp that runs on the energy equivalent of a hearing-aid battery, this technology could transform science, technology, business, government, and society by enabling vision, audition, and multi-sensory applications," IBM said.
The Synapse update is a fully functioning production-scale chip with 5.4 billion transistors. IBM claimed it is one of the largest CMOS chips ever built, yet it runs at biological real time while consuming just 70mW - less power than a modern microprocessor.
"There is a huge disparity between the human brain's cognitive capability and ultra-low power consumption when compared to today's computers. To bridge the divide, IBM scientists created something that didn't previously exist: an entirely new neuroscience-inspired scalable and efficient computer architecture," IBM explained.
IBM's second generation chip is the culmination of almost a decade of research and development, including the initial single-core hardware prototype in 2011 and software ecosystem with a new programming language and chip simulator in 2013.
Around a year ago IBM unveiled another update in its plans to generate a computer system that copies the human brain, calculating tasks that are relatively easy for humans but difficult for computers.
As part of the Synapse project, IBM said that its researchers were working with Cornell University and Inilabs to create the programming language with $53m in funding from the Defense Advanced Research Projects Agency (DARPA).
The new programming language, perhaps not in layman's terms, "breaks the mould of sequential operation underlying today's von Neumann architectures and computers" and instead "is tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures".
That, in English, basically means that it could be used to create next generation intelligent sensor networks that are capable of perception, action and cognition, the sorts of mental processes that humans take for granted and perform with ease.
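To give a rough idea of the computing model involved - this is an illustrative toy sketch, not IBM's actual programming language or chip behaviour - neurosynaptic hardware computes with spiking neurons that accumulate inputs over time and fire when a threshold is crossed, rather than executing a sequential stream of instructions:

```python
# Toy leaky integrate-and-fire neuron: a simplified stand-in for the
# spiking model that neurosynaptic chips compute with. Parameter names
# and values here are illustrative assumptions, not IBM's design.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential needed to fire a spike
        self.leak = leak            # decay factor applied each tick
        self.potential = 0.0        # membrane potential state

    def step(self, weighted_input):
        """Advance one time step; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

# Feed a constant weighted input for five ticks and record spikes.
neuron = LIFNeuron()
spikes = [neuron.step(0.4) for _ in range(5)]
print(spikes)  # the neuron fires only when accumulated input crosses 1.0
```

Networks of many such neurons, wired by programmable synapses and updated in parallel, are what "distributed, highly interconnected, asynchronous" computing refers to here.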
IBM suggested that potential uses for this technology could include a pair of glasses that assists the visually impaired in navigating potentially hazardous environments. Taking in vast amounts of visual and sound data, the augmented reality glasses would highlight obstacles such as kerbs and cars, and steer the user clear of danger.
Other uses could include intelligent microphones that keep track of who is speaking to create an accurate transcript of any conversation.
In the long term, IBM hopes to build a cognitive computer scaled to 100 trillion synapses. This would fit inside a space with a volume of no more than two litres while consuming less than one kilowatt of power. µ