ARM IS ARMING a brace of processor designs with artificial intelligence (AI) tech, jumping on-board the machine learning train.
The two processor architectures will be brand-new rather than based on previous ARM designs. The first will take aim at supporting machine learning (ML) workloads, while the other has been designed for object detection (OD).
The former, dubbed ARM ML for the moment, will provide machine learning capabilities on a mobile processor with the goal of delivering more than 4.6 trillion operations per second, which in English means the ability to power AI systems and software on a device locally.
The second, the ARM OD processor, has been specifically designed to identify people and objects in real-time and at Full HD running at 60 frames per second.
ARM said the processor will deliver 80x the performance of traditional digital signal processors and beat the performance of its other chips in terms of object detection quality.
Of course, ARM won't make these processors as it'll license out the designs to the likes of Qualcomm, Apple and Samsung.
With the new architectures, ARM is no doubt hoping that image recognition features that use powerful sensors and smart tech, like the iPhone X's Face ID functionality, will filter down to devices that aren't so punishing on the wallet.
ARM also hopes the chips will support the use of AI on devices rather than have them reliant on the cloud and internet connections, as the increased use of smart virtual assistants and other AI services could hoover up bandwidth until there's no more to be had. Forget Kim Kardashian's naked posterior, it's AI that could truly 'break the internet'.
Both the chips fall under ARM's Project Trillium, which comprises not only the chips but also ARM's NN software, which is designed to support the running of neural networks on ARM chips and help AI break out from data centres and server stacks and slide into all manner of devices.
"The rapid acceleration of artificial intelligence into edge devices is placing increased requirements for innovation to address compute while maintaining a power efficient footprint," said Rene Haas, president of ARM's IP Products Group.
"New devices will require the high-performance ML and AI capabilities these new processors deliver. Combined with the high degree of flexibility and scalability that our platform provides, our partners can push the boundaries of what will be possible across a broad range of devices."
With that in mind, we can expect to see ARM ML processors find their way into SoCs from the likes of Qualcomm that already make use of ARM Cortex processors, adding a dose of dedicated machine learning processing that can improve the smarts of phones, tablets and other smart devices.
The ARM OD processor will likely follow the same path as the ARM ML but could also find its way into smart cameras, drones and driverless car systems, where being able to detect and process objects in real-time is needed for said devices to be effective.
ARM is by no means alone in stuffing AI smarts onto slices of silicon, as Amazon is reportedly designing its own AI chip to boost Alexa's brains. But ARM-based chips can be found all over the place, so there's a real potential for the UK company to make smart tech more ubiquitous and AI more readily accessible than ever before. µ