Arm wants to help IoT and other embedded devices think for themselves. Today the company unveiled two chip designs intended to eliminate the reliance on cloud-based artificial intelligence (AI) by delivering machine learning (ML) capabilities right on the device.
"Enabling AI everywhere requires device makers and developers to deliver machine learning locally on billions and ultimately trillions of devices," said Dipti Vachani, SVP and general manager of Arm's automotive and IoT line of business, in a statement.
The Cortex-M55 processor is the company's first to leverage the Armv8.1-M architecture and features Arm's Helium vector processing technology, which is designed with ML and digital signal processing in mind.
Meanwhile, the company's Ethos-U55 — its first micro neural processing unit — can work alongside the Cortex-M55 to accelerate more demanding ML workloads. According to Arm, the Ethos-U55 was designed specifically with embedded and IoT devices in mind and uses "advanced compression" techniques to save power and reduce ML model sizes, enabling neural networks that were previously only possible on larger systems.
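Arm hasn't published the specifics of the Ethos-U55's compression scheme, but weight quantization is one widely used way to shrink a model for a microcontroller-class device. The sketch below is purely illustrative — the layer shape and values are made up — and shows how converting float32 weights to int8 cuts storage by 4x at the cost of a small rounding error:

```python
import numpy as np

# Hypothetical float32 weights for a small fully connected layer.
rng = np.random.default_rng(seed=0)
weights = rng.standard_normal((128, 64)).astype(np.float32)

# Simple symmetric linear quantization to int8, a common step when
# preparing a model for a memory-constrained embedded target.
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)

# Dequantize to estimate the accuracy cost of the smaller format.
restored = quantized.astype(np.float32) * scale
max_error = np.abs(weights - restored).max()

print(f"float32 size: {weights.nbytes} bytes")    # 32768
print(f"int8 size:    {quantized.nbytes} bytes")  # 8192, i.e. 4x smaller
print(f"max rounding error: {max_error:.4f}")
```

Real toolchains (such as TensorFlow Lite's converter) add per-channel scales, zero points, and calibration data on top of this basic idea, but the storage arithmetic is the same.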
Both chip designs are compatible with the company's Cortex-M software toolchain, which aims to ease the development of digital signal processing and ML workloads.
Edge vs. Cloud

While AI has traditionally taken place in the cloud, Arm is confident the next phase will see AI move to the edge of the network and into endpoints. During a keynote at Arm TechCon in October, Arm CEO Simon Segars said that while it's possible to offload AI workloads to the cloud, doing so isn't very efficient and can't scale. Instead, he said, the next generation of IoT devices will handle AI and ML workloads locally.
Arm claims that as IoT intersects with AI and 5G, on-device intelligence will lead to smaller, lower-cost devices that will enjoy better privacy and reliability thanks to a reduced reliance on cloud computing.
Early Applications

In a blog post, Vachani said there's no limit to the types of applications that could benefit from integrated machine learning capabilities, and not all of them are limited to traditional IoT and edge devices.
She explained that these applications range from autonomous cars to assistive devices such as smart walking canes for the blind. While several assistive canes are already in development, most rely on ultrasonic sensing, which is limited in range and resolution. Using Arm's newly announced ML chip designs, however, Vachani said it would be possible to replace the ultrasonic sensors with a lower-power, higher-resolution 360-degree camera.