AI implementation has largely been the province of the cloud, but there’s a big shift on the horizon as Artificial Intelligence (AI) moves to the edge, whether on-device, on a gateway, or on an on-premise server. That’s the finding of a recent report titled “Artificial Intelligence and Machine Learning” by market research firm ABI Research.
According to the report, the shift will happen first for inference and later for training. This will create an opportunity for chipset vendors whose power-efficient chipsets and other products can meet the demand for edge AI computing. Edge AI inference will grow from just 6 percent of the market in 2017 to 43 percent in 2023, says ABI Research in the report.
“The shift to the edge for AI processing will be driven by cheaper edge hardware, mission-critical applications, a lack of reliable and cost-effective connectivity options, and a desire to avoid expensive cloud implementation,” says Jack Vernon, Industry Analyst at ABI Research, in the report. “Consumer electronics, automotive, and machine vision vendors will play an initial critical role in driving the market for edge AI hardware.”
Markets ripe for the adoption of AI include automotive, mobile devices, wearables, smart home, robotics, small unmanned aerial vehicles, smart manufacturing, smart retail, smart video, smart building, and oil and gas sectors. By 2023 the market will witness 1.2 billion shipments of devices capable of on-device AI inference – up from 79 million in 2017, according to the report.
The cloud will still be a key market for AI device shipments, but its share of the overall market will decline. Out of the 3 billion AI device shipments that will take place in 2023, over 2.2 billion will rely on cloud service providers for AI training. Hardware providers should not be too concerned about this shift away from the cloud, as AI training is likely to be supported by the same hardware, only at the edge, either on-premise servers or gateway systems, ABI Research says.
Chipset vendors are already moving toward edge AI. Mobile vendor Huawei has introduced on-device AI training for battery power management in its P20 Pro handset, in partnership with Cambricon Technologies. Vendors such as NVIDIA, Intel, and Qualcomm are pushing to deliver the hardware that will enable automotive OEMs to experiment with on-device AI training to support their efforts in autonomous driving.
“The massive growth in devices using AI is positive for all players in the ecosystem concerned, but critically those players enabling AI at the edge are going to see an increase in demand that the industry to date has overlooked,” says Vernon. “Vendors can no longer go on ignoring the potential of AI at the edge. As the market momentum continues to swing toward ultra-low latency and more robust analytics, end users must start to incorporate edge AI in their roadmap. They need to start thinking about new business models like end-to-end integration or chipset as a service.”
Filed Under: M2M (machine to machine)