On-device gen-AI to trigger massive 1.8 Billion Heterogeneous AI Chipset Shipments by 2030

Date: 22/02/2024
ABI Research forecasts a surge in shipments of heterogeneous AI chipsets, exceeding 1.8 billion units by 2030, driven by the integration of on-device AI functionalities into laptops, smartphones, and various other devices.

Generative AI workloads are no longer confined to the cloud, as advancements in heterogeneous AI chipsets enable on-device inferencing. An abstraction layer distributes AI tasks efficiently across the different processing architectures on a chipset, enhancing performance. Compressed Large Language Models (LLMs) with fewer than 15 billion parameters make on-device generative AI both efficient and feasible.

Strong collaborations between hardware and software stakeholders are pivotal in creating unified propositions and fostering the development of productivity-focused applications.

The proliferation of on-device AI capabilities is poised to rejuvenate stagnant markets, including smartphones and PCs, with accelerated shipment numbers expected between 2025 and 2028.

“Cloud deployment will act as a bottleneck for generative AI to scale due to concerns about data privacy, latency, and networking costs. Solving these challenges requires moving AI inferencing closer to the end-user – this is where on-device AI has a clear value proposition, as it eliminates these risks and can more effectively scale productivity-enhancing AI applications,” says Paul Schell, Industry Analyst at ABI Research. “What’s new is the generative AI workloads running on heterogeneous chipsets, which distribute workloads at the hardware level between CPU, GPU, and NPU. Qualcomm, MediaTek, and Google were the first movers in this space, as all three are producing chipsets running LLMs on-device. Intel and AMD lead in the PC space.”

Source: ABI Research