
On-device generative AI is accelerating the AI chipset market


According to technology intelligence firm ABI Research, the move toward on-device generative AI will drive the AI chipset market to over 1.8 billion units by 2030.


Generative Artificial Intelligence (AI) workloads have moved beyond the bounds of the cloud and can now run on-device, supported by heterogeneous AI chipsets.

Combined with an abstraction layer that efficiently distributes AI workloads between processing architectures, and with compressed LLMs of under 15 billion parameters, these chipsets enable enterprises and consumers to run generative AI inferencing locally.
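To see why the sub-15-billion-parameter threshold matters for local inference, a rough sizing calculation helps. The figures below are illustrative assumptions, not vendor specifications:

```python
# Back-of-envelope sizing: weight memory for a quantized LLM.
# All numbers are rough illustrations, not vendor specifications.

def model_footprint_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB for a quantized model."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B-parameter model at 4-bit quantization needs roughly 3.5 GB for
# weights alone -- feasible on a flagship smartphone or laptop -- while
# a 70B model at 16-bit (~140 GB) remains cloud-only territory.
print(f"7B @ 4-bit:   {model_footprint_gb(7, 4):.1f} GB")
print(f"70B @ 16-bit: {model_footprint_gb(70, 16):.1f} GB")
```

Compression (quantization, pruning, distillation) is what brings models under this memory ceiling in the first place.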

Consequently, ABI Research, a global technology intelligence firm, estimates worldwide shipments of heterogeneous AI chipsets will exceed 1.8 billion by 2030, as laptops, smartphones, and other form factors increasingly ship with on-device AI capabilities.

- Cloud deployment will act as a bottleneck for generative AI to scale due to concerns about data privacy, latency, and networking costs. Solving these challenges requires moving AI inferencing closer to the end-user – this is where on-device AI has a clear value proposition as it eliminates these risks and can more effectively scale productivity-enhancing AI applications, says Paul Schell, Industry Analyst at ABI Research.

- What’s new is the generative AI workloads running on heterogeneous chipsets, which distribute workloads at the hardware level between CPU, GPU, and NPU. Qualcomm, MediaTek, and Google were the first movers in this space, as all three are producing chipsets running LLMs on-device. Intel and AMD lead in the PC space.
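The workload distribution described above can be sketched as a simple runtime dispatcher. This is a minimal, hypothetical illustration of the idea, not any vendor's actual scheduler; the workload attributes and routing heuristics are invented for the example:

```python
# Hypothetical sketch of an abstraction layer routing AI workloads to
# the best-suited processing unit, as heterogeneous chipsets do in
# hardware. Names and rules here are illustrative, not a real API.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    matmul_heavy: bool  # e.g. transformer attention/MLP layers
    data_parallel: bool  # e.g. image pre-/post-processing

def route(w: Workload) -> str:
    """Pick a processing unit for a workload (illustrative heuristic)."""
    if w.matmul_heavy:
        return "NPU"  # dense low-precision matrix math
    if w.data_parallel:
        return "GPU"  # wide data-parallel kernels
    return "CPU"      # control flow, tokenization, scheduling

pipeline = [
    Workload("tokenize prompt", False, False),
    Workload("LLM decode step", True, False),
    Workload("image postprocess", False, True),
]
for w in pipeline:
    print(f"{w.name} -> {route(w)}")
```

In practice this routing is handled by vendor runtimes and compiler stacks rather than application code, but the principle is the same: each stage of an inference pipeline runs on the unit best suited to its compute pattern.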

Hardware alone will not be enough. Building a solid on-device AI value proposition requires strong partnerships between hardware and software players to create unified propositions.

These collaborations will nurture the development of productivity-focused applications to be deployed on-device. ABI Research expects this will spur demand and shorten replacement cycles of end devices like smartphones and PCs. This will lead to accelerating shipment numbers between 2025 and 2028 as the software ecosystem matures, breathing new life into markets that have been stagnating. Automotive and edge server markets are also impacted but to a lesser extent.

The productivity AI applications running on-device, powered by heterogeneous AI chipsets, will drive significant market growth in personal and work devices. This is reflected by the increasing penetration of heterogeneous AI chipsets, eventually encompassing most systems towards the end of the decade.

- Chip vendors and OEMs should look to expand the productivity AI application ecosystem to tempt more customers and mature the offering. This will create opportunities analogous to the growth previously spurred by the expansion of Android and web-based applications in their respective markets and require reaching a critical mass of applications that appeal to a broad range of customers in consumer and enterprise markets. Success in creating popular and useful applications could make or break the transition to on-device AI.

27/2 2024