- The booming AI chip sector is set to continue its stellar trajectory, growing in value by $100bn in the next five years
- Investment in datacentre infrastructure remains historically high and is set to continue
- Investments in GPUs will continue to dominate the sector but CPUs will also play a key role, according to a report from Futurum Group
- By 2028 the top three AI use cases are expected to be visual and audio analytics, simulation and modelling, and text generation, analysis and summary
The eye-watering demand for AI chips that has catapulted Nvidia into the fiscal stratosphere in the past year is set to continue: the global market for datacentre AI processors and accelerators is forecast to soar in value from $38bn last year to $138bn in 2028, a compound annual growth rate (CAGR) of 30%, according to a recent AI chipset market forecast report from research firm Futurum Group.
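For readers who want to check the arithmetic, the quoted growth rate follows directly from the two end-point figures. The short calculation below is an illustrative sketch (not taken from the report) that reproduces it from the headline numbers:

    # Illustrative check of the quoted CAGR, using the report's headline figures
    start_value = 38.0   # market value in 2023, $bn
    end_value = 138.0    # forecast market value in 2028, $bn
    years = 5            # 2023 to 2028

    cagr = (end_value / start_value) ** (1 / years) - 1
    print(f"CAGR: {cagr:.1%}")   # roughly 29.4%, i.e. about 30%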
That, of course, is great news for Nvidia, the clear market leader in the AI chip sector, which currently can’t keep up with demand for its graphics processing units (GPUs). In 2023, GPU sales for datacentre AI deployments were worth $28bn (74% of the market total), with Nvidia capturing 92% of that segment. And there’s even better news ahead, as the GPU segment is expected to grow at a CAGR of 30% during the next five years to be worth $102bn in 2028.
But it’s not all about GPUs, as the Futurum team notes: CPUs (central processing units) accounted for about 20% of the 2023 total, or $7.7bn of the $38bn, and they will continue to play an important role, with that segment growing to be worth $26bn in 2028.
And while Nvidia is the current leading light of the sector, it’s not going unchallenged. “We are witnessing the most profound technological revolution with the advent of AI and its supporting semiconductor solutions,” stated Daniel Newman, CEO of the Futurum Group. “As AI innovation evolves, companies such as Nvidia, AMD and Arm are seeing significant revenue growth, but the market’s competitive landscape is expected to intensify with new entrants and startups poised to capture market share and drive continued innovation.”
The report – the full title of which is ‘AI Processors and Accelerators Used in Data Center Market for AI Applications: 1H 2024, Global Market Sizing and Forecast Report, 2023 - 2028’ – analyses the role of 18 chip developers (including Intel and Qualcomm, of course) and includes valuations for custom cloud accelerators and dedicated accelerators (XPUs) – both of which currently command 3% of the total market value – as well as GPUs and CPUs.
A large chunk of the demand for processors and accelerators for AI applications in the datacentre market comes from hyperscale cloud services companies such as Amazon Web Services (AWS), Google, Microsoft and Oracle: combined, they were responsible for 43% of the market’s value in 2023, and the Futurum team expects that to rise to 50% by 2028. (The team notes, though, that processors and accelerators designed specifically for AI applications run by the likes of Apple, Meta and Tesla have been excluded from its research, as they are used only for private datacentre processing.)
The Futurum team also notes that the market capitalisation of the AI chip sector is more than $5tn, accounting for about 30% of the weighting of the S&P 500 Index, a market value-weighted index of 500 leading publicly traded companies in the US. Whilst not an exact reflection of the top 500 companies, it has a strong and well-earned reputation as one of the best gauges not only of the performance of prominent US companies but also of the overall stock market itself.
Trained AI inference engines will analyse new data and draw conclusions without human intervention
AI, or subsets of it, has been under development for many years, but the sudden arrival of ChatGPT in late 2022 galvanised the sector and kicked the AI hype machine into overdrive. Much of the subsequent coverage in the popular media has been overblown flummery, but the adoption of various aspects of AI is now a full-on trend that is already fundamentally changing sectors such as datacentres, the development of new drugs and chemicals, education, finance, healthcare and manufacturing. Cloud companies are underpinning such developments by providing AI processing capabilities either as standalone services or integrated into their products.
Datacentres are key to managing the very heavy AI workloads involved in massive data processing, pattern recognition and the training of neural networks. Datacentre operators are under immense pressure to keep pace with developments in AI whilst simultaneously trying to optimise energy usage, and the demand for ever more power to run AI is, in many cases, spiralling to the point where some datacentres are close to meltdown.
The importance of, and investment in, the global datacentre sector is driving AI chip manufacturers to devise innovative ways to enhance the performance of their processors. One such approach is the AI accelerator: hardware designed specifically to speed up the execution of the highly complex computations at the heart of AI algorithms. Often referred to as deep learning processors or neural processing units (NPUs), these accelerators are optimised for data-driven parallel computing, which makes them exceptionally efficient at processing multimedia data (images and video) and at running neural network workloads. They are particularly adept at AI-related tasks such as speech recognition, background blurring in video calls, and photo or video editing processes such as object detection.
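As a loose illustration of the data-driven parallel computing these accelerators target (a hypothetical NumPy sketch, not code from any vendor or from the report), the bulk of neural network work reduces to large batched matrix operations that map naturally onto thousands of parallel arithmetic units:

    import numpy as np

    # A single dense neural-network layer applied to a batch of inputs.
    # A general-purpose CPU works through this with a handful of cores;
    # GPUs and NPUs run the same multiply-accumulate pattern across
    # thousands of parallel lanes, which is where the speed-up comes from.
    batch = np.random.rand(512, 1024)     # 512 samples, 1,024 features each
    weights = np.random.rand(1024, 256)   # layer weights
    bias = np.random.rand(256)

    activations = np.maximum(batch @ weights + bias, 0.0)  # matmul + ReLU
    print(activations.shape)              # (512, 256)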
Meanwhile, AI ‘inference’ involves the use of ‘inference engines’ that apply logical rules to a knowledge base to evaluate and analyse new information. In machine learning (ML), such systems are ‘trained’ via the recording, storing and labelling of information. When the training is deemed to be complete, the intelligence gleaned is applied so that the machine can understand completely new data and draw conclusions from it without any human input or intervention.
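In code, that train-then-infer split looks roughly like the sketch below, a minimal scikit-learn example with invented data and labels (not drawn from the report): the model is fitted once on labelled historical records, then applied to entirely new data without further human input.

    from sklearn.linear_model import LogisticRegression
    import numpy as np

    # Training phase: labelled examples are recorded, stored and used to fit the model.
    historical_features = np.array([[0.2, 1.1], [0.9, 0.3], [0.4, 0.8], [1.2, 0.1]])
    historical_labels = np.array([0, 1, 0, 1])   # human-provided labels
    model = LogisticRegression().fit(historical_features, historical_labels)

    # Inference phase: the trained model draws conclusions from unseen data on its own.
    new_data = np.array([[0.3, 0.9], [1.0, 0.2]])
    print(model.predict(new_data))               # e.g. [0 1], with no human intervention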
The Futurum report shows that, currently, the biggest use case for AI chipsets is in enabling visual and audio analytics (24%), followed by simulation and modelling (21%). The fastest growing use cases are object identification, detection and monitoring, closely followed by conversational AI.
The report covers 17 different verticals, of which manufacturing and industrial, and media and entertainment, each account for an 11% market share. However, the report forecasts that IT and telecom will grow at the fastest rate (about 40%) during the next five years, closely followed by financial services and insurance (39%) and healthcare (37%). By 2028, the top three use cases are expected to be visual and audio analytics, simulation and modelling, and text generation, analysis and summary.