Semiconductor demand from generative AI leaders begins a gold rush in AI inference
June 3, 2023
Tech giants cannot get enough of customized AI semiconductors in their data centers, nor can the startups creating the future of AI models.
Over the past four years, we have documented the non-linear positive relationship between additional computing resources and AI model quality, a dynamic that drove the AI GPU market to $5.5 billion in 2022.
With the breakthrough in model quality demonstrated by ChatGPT, cloud buyers and AI research labs accelerated their capital expenditure plans, spurring Nvidia to issue revenue guidance for its next quarter that exceeded expectations by 55.5%, or $3.9 billion.
Even so, Nvidia does not own the entire AI value chain and VC investors have already begun shifting their focus to categories in which startups can carve out market share. Hyperscalers' custom chips and startups can outperform the chip giant on specific inference tasks that will increase in importance as large language models are rolled out from cloud data centers to customer environments.
Our Q4 2022 analyst note, Inferring the Future of AI Chips, details the use cases where startups can surpass Nvidia in AI inference as generative AI adoption grows.
The note also reveals the demand driving Nvidia's automotive growth to exceed even its data center business. We expect further startup innovation to drive that market forward in parallel with generative AI.
Don't hesitate to reach out to discuss any of these topics further.

Brendan Burke
Senior Analyst, Emerging Technology