100M ChatGPT queries per day could require annual purchases of $1B to $2B of H100 PCIe GPUs $NVDA
Bernstein analyst Stacy Rasgon broke down the math on the potential market size and opportunity for Nvidia in the artificial intelligence space.

'Unless you are living in a cave, you must be aware of ChatGPT, OpenAI's machine learning tool that answers users' disparate questions with human-like responses. ChatGPT can answer queries. It can write poems. It can tell stories. It may have killed basic take-home essay homework dead,' Rasgon said.

The analyst, who has an Outperform rating and $265 price target on Nvidia shares, said the rise of ChatGPT has investors questioning which companies will benefit and the size of the opportunity.

Rasgon said Nvidia's GPUs are used to train the neural networks responsible for ChatGPT and 'handle the inferencing of the queries themselves.'

The analyst uses a bottom-up approach to size the market opportunity for Nvidia, instead of the top-down approach used by others.

'We estimate almost 400 quadrillion operations are needed to accomplish a typical sized ChatGPT query response. Given this, our math suggests a GPU TAM in the multiple tens of billions of dollars annually is potentially plausible once ChatGPT and other language models are at scale.'

For the analyst, "at scale" means a billion queries per day, roughly 10% of Google's typical search volume.

The analyst estimates that 100 million ChatGPT queries per day could require annual purchases of $1 billion to $2 billion of H100 PCIe GPUs.
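The analyst's bottom-up framework can be sketched as a short calculation: multiply queries per day by operations per query, then divide by what one GPU can deliver. The hardware parameters below (per-GPU throughput, utilization, card price) are illustrative placeholders, not the Bernstein note's actual inputs, and the result moves linearly with each of them, so this should be read as a sensitivity sketch rather than a reproduction of the $1 billion to $2 billion figure.

```python
import math

# Figures taken from the note itself.
OPS_PER_QUERY = 400e15         # ~400 quadrillion operations per query response
QUERIES_PER_DAY = 100e6        # 100 million queries/day scenario

# Placeholder hardware assumptions (NOT from the note).
GPU_EFFECTIVE_OPS_PER_SEC = 1.5e15  # assumed effective throughput per H100 PCIe
UTILIZATION = 0.5                   # assumed average utilization
GPU_PRICE_USD = 25_000              # assumed price per card

SECONDS_PER_DAY = 86_400


def gpus_required(queries_per_day: float = QUERIES_PER_DAY) -> int:
    """GPUs needed to serve the daily query load under the assumptions above."""
    daily_ops = queries_per_day * OPS_PER_QUERY
    ops_per_gpu_per_day = GPU_EFFECTIVE_OPS_PER_SEC * UTILIZATION * SECONDS_PER_DAY
    return math.ceil(daily_ops / ops_per_gpu_per_day)


def hardware_cost_usd(queries_per_day: float = QUERIES_PER_DAY) -> float:
    """Total GPU hardware spend implied by the query load."""
    return gpus_required(queries_per_day) * GPU_PRICE_USD


if __name__ == "__main__":
    print(f"Daily compute demand: {QUERIES_PER_DAY * OPS_PER_QUERY:.2e} ops")
    print(f"GPUs required: {gpus_required():,}")
    print(f"Implied hardware spend: ${hardware_cost_usd() / 1e9:.1f}B")
```

Because the output scales linearly with every input, halving the assumed per-query operation count or doubling effective throughput halves the implied spend, which is why estimates in this space vary so widely.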