Microsoft and Alphabet aren’t the only companies poised to benefit long term from the artificial intelligence race overtaking Wall Street and the tech community. Chip stocks driving AI queries could represent a multibillion-dollar annual market opportunity for the industry in the years ahead, and Nvidia is one of the best names to play the burgeoning trend, Bernstein said.

“We estimate almost 400 quadrillion operations are needed to accomplish a typical-sized ChatGPT query response (say ~500 words, or ~2,000 tokens),” analyst Stacy Rasgon wrote in a Monday note to clients. “Given this, our math suggests a GPU TAM in the multiple tens of billions of dollars annually is potentially plausible once ChatGPT and other large language models are at scale,” he added.

Nvidia is no stranger to AI and is regarded as one of the biggest front-runners in the semiconductor industry. For years, Nvidia has created AI-powered chips used for applications such as gaming and machine learning. Shares skyrocketed nearly 9% last week after the company posted a slight fiscal fourth-quarter beat boosted by AI chip demand.

[Chart: Nvidia shares so far this year]

The company expects these AI tailwinds to continue, with CEO Jensen Huang saying during a call with analysts last week that AI is at an “inflection point,” forcing businesses to buy chips and develop new AI strategies. The stock is up more than 61% in 2023.

According to Bernstein’s calculations, 100 million ChatGPT queries a day would require annual purchases of $1 billion to $2 billion worth of Nvidia’s H100 PCIe chips. At 1 billion queries a day, expect $10 billion to $20 billion worth of purchases, Rasgon said. “As a sanity check on the math, we estimate the GPU cost per query at a few cents per … which seems reasonable,” he added, highlighting Nvidia’s Ampere (A100) and Hopper (H100) generation chips as preferred offerings to handle large language model loads.
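Bernstein's sanity check can be reproduced with a few lines of back-of-the-envelope arithmetic. The sketch below uses only the figures in the note (100 million queries a day implying $1 billion to $2 billion in annual H100 purchases); everything else, including the function name, is illustrative:

```python
# Back-of-the-envelope check of Bernstein's ChatGPT GPU math.
# Inputs are the figures cited in the note: 100M queries/day implies
# roughly $1B-$2B in annual H100 purchases.

def implied_cost_per_query(annual_gpu_spend, queries_per_day):
    """Average GPU cost per query implied by an annual spend figure."""
    return annual_gpu_spend / (queries_per_day * 365)

low = implied_cost_per_query(1e9, 100e6)    # low end of the $1B-$2B range
high = implied_cost_per_query(2e9, 100e6)   # high end of the range
print(f"Implied GPU cost per query: ${low:.3f}-${high:.3f}")
# -> roughly $0.027-$0.055, i.e. "a few cents" per query

# Scaling linearly to 1 billion queries/day reproduces the
# $10B-$20B annual purchase range in the note.
scale = 1e9 / 100e6
print(f"At 1B queries/day: ${1e9 * scale / 1e9:.0f}B-${2e9 * scale / 1e9:.0f}B per year")
```

The per-query cost lands at a few cents, matching Rasgon's own sanity check, and the query-to-spend relationship scales linearly under these assumptions.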
Add Google’s search queries to the mix, roughly 10 billion a day, and Rasgon anticipates $150 billion worth of potential H100 graphics processing unit sales annually. “Presumably the cost curve will slope downward as the scale of large language model adoption rises, but the multi-fold expansion of the GPU market feels inevitable if [large language model] usage (whether for search or otherwise) becomes a sustained thing,” Rasgon wrote.

— CNBC’s Michael Bloom contributed reporting
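The $150 billion Google-scale figure follows from the same linear extrapolation. A minimal sketch, assuming the midpoint of the note's $1 billion to $2 billion base range (the midpoint itself is our assumption, not a figure Bernstein states):

```python
# Extrapolating the note's ChatGPT base case to Google-scale search volume.
# Base case: 100M queries/day -> ~$1.5B/yr in H100 purchases
# ($1.5B is our assumed midpoint of the $1B-$2B range in the note).

BASE_QUERIES_PER_DAY = 100e6
BASE_ANNUAL_SPEND = 1.5e9  # assumed midpoint

def annual_h100_spend(queries_per_day):
    """Annual H100 spend, scaled linearly with daily query volume."""
    return BASE_ANNUAL_SPEND * queries_per_day / BASE_QUERIES_PER_DAY

google_scale = annual_h100_spend(10e9)  # ~10B Google searches a day
print(f"Google-scale H100 spend: ${google_scale / 1e9:.0f}B per year")
# -> $150B per year, matching Rasgon's estimate
```

The strict range under these assumptions is $100 billion to $200 billion a year; the $150 billion headline figure is the midpoint.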