In case anyone had doubts, the AI trade is just getting started. Nvidia announced Monday afternoon an agreement to invest $100 billion to help OpenAI build out 10 gigawatts of artificial intelligence data center capacity in the coming years, with the money allocated incrementally as each gigawatt is deployed. We have always seen plenty of runway in the buildout of AI data centers and the role of Nvidia’s chips at the center of it all. But the scale of the opportunity described in this partnership is incredible.

Nvidia CEO Jensen Huang, OpenAI CEO Sam Altman, and OpenAI President Greg Brockman broke the news in an interview with CNBC’s Jon Fortt on “Fast Money Halftime Report.” They were all gathered at Nvidia headquarters in Santa Clara, California. Huang was asked whether the benefits from this announcement had been accounted for in the company’s previous projections, and he replied: “This is additive to everything we spoke about so far.” In other words, analysts will need to go back and update their forecasts for 2026 and beyond to factor in this massive investment. “This is the biggest AI infrastructure project in history, this is the largest computing project in history,” Huang added. “The reason for that is because the computing demand is going through the roof for OpenAI.” Per the press release on the deal, “The first phase is targeted to come online in the second half of 2026 using the Nvidia Vera Rubin platform.”

(Chart: Nvidia five-year stock performance)

Shares of Nvidia flipped to positive on the news and soared to an intraday all-time high north of $184, pushing the company’s market value to nearly $4.5 trillion. Nvidia’s record-high close of $183.16 per share was on Aug. 12.

The message from both Huang and Altman on CNBC was clear and consistent with what they have been saying since the launch of ChatGPT in late 2022: The world needs significantly more computing power if their vision for what AI can be is to be realized. Nvidia has been a partner of OpenAI since those early days. The investment announced Monday will complement the AI infrastructure work that Nvidia and OpenAI are doing with Microsoft, Oracle, SoftBank, and the Stargate project.

“This will expand on the Stargate ambitions and let us push further and further. We have found at every step along the way, we did not quite set our sights big enough, given the market demand,” Altman said on CNBC. “This will help push towards that next level. The compute constraints that the whole industry has been in, and our company, have been terrible. We’re so limited right now in the services we can offer. There’s so much more demand than what we can do.”

Perhaps the most bullish comment of the CNBC interview for long-term investors came from Brockman, who said, “One way to contextualize the scale of what we’re talking about and the compute scarcity of the world that we’re heading towards, you know ChatGPT today, you talk to it and it gives you answers. But clearly you want an agent that’s going to go do work for you proactively while you’re asleep. … And so, you really want every person to have their own dedicated GPU. Right, so, you’re talking order of 10 billion GPUs we’re going to need. This deal, we’re talking about, it’s for millions of GPUs. Like, we’re still three orders of magnitude off of where we need to be.”

Putting the scope of this particular deal into context, Huang noted that 10 gigawatts of data center capacity amounts to about 4 million to 5 million GPUs.
“Approximately, in one project,” that’s what Nvidia “shipped all year this year,” he added. Nvidia’s GPUs, or graphics processing units, are the gold standard when it comes to AI. “This is the first 10 gigawatts, I assure you of that,” Huang said to close out the interview.

You can’t hear the word gigawatt and not think about GE Vernova and Eaton, two names we highlighted last week as key beneficiaries of the growing need for more energy to support the massive amount of computing power set to come online over the next decade. Shares of both rightfully jumped roughly 3% on the news.

The AI buildout is an arms race, which should fuel further upside in the years to come, not only for Nvidia but for those in the AI ecosystem – think energy plays and cloud providers – that will play a crucial role in powering these multi-gigawatt buildouts.

(Jim Cramer’s Charitable Trust is long NVDA, MSFT, ETN, GEV. See here for a full list of the stocks.)