Investing

The AI Boom Is Running Into Physical Limits

Editor’s Note: Every technological revolution runs into the same problem eventually: the real world.

The internet needed fiber-optic cables. Smartphones needed wireless networks. Electric vehicles needed lithium mines and charging stations.

Artificial intelligence is no different.

Right now, AI demand is exploding across the global economy. But behind the scenes, the industry is already colliding with the physical limits of chips, electricity, metals, and data-center infrastructure.

My colleague Eric Fry has been studying these constraints closely. He has spent decades identifying the economic “pressure points” where technology, capital, and supply chains collide – often years before Wall Street fully understands what’s happening.

In today’s issue, Eric explains how the first major bottleneck of the AI boom – a shortage of computing power – helped fuel historic gains in companies like Nvidia and Broadcom. More importantly, he explains why that was likely only the beginning.

Next Wednesday, Eric will expand on this idea during a free online presentation called FutureProof 2026, where he’ll explain why a new wave of infrastructure bottlenecks could shape the next phase of the AI boom – and the companies positioned to benefit. You can reserve your spot for that event here.

For now, I encourage you to read Eric’s essay below. It’s a fascinating look at how the biggest investment opportunities in AI often appear where technology collides with real-world limits – and why the next wave may already be forming.

In November 2023 – just a year after ChatGPT’s debut – OpenAI CEO Sam Altman made a surprising decision. 

He stopped taking new customers.

New signups for ChatGPT’s paid subscriptions were suddenly suspended. Anyone hoping to access OpenAI’s most advanced models was simply turned away.

This wasn’t because demand had collapsed.

It was because demand had exploded. In its first year, ChatGPT reached 100 million weekly active users. And after the company's first developer conference on November 6, 2023 – where it unveiled GPT-4 Turbo and custom GPTs – the number of would-be subscribers surged.

Just a week later, Altman had to start turning away customers with credit cards in hand.

Altman summed up the moment with a simple emoticon: the frowny face. 

Typically, “quitting while ahead” is sound advice in debating, gambling, and even trading. But it’s rarely a winning move in business. OpenAI, however, had no choice: demand for ChatGPT had exceeded the company’s GPU capacity.

One of the most advanced AI companies in the world had run out of “compute.”

Other AI startups ran into the same issue: GPUs were effectively sold out. The AI models were ready, and consumer demand was there, but the hardware to run them at scale wasn’t. GPU supply chains were still running at pre-AI demand levels.

It became the first great bottleneck of the AI era.

There weren’t enough chips, networking components, or infrastructure to power the AI explosion. In effect, the entire industry hit a wall. 

Two companies sat at the center of this compute bottleneck…

And both of these AI infrastructure providers went on to capture enormous gains early in the AI boom.

Today, let’s examine both of these companies – and how they made early investors millionaires.

Then, I’ll reveal how it’s not too late for investors to jump in on AI’s next wave of millionaire-maker bottlenecks.

We’ll get started with Nvidia Corp. (NVDA).

It sounds hard to believe now, but for decades, Nvidia was a terrible stock…

The First AI Bottleneck: Turning a Compute Shortage Into a Gold Mine

Jensen Huang founded Nvidia in the early 1990s to supply graphics processors to the video game industry. But gaming GPUs were a cyclical business. Investors who bought near the peaks often saw losses of 70% or more during downturns.

Ten thousand dollars invested in Nvidia in October 2018, for example, was worth just $4,400 less than three months later.

While Nvidia gradually expanded into automotive and mobile chips, it wasn’t until generative AI arrived that the company turned into a great stock.

AI models require enormous computing power to train and operate. And once ChatGPT burst onto the scene, the companies racing to build AI systems suddenly needed vast quantities of high-performance GPUs.

The problem was that almost no one could supply them.

Nvidia’s chips – particularly the H100 – quickly became the gold standard for AI training. As demand surged, the company went from selling $2,000 graphics cards to gamers to selling $30,000 GPUs to data-center operators.

At the peak of the shortage in 2023, some H100 cards were reportedly reselling for more than $40,000 on eBay.

And as the AI boom accelerated, the company’s data center division exploded. Within a year, it became Nvidia’s largest business segment, with AI chips driving the vast majority of the growth.

In effect, Nvidia turned the compute bottleneck into a business moat.

Anyone building advanced AI models needed Nvidia’s GPUs.

Today, more than two years later, nearly every major AI company still depends on them.

Investors who recognized the compute bottleneck early were richly rewarded. Measured from the launch of ChatGPT – the start of the AI compute crunch – Nvidia shares have surged nearly 1,000%.

That’s the power of getting positioned before a bottleneck breaks open.

But GPUs were only part of the story…
