Is the Next AI Gold Rush Happening Beneath the Ocean?
Underwater data centers are emerging as a powerful – and underappreciated – edge
One iron rule governs every market cycle: follow the money.
Right now, that money is flooding into AI infrastructure at an unprecedented pace. Hyperscalers are on track to spend more than $600 billion this year alone building the compute backbone of the AI economy. And as always, the biggest profits are flowing to the suppliers enabling that buildout.
But capital is always on the move, hunting for the next bottlenecks and finding new frontiers.
And one of the most important new frontiers in AI infrastructure is something most investors aren’t even looking at yet: underwater data centers.
It sounds unconventional. Yet it may be the next major evolution in how AI infrastructure is built – and one of the most overlooked investment opportunities in the entire AI trade.
Underwater Data Centers: A New Solution to AI’s Cooling Problem
Data centers are the critical component of this buildout. They are the physical backbone of the entire AI economy – where models are trained, deployed, and scaled.
But traditional data centers have one enormous structural cost problem: cooling. Keeping servers from melting requires industrial-scale HVAC systems that account for, on average, 40% to 50% of a data center's total power consumption. That's nearly half of every dollar spent on electricity going toward fans, chillers, and cooling towers rather than actual computation.
Some forward-thinkers, notably Elon Musk, believe outer space holds the key to solving this problem. In theory, space offers near-infinite solar energy and a naturally cold vacuum environment – both ideal for powering and cooling high-density compute. But for now, the cost, latency, and logistical complexity remain prohibitive.
That’s why others are pointing somewhere much closer to home: the ocean – and, more specifically, underwater data centers.
Seawater is free, plentiful, and cold. By submerging GPUs in the ocean, you eliminate the cooling problem almost entirely – reducing cooling's share of total energy consumption from 40% to 50% down to roughly 10%. The result is a Power Usage Effectiveness (PUE) rating of around 1.15 – meaning total facility power runs only about 15% above the IT load itself – versus the 1.4 to 1.6 typical of even well-designed land-based facilities. That is a meaningful efficiency gap, and it translates directly into lower operating costs and lower carbon emissions.
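For readers who want the arithmetic behind those PUE figures, here is a minimal sketch. The kilowatt figures are illustrative assumptions chosen to match the PUE ratings above, not measurements from any real facility:

```python
def pue(it_kw, cooling_kw, other_kw):
    """Power Usage Effectiveness: total facility power divided by IT power.
    A PUE of 1.0 would mean every watt goes to computation."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# Hypothetical land-based facility, consistent with a 1.5 PUE
land = pue(it_kw=1000, cooling_kw=450, other_kw=50)   # 1.50

# Hypothetical submerged pod: seawater does most of the cooling work
sea = pue(it_kw=1000, cooling_kw=100, other_kw=50)    # 1.15

# Energy saved per unit of useful compute, holding the IT load constant
savings = 1 - sea / land
print(f"Land PUE {land:.2f}, subsea PUE {sea:.2f}, savings {savings:.0%}")
```

Holding the IT load constant, moving from a 1.5 PUE to a 1.15 PUE cuts total energy per unit of compute by roughly 23% – the gap the efficiency argument rests on.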
Underwater Data Centers Are Already Operating at Scale
China, as it tends to do with emerging technologies, has moved first and moved fast. Two major underwater data centers are now operational:
- The Hainan Cluster: The world’s first commercial underwater data center, operated by Beijing Highlander/HiCloud off the coast of Lingshui County, Hainan Province. It currently serves live clients including China Telecom, Tencent, and SenseTime – running AI inference at roughly 7,000 queries per second – and was already expanded with a second module in February 2025.
- The Shanghai Lin-Gang Wind-Powered Facility: Completed in October 2025 at a cost of $226 million. The world’s first wind-powered underwater data center, drawing 97% of its electricity from nearby offshore wind farms and targeting a PUE of 1.15. Phase one capacity: 198 server racks of AI-capable compute.
The Constraint That Delayed Adoption
Microsoft spent nearly a decade researching this concept under the banner of Project Natick, which found that submerged servers had a failure rate of just 0.7% versus 5.9% on land – a roughly eightfold reliability improvement attributed to the sealed, nitrogen-filled environment and the absence of on-site human intervention. The company ultimately shelved the project – not because it failed, but because it arrived too early.
“The technology worked. The business case didn’t.” – Microsoft, 2024.
Fast forward 18 months, and that calculus is changing.
The challenge that killed Microsoft’s commercial ambitions – the inability to quickly swap GPUs inside sealed pods – is a real constraint. But it is also a constraint that matters far less for specific workloads: stable inference serving, long-term archival storage, and edge compute at coastal density hubs. And importantly, these are large and rapidly growing markets.
Where Underwater Data Centers Actually Work
Underwater compute is a powerful niche complement to – not replacement of – terrestrial infrastructure; one that could attract significant capital because it solves specific, expensive problems that land cannot.
Two use cases are most compelling.
Coastal City AI Inference
Roughly 40% of the world’s population lives within 60 miles (about 100 km) of a coastline. AI inference – serving real-time responses from already-trained models – is latency-sensitive but far less dependent on constant hardware upgrades than training workloads. A subsea pod deployed offshore can serve dense coastal populations with lower latency and dramatically lower energy costs than a landlocked data center. Think of it as underwater edge computing: a permanent, high-density, energy-efficient inference node anchored to the ocean floor near the cities that need it most. As AI inference demand scales from millions to billions of daily interactions, the economics of coastal subsea compute improve significantly.
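The latency argument can be made concrete with a back-of-the-envelope propagation sketch. The distances below are illustrative assumptions, and the calculation deliberately ignores switching, queuing, and server time, which usually dominate in practice:

```python
# Light in optical fiber travels at roughly two-thirds the vacuum
# speed of light, i.e. about 200,000 km/s, or 200 km per millisecond.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    """Propagation-only round-trip time over a fiber path of the given length."""
    return 2 * distance_km / FIBER_KM_PER_MS

offshore_pod = round_trip_ms(50)   # pod ~50 km off the coast  -> 0.5 ms RTT
inland_dc = round_trip_ms(800)     # distant inland region      -> 8.0 ms RTT
print(f"{offshore_pod:.1f} ms vs {inland_dc:.1f} ms propagation round trip")
```

Even on propagation alone, a pod a few dozen kilometers offshore sits an order of magnitude closer, in latency terms, to a coastal metro than a data center several hundred kilometers inland.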
Long-Term Storage and Archival Compute
Storage hardware has a seven- to 10-year lifecycle – far longer than the 18- to 24-month GPU refresh cycle that makes sealed pods problematic for training workloads. That means that for archival data, compliance storage, digital sovereignty vaults, and frozen-model repositories, underwater data centers are close to ideal: lower power costs, higher hardware reliability, zero land footprint, and a stable thermal environment that extends hardware life. The world is generating data at an exponential rate, and it must be stored somewhere. Doing so under the ocean is increasingly cost-competitive.
Energy Co-Location
Beyond compute and storage, there’s a third driver emerging here. The world is building massive offshore wind capacity – hundreds of gigawatts over the next decade, particularly in Europe and the U.S. East Coast. The fundamental problem with offshore wind is transmission loss: you generate power 10 to 30 miles out at sea, then pay to move it onshore. An underwater data center co-located with an offshore wind farm would eliminate that transmission cost entirely. The wind farm becomes both the power source and the cooling infrastructure, in one integrated coastal deployment. China’s Shanghai facility is already doing exactly this.
Realistically, the total addressable market for this niche over the next five to 10 years could reach tens of billions of dollars – potentially approaching $50 billion to $150 billion depending on adoption rates. It’s a fraction of the overall AI infrastructure buildout – but a fraction of trillions is still a very large number.
More importantly, because underwater compute is capital-intensive and technically complex, it will attract a relatively small number of specialized suppliers who will capture outsized margins. That is exactly the dynamic that made terrestrial AI infrastructure suppliers so profitable.
How to Invest In the Underwater Data Center Supply Chain
The best way to invest in a new infrastructure wave is to own the supply chain.
We saw how it worked with terrestrial AI compute. The infrastructure layer – Nvidia (NVDA) for chips, Vertiv (VRT) and Eaton (ETN) for power, Arista (ANET) for networking, Corning (GLW) for fiber – delivered spectacular returns for investors who understood this dynamic early.
That same logic applies to underwater compute. But this supply chain has a crucial difference: its most differentiated, highest-alpha layer is drawn almost entirely from the offshore oil and gas industry – not from tech.
These are companies that AI investors have largely ignored because they don’t show up in any AI infrastructure ETF.
That’s the opportunity.
Layer 1: Existing AI Infrastructure (Limited Upside)
Underwater data centers use the same compute hardware: Nvidia GPUs, Western Digital (WDC) and Seagate (STX) storage, Arista and Cisco (CSCO) networking. No incremental investment thesis here. Don’t buy these stocks for the underwater story if you already own them for the land story. They benefit marginally at best.
Layer 2: Partial Beneficiaries of Underwater Data Centers
Corning supplies fiber optic cable for both terrestrial and subsea applications; the underwater leg is a separate, higher-ASP product. Teledyne Technologies (TDY) makes subsea-rated hermetic connectors and penetrators with genuine marine heritage from defense and oceanographic work. Xylem (XYL) provides water quality and thermal monitoring around deployed modules. We expect all three to be legitimate secondary beneficiaries.
Layer 3: Pure-Play Winners In Subsea Infrastructure
This is where the investment case is most compelling. Every company in this layer built its core competency serving offshore oil and gas platforms. They know how to seal modules against saltwater corrosion, deliver power to the ocean floor, and install large structures in open water. Underwater data centers present the same engineering challenge with a different payload. These companies are now pivoting toward offshore wind and underwater compute as their next growth chapter – and the AI investor community has not yet connected the dots:
- Prysmian Group (BIT: PRY) is the world’s largest cable manufacturer with ~40% of the submarine cable market. It makes both the subsea power cable that delivers electricity from offshore wind to underwater pods and the fiber optic cable that connects them to terrestrial networks. Prysmian has a multi-year order backlog and strong pricing power – the single most direct publicly traded play on underwater data center infrastructure.
- NKT A/S (CPH: NKT) absorbed ABB’s high-voltage cable unit and now has ~45% of its high-voltage backlog in offshore wind. It looks to be the purest play on the subsea high-voltage direct current (HVDC) infrastructure that powers coastal underwater compute deployments.
- Nexans (EPA: NEX) is the No. 2 subsea power cable manufacturer, with the highest proportional offshore wind revenue exposure of the major European cable makers. Complementary to Prysmian with a different geographic mix.
- TechnipFMC (NYSE: FTI) specializes in offshore engineering and subsea systems. It’s the only major Layer 3 name trading on a U.S. exchange in a straightforward way. It partners with cable makers on complex subsea installation projects and sports a deep engineering moat.
- Subsea 7 (OSE: SUBC) is a marine installation contractor that owns cable-laying vessels – the true physical chokepoint of the entire industry. You cannot deploy an underwater data center without one. There are very few of them, they take years to build, and their operators charge accordingly.
Note: Prysmian, Nexans, NKT, and Subsea 7 trade on European exchanges (Milan, Paris, Copenhagen, Oslo respectively). They are accessible via international brokerage accounts, and some trade as OTC names in the United States. TechnipFMC (FTI) is the most direct NYSE-listed exposure.
This accessibility gap is part of why these names are underowned by American AI infrastructure investors – and part of why the opportunity exists.
Why Underwater Data Centers Are an Early-Stage AI Opportunity
There are really only two things you need to be right about to make serious money in markets.
First, follow the money. Second, find the story that no one is paying attention to yet.
The AI infrastructure buildout checks the first box so decisively that it barely requires argument. Six hundred billion dollars of annual hyperscaler capex. Sovereign AI initiatives from Washington to Riyadh to Tokyo. A global race to build compute capacity faster than the next country, competitor, or model generation. The cash flow is unprecedented, and it’s going straight to the suppliers.
The only question is which suppliers the market has already priced in – and which it hasn’t.
Underwater AI compute checks the second box in a way that very few emerging themes manage. The concept is already operable, as China has proven at commercial scale. The efficiency gains are genuine. The use cases are defensible. And the supply chain is almost entirely invisible to the typical AI infrastructure investor.
That invisibility is the opportunity because markets price what people are paying attention to. When very few people are connecting the offshore oil and gas supply chain to the underwater AI infrastructure buildout, there is real potential for a re-rating when the narrative catches up to reality.
The Bottom Line
By our estimate, we are roughly at AI’s equivalent to cloud computing’s 2012-13 moment: the technology is proven, the first commercial deployments exist, the economics are becoming clear, and the investment community has not yet built a coherent framework for how to play it.
The next 24 to 36 months will be when that framework takes shape – and the early investors in the right names will be well-positioned when it does.
Underwater data centers are a good example of how early this buildout still is.
Capital is still chasing bottlenecks, still finding new frontiers, still expanding the infrastructure layer in ways most investors haven’t fully processed yet.
But that phase doesn’t last forever. Once the infrastructure is in place, the value shifts – quickly – toward the platforms that sit on top of it.
That’s where the next phase of this story plays out. And right now, one company sits at the center of it: OpenAI.
Most investors will only have access after it goes public.
We’ve found a way in earlier.
Watch our full breakdown on this pre-IPO opportunity right here.

