AI Research

The Scarcity Thesis

The AI revolution is moving faster than anyone can process. We've been navigating it in real time since June 2023 — publishing every call, every move, every mistake. Here's the framework that's driven it all.
The Problem
The speed of change is overwhelming

On a single day in February 2026, three frontier AI models launched simultaneously. One of them can manage entire workflows autonomously. Another helped build itself. A third went viral as a free, open-source personal AI agent that people ran on their laptops around the clock. Stocks moved 10% in hours. Business models that had been stable for a decade were suddenly under threat.

That was one day.

In the weeks since: Nvidia posted a $68 billion quarter. Google doubled its capex plan to $185 billion. Amazon committed $200 billion. A joint Tesla-xAI project launched that can "emulate the function of entire companies." A U.S. military campaign validated a defense AI platform in real time. Elon Musk said the economy will be 10 times bigger in ten years.

The volume of information is enormous. The pace is accelerating. The second-order effects cascade in ways that are almost impossible to track. And every day, the market reprices — sometimes violently — as new developments land.

Most investors are trying to process this in real time, on their own, with general-purpose news sources that don't understand the connections between a chip testing company, a fiber optic manufacturer, and a nuclear power plant. The AI revolution is an interconnected system of bottlenecks — and unless you're tracking all of them simultaneously, you're going to miss the moves that matter.

The Framework
If AI creates abundance, invest in scarcity

We launched the AI-Innovation Portfolio in June 2023 with a thesis: generative AI would trigger a trillion-dollar retooling of the world's computing infrastructure. We wrote in our first note: "Prepare for the era of more multi-trillion-dollar companies."

Over nearly three years, the thesis has evolved through three distinct phases. First, the picks and shovels phase — the companies building the hardware to power AI. Then the always-on inference phase — where AI models running 24/7 made computing demand continuous. And now the autonomous agent phase — where AI crossed from thinking to acting, and the demand curve for physical infrastructure went vertical.

Through each phase, one principle has held: the scarce resources that constrain AI output are the most valuable assets in the economy. You cannot code a new copper deposit. You cannot download the inputs for a new data center. You cannot 3D-print a power plant. As the cost of intelligence approaches zero, the physical bottlenecks that remain become the highest-value chokepoints in the system.

This is the framework. The portfolio is how we implement it — and the implementation has to change as the revolution evolves. Knowing when to exit a winner, when to rotate from software to hard assets, when a business model has been disrupted by the very technology you're investing in — that requires being in it every single day.

28 exited campaigns | +39.2% average gain | 79% win rate | Inception: June 2023
The Bottleneck Map
The AI stack has eight layers — and a chokepoint in each
Chips
The silicon foundation no one can replicate
Every AI model runs on advanced semiconductors. The supply chain that produces them has structural monopolies — single companies that control critical steps in the process. One company makes the only machines capable of printing frontier chips. Another provides the design software that architects them. These aren't competitive advantages that can be eroded. They're physics-constrained monopolies that would take a decade and tens of billions of dollars to replicate — if they could be replicated at all.
Power
The constraint that limits everything else
AI data centers consume orders of magnitude more electricity than traditional facilities. Every autonomous agent running around the clock is an inference engine that never sleeps — and every cycle draws power. The AI companies have committed over $650 billion in capex for the year, and a massive share of it is going toward power generation and grid infrastructure. The bottleneck isn't demand for AI. It's the ability to physically generate and deliver the electricity to run it.
Connectivity
If the GPU cluster doesn't connect, the AI doesn't think
Inside every AI data center, thousands of GPUs must communicate at speeds that push the limits of physics. The fiber, the switches, the cables with silicon embedded in the connectors — these are the nervous system of the AI buildout. One connectivity company just signed a $6 billion contract with a single hyperscaler. Another is seeing AI-related revenue grow 110% year-over-year. The demand for bandwidth is growing faster than the ability to install it.
Compute
The cloud layer that can't build fast enough
The hyperscalers and cloud providers are monetizing AI compute capacity the moment it comes online. The signal we've tracked since inception: are the companies supplying AI compute still acting like they don't have enough capacity? Nearly three years in, the answer is still yes, and it's getting more emphatic. One cloud provider is building data center capacity at a pace that projects 65% revenue growth. The others have committed a combined $385 billion+ in capex.
Security & Governance
When agents go autonomous, guardrails become mandatory
Some 150,000 people are now running autonomous AI agents on personal computers, with access to their files, browsers, and systems. Enterprise companies are deploying agent teams with access to real infrastructure. The attack surface has expanded dramatically. And somebody has to be the air traffic controller that prevents an autonomous agent from executing an unverified action. A real-world military operation in early 2026 validated one of these platforms in the most high-stakes environment imaginable.
Data Layer
The intelligence substrate beneath every AI application
Every AI model needs data to train on and data to act on. Every inference cycle generates new data that needs to be stored, queried, and monitored. The database architecture, the monitoring layer, and the warehousing infrastructure form the substrate beneath every AI workload in production.
Frontier
The bridge from digital agents to physical agents
The autonomous agents are digital — operating computers, running workflows, generating output around the clock. The physical agents — humanoid robots, surgical systems, quantum processors — are on a production timeline. When both are running at scale, the economy is producing more output, around the clock, without adding headcount. That's the "limitless economy." One company is building the bridge between the two — and mass production of its humanoid robot starts this summer.
Financial Infrastructure
When agents transact, the rails must be digital
AI agents operating around the clock are making decisions that involve money — paying for compute, executing financial models, settling contracts. The financial infrastructure that supports human-paced, business-hours transactions wasn't built for an economy where agents transact continuously, globally, and autonomously.

The portfolio currently holds 20+ positions across all eight layers of the AI stack — each with a specific thesis, entry price, and catalyst timeline. Members get every position, every research note, and every exit alert.

See the Full Portfolio →
"In a limitless economy, the companies that own the scarce physical resources that power all of that limitless output capture the most asymmetric share of the growth."
— Bryan Rich, AI-Innovation Portfolio research, February 2026
The Track Record
We've been calling the turns since day one
June 2023
"Prepare for the era of more multi-trillion-dollar companies"
We launched the portfolio with a thesis: generative AI would trigger a trillion-dollar retooling of the world's computing infrastructure. We started with the foundational chip companies and built out across the AI stack — lithography, design software, networking, simulation. We identified the structural monopolies that could not be replicated and built positions in each one.
Late 2023 – Mid 2024
Enterprise adoption phase
We added the enterprise software platforms that would be the fastest path for AI to spread inside Corporate America — plus the cloud infrastructure and hyperscale compute layer. The portfolio grew to nearly 20 positions spanning the full AI stack. These early platform plays returned an average of 40% — profits that would later fund our rotation.
Mid 2024
The "ultimate AI stock"
We added a company the market was still valuing as a car manufacturer — but we saw as the bridge between digital and physical AI. The thesis: mass adoption of humanoid robots would make the economy "limitless." At the time, the market hadn't priced in what was actually being built behind the core product.
Early 2025
The scarcity rotation begins
We rebalanced the portfolio — taking profits, resetting to equal-dollar weight, and building open capacity for new additions. Then we started adding physical infrastructure: power generation, chip testing, robotics, nuclear. The portfolio began shifting from software toward hard assets. This was months before the market began making the same rotation.
May 2025
Taking profit at the right time
We exited two of the portfolio's biggest winners, one at a 6x return from our entry. Both were later re-entered at better prices, each for a different reason. Knowing when to sell is as important as knowing what to buy, and it's harder.
February 5, 2026
"AI crossed from thinking to acting"
On a single day, three frontier models launched and the market delivered a verdict: the stocks that rallied were physical infrastructure. The stocks that sold off were enterprise software. We made our biggest portfolio adjustment in a year — four exits and four additions — rotating fully toward scarcity. We exited three enterprise software positions (the per-seat SaaS model was now working against them) and one model-adjacent position at +85%. We added security, governance, and physical connectivity.
March 2026
"Compute equals revenue"
The thesis is confirmed in real time. The most important company in the AI revolution posted a $68 billion quarter and said every dollar of compute capacity added is being monetized the moment it comes online. New AI models can now operate entire computers autonomously. A joint venture between an automaker and an AI lab launched a system capable of "emulating the function of entire companies." Mass production of the first humanoid robot designed for scale starts this summer. The scarcity thesis isn't a prediction anymore. It's the operating reality.
Go Deeper
The forces shaping the AI revolution

Each of these pages explores a critical dimension of the AI buildout — the questions driving the investment landscape, the forces at work, and the framework we use to navigate them.

The framework is free. The portfolio is not.

We publish the thinking so you can evaluate whether it resonates. The positions, the entry prices, the exit alerts, and the daily research that keeps it all current — that's the AI-Innovation Portfolio. 28 exited campaigns at +39.2% average gain since June 2023.

Subscribe — $297/quarter →