Theme Tracker · Live Feed

AI Phase 2 — From Infrastructure to Implementation

Daily refresh · cron 0 8 * * *
Section 1
Daily Overview
Updated —
Today's Read · interpretive frame
Today's #1 Watch · from the news queue
Basket rebased vs SPY · 180d
Top News · latest — stories
Exit Triggers · State · —/— · — watching · — fired · → See Risks
Watch List · Up Next · next 30d
Section 2
Thesis
Stage · Emergent → Expanding
Core Claim
GPU scarcity is over — lead times collapsed from 52 weeks to 16. Hyperscalers spent $524B on AI infra and recognized $207B in AI revenue, leaving a $317B delivery gap. The trade rotates from "who gets GPUs" (Phase 1, consensus) to "who turns GPUs into revenue" (Phase 2, contrarian). Long the implementation layer (PLTR, NOW, DDOG, SNOW, CRWD, ESTC, VEEV) vs short crowded semis (SMH) and AI-disrupted services (ACN, ZM). YTD the spread is underwater but the thesis is not invalidated — Q1 2026 earnings are the first proof-of-work moment.
Current stage: Emergent → Expanding, on the scale Emergent → Expanding → Crowded → Saturated.
Pillar 1 · Demand Anchor
$524B
Hyperscaler 2025 AI capex commitment — the spend side. Meta, Microsoft, Alphabet, Amazon combined. A floor the implementation layer sits on.
Anchored
Pillar 2 · Delivery Gap
$317B
Spend minus recognized AI revenue. This is the addressable pool for implementation vendors who convert compute into software outcomes.
The Prize
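The three pillar figures tie together arithmetically; a one-line sanity check, using only the dollar amounts stated in the thesis above (all in $B):

```python
# Consistency check on the pillar figures (all in $B, from the thesis text)
capex = 524        # Pillar 1: hyperscaler 2025 AI capex commitment
ai_revenue = 207   # recognized AI revenue (the deployer side)
delivery_gap = capex - ai_revenue
print(delivery_gap)  # 317 -> Pillar 2, the addressable pool
```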
Pillar 3 · Lead Time
52w → 16w
GPU lead-time collapse ends Phase 1. Supply arrived; now enterprises have to prove they can use it. Builders lose pricing power to deployers.
Phase 1 ending
Core Tension · Builder vs Spender
CAPEX $524B · Builders
REVENUE $207B · Deployers
Basket vs SPY · rebased to 100
Basket (long − short) SPY
Basket · — core · — extended
reviewed quarterly · benchmark SPY
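Rebasing to 100 just scales each series by its first value so the basket and SPY start at the same point; a minimal sketch (the price series below are made-up illustrations, not tracker data):

```python
def rebase_to_100(prices):
    """Scale a price series so its first point equals 100."""
    base = prices[0]
    return [100.0 * p / base for p in prices]

# Illustrative values only, not actual basket or SPY levels
basket = rebase_to_100([52.0, 49.4, 55.1])
spy = rebase_to_100([430.0, 441.0, 452.0])
spread = [b - s for b, s in zip(basket, spy)]  # long-short spread vs benchmark
```

Both rebased series start at 100.0, so the panel reads as relative performance since the window start.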
Section 3
Value-Chain Teardown
4 layers · implementation over infrastructure
The cash flow is moving downstream: infra → platform → workflow → vertical — payoff compresses to layers 3 and 4.
L1 · Infrastructure · Phase 1 · Short: SMH · Crowded consensus — priced for perfection, losing catalysts.

Why this matters

H100 lead time compressed from 52 weeks to 16 weeks — the bottleneck that justified the 2023-2025 GPU melt-up is gone. Every marginal GPU now has to earn its cost of capital in a workload, not in a backlog.

Thesis link

Pillar 3 — lead time. End of Phase 1.

What to watch

NVDA DC guide vs street · hyperscaler capex growth rate · SMH breadth.

Exit trigger

References trigger #2 — NVDA beats by >15% for 2 consecutive quarters means Phase 1 extends and we're early.
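Trigger #2 is a streak condition; one way to encode it as a check (the function name and input format are hypothetical, not the tracker's actual schema):

```python
def nvda_beat_streak(beat_pcts, threshold=15.0, quarters=2):
    """True if each of the last `quarters` earnings beats exceeded `threshold` percent."""
    recent = beat_pcts[-quarters:]
    return len(recent) == quarters and all(b > threshold for b in recent)
```

Fed the last few quarterly beat percentages, it fires only on two straight >15% beats — the condition under which Phase 1 is judged to be extending.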

L2 · Data & Platform · Phase 2 bridge · Long: SNOW · ESTC · Useful but commoditizing — platform moat narrow.

Why this matters

Cortex (SNOW) and ESRE (ESTC) monetize the RAG layer — the bridge between compute and workflow. This is where the cheapest unit of "AI revenue" gets recognized, but also the easiest to price-compete.

Thesis link

Pillar 2 — delivery gap, mid-capture.

What to watch

Cortex consumption · ESRE attach · NRR stability above 120%.

Exit trigger

If SaaS NRR avg drops below 100% (cross-reference risk #6 on White-Collar tracker) — Phase 2 starts losing software gravity.

L3 · Workflow & Application · Core payoff · Long: PLTR · NOW · DDOG · CRWD · Where compute becomes revenue. Highest thesis conviction.

Why this matters

PLTR AIP turns the $317B delivery gap into contracted software backlog. NOW Now Assist auto-resolves tier-1 tickets and shows up as margin expansion. DDOG's LLM observability grows 3× core — every AI workload needs monitoring. CRWD's Charlotte is the template for vertical agents.

Thesis link

Pillar 2 — delivery gap captured. This is the P&L conversion layer.

What to watch

PLTR commercial NRR · NOW seats-per-customer · DDOG LLM-obs ARR · CRWD module attach.

Exit trigger

References trigger #3 (PLTR NRR <130% TTM) and trigger #5 (DDOG AI cohort net-adds <30% YoY).

L4 · Vertical AI · Narrow but durable · Long: VEEV · Short: ACN · ZM · Proprietary data wins; billable hours lose.

Why this matters

VEEV's pharma data moat is the case study for durable vertical AI. ACN is the mirror image — a billable-hour model that Gen-AI compresses structurally. ZM is a low-moat SaaS trapped by GPT-wrappers. The long/short pair captures the whole direction of travel for services.

Thesis link

Pillar 2 — end state of the delivery gap. Winners have proprietary workflow data; losers sold labor.

What to watch

VEEV vault-AI attach · ACN managed-services mix · ZM contact-center attach.

Exit trigger

References trigger #4 — if ACN raises FY guidance twice in 12 months, cover the short side.

Section 4
Risks & Invalidation
— exit triggers · ANY FIRING → RE-EXAMINE THESIS · NOT AN AUTO-CUT
Rule of use: any single trigger firing is a prompt to re-examine the thesis — not an auto-cut. Triggers with a shared driver (e.g. two capex-related events) require corroboration before a position change. Metric-driven triggers use filing line items, not market prices we picked.
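The rule of use is a small decision function; a minimal sketch of the triage logic, assuming each fired trigger is tagged with its driver (the pair format and labels are illustrative, not the tracker's data model):

```python
def triage(fired):
    """fired: list of (trigger_id, driver) pairs for triggers that have fired.
    Any firing prompts re-examination; only two fired triggers sharing a
    driver (e.g. two capex-related events) corroborate a position change."""
    if not fired:
        return "hold"
    drivers = [driver for _, driver in fired]
    if any(drivers.count(d) >= 2 for d in set(drivers)):
        return "corroborated: review positions"
    return "re-examine thesis (no auto-cut)"
```

A lone capex trigger only re-opens the thesis; a second capex-driven firing is what escalates to a position review.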
Watchlist · — KPIs feed the exit triggers above · latest · prior · prior−2
KPI Ticker Basis Latest Prior Prior−2 Alert threshold Feeds #