Investment Thesis
The defining voice of Sequoia's AI thesis and arguably the most influential AI investor-thinker in Silicon Valley. Co-authors a series of landmark essays with Pat Grady that have become the canonical frameworks for understanding generative AI's evolution. Core conviction: the application layer — not foundation models — is where enduring value accrues. Foundation models are commoditizing rapidly (only five scaled players remain: Microsoft/OpenAI, Amazon/Anthropic, Google, Meta, xAI), so the real opportunity is in vertical AI applications that solve end-to-end human problems. Believes we have entered the 'Age of Abundance' where AI makes once-scarce labor available everywhere at near-zero cost, transforming the addressable market from software ($1T) to services ($10T+). In this world, 'taste' — the human judgment to decide what to build and how — becomes the scarcest resource. Her January 2026 essay declares AGI is functionally here in the form of long-horizon agents that can sustain multi-step work, correct errors, and persist toward goals autonomously. Sequoia has deployed roughly $150M into foundation models but over $1.5B into application-layer companies, reflecting a 10:1 bet on applications over infrastructure.
What Excites Them
Founders building AGI-native companies where AI agents function as autonomous colleagues, not assistants. Products that solve problems end-to-end from the customer back (not technology-out demos). Companies riding the Age of Abundance — making previously expensive professional services (legal, medical, financial) accessible to everyone at near-zero marginal cost. Small, agile teams that use AI to compete at the scale of legacy enterprises. Vertical AI with deep domain expertise and proprietary data moats (company-specific knowledge graphs, embedded workflows).
What They Pass On
AI wrappers without technical depth or defensibility. Companies where AI is a feature bolted on, not the core architecture. Products competing purely on model quality without application-layer differentiation. 'AI-washed' pitch decks — adding an AI slide without genuine AI-native architecture. Lightweight novelty apps that demonstrate cool technology but lack retention and daily-active-user engagement. Companies that cannot articulate why they win in a world where foundation models are commoditized.
How to Pitch
Frame your company within her published frameworks — reference the specific Act you believe your company exemplifies. Show you are building from the customer-back (Act Two), not technology-out (Act One). Demonstrate why your product is AGI-native, not just ML-enhanced. If you are building a vertical agent, explain your end-to-end workflow, your data moat, and your RL/synthetic data training approach. Show retention metrics and daily engagement, not just signups. She is deeply analytical (Princeton economics, Goldman, TPG) — come with data, a clear thesis about 'why now,' and a view on how you capture value as foundation models commoditize. Reference the Age of Abundance framing: show how your company makes previously expensive services accessible at near-zero marginal cost. Do not AI-wash your deck — she and her co-investors see through it immediately. A small team leveraging AI tools effectively is a positive signal.
Notable Writing
Generative AI's Act Two (Sequoia Capital, September 2023): Act 1 came from the technology-out — foundation models as a new hammer generating a wave of novelty apps. Act 2 must come from the customer-back, solving real human problems end-to-end. The biggest challenge is not finding use cases but proving lasting value — retention, not novelty. Invoked Amara's Law: we overestimate technology in the short run and underestimate it in the long run. As foundation models commoditize, the real value shifts to the application layer (product, UX, workflows). Published alongside the V3 generative AI market map.
Generative AI: A Creative New World (Sequoia Capital, September 2022): Introduced the generative AI market map and thesis. Argued that a new class of AI had emerged that creates rather than merely analyzes — shifting the marginal cost of creation toward zero. Every industry requiring original human work (coding, design, law, marketing, gaming) is up for reinvention. Outlined four waves of development from transformer breakthroughs to killer app emergence. Became the foundational reference document for the entire generative AI ecosystem.
Generative AI's Act o1 (Sequoia Capital, October 2024): AI is progressing from 'thinking fast' (rapid pre-trained pattern matching, System 1) to 'thinking slow' (deliberate reasoning at inference time, System 2). The reasoning layer — inspired by AlphaGo-style approaches — endows AI with problem-solving that goes beyond pattern recognition. This enables 'service-as-a-software,' where the addressable market expands from the $1T software market to the $10T+ services market. The foundation layer has stabilized to five scaled players. Reasoning models are strong on logic-proximate domains (coding, math, science) but still developing on open-ended domains (writing, strategy).
AI's $600B Question (Sequoia Capital, 2024): Identified a massive gap between the revenue expectations implied by the AI infrastructure build-out (projected from NVIDIA's data center revenue run rate) and actual revenue growth in the AI ecosystem. The AI industry needs to generate $600 billion annually in revenue to justify current hardware spending levels — raising the question of whether this is sustainable or a bubble.
2026: This Is AGI (Sequoia Capital, January 2026): Declares that AGI is functionally here. Long-horizon agents — AI systems that can sustain multi-step work, correct their own errors, and persist toward goals autonomously — are the realization of AGI. Coding agents are the first proof point. One litmus test: can you hire an agent? In 2023-2024 AI apps were chatbots; in 2026-2027 they will be doers that feel like colleagues. Usage shifts from a few queries per day to all-day, every-day autonomous work. Human roles shift from executor to manager of AI teams.
If 2024 was the 'primordial soup' year for AI, the building blocks are now firmly in place. Data centers are the new rails of the digital economy and will be largely built out by the end of 2025. Five finalists emerged from the big model race (Microsoft/OpenAI, Amazon/Anthropic, Google, Meta, xAI). AI search will proliferate (Perplexity hit 10M MAU). The key question shifts from 'can we build it?' to 'what freight will ride on those rails?'
On AI Synesthesia (Sequoia Capital, April 2025): AI has reached its 'synesthesia moment' — models that natively understand and generate across modalities (text, image, code, video, audio, voice) in a unified latent space. AI synesthesia converts strengths in one cognitive domain into capabilities in another: if you write well but cannot code, AI bridges the gap through semantic representations; if you design beautifully but struggle to pitch verbally, AI transforms sketches into narratives. Creativity becomes translation, expression becomes multidimensional, and intelligence becomes fluid.
Podcast Appearances
No data yet
Key Quotes
“We're entering the Age of Abundance — where AI makes once-scarce labor available everywhere at near-zero cost.”
— AI Ascent 2025, May 2025
“The application layer is where value finally comes together.”
— AI Ascent 2025, May 2025
“Coding has reached screaming product-market fit.”
— AI Ascent 2025, referencing Cursor's trajectory from zero to $500M ARR in under 18 months
“Long-horizon agents are functionally AGI, and 2026 will be their year.”
— 2026: This Is AGI, Sequoia Capital, January 2026
“In 2023-2024, apps were chatbots. In 2026-2027, they will be doers. They will feel like colleagues.”
— 2026: This Is AGI, Sequoia Capital, January 2026
“One litmus test for AGI is the ability to hire an agent.”
— 2026: This Is AGI, Sequoia Capital, January 2026
“Act 1 came from the technology-out. We discovered a new 'hammer' — foundation models — and unleashed a wave of novelty apps. Act 2 will come from the customer-back and will solve human problems end-to-end.”
— Generative AI's Act Two, Sequoia Capital, September 2023
“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”
— Generative AI's Act Two (invoking Amara's Law), September 2023
“Every industry that requires humans to create original work — from social media to gaming, advertising to architecture, coding to graphic design, product design to law, marketing to sales — is up for reinvention.”
— Generative AI: A Creative New World, Sequoia Capital, September 2022
“The data centers will be built by 2026. The foundation models are largely set. The question is: what gets built on top?”
— AI Ascent 2025, May 2025
“AI synesthesia converts strengths in one cognitive domain into capabilities in another. Creativity becomes translation, expression becomes multidimensional, and intelligence becomes fluid.”
— On AI Synesthesia, Sequoia Capital, April 2025
“We've moved from copilots to autopilots, and now we're entering the era of air traffic control.”
— Panel discussion, 2025
“What will you do when your plans are measured in centuries?”
— On the provocation behind the AGI thesis, 2026
Background
AB in Economics summa cum laude from Princeton University. Started career as an Investment Banking Analyst at Goldman Sachs, then Private Equity Associate at TPG Global. The TPG experience was formative — analyzing large-scale business transformations and understanding how technology adoption drives enterprise value — but she grew frustrated with private equity's reactive approach and 'wanted to look forward' to back companies creating entirely new categories. Recruited by Sequoia Capital in 2018 to join its growth investing practice focusing on enterprise software and data infrastructure. Has since become the intellectual architect of Sequoia's AI strategy. Organizes and hosts Sequoia's annual AI Ascent conference (launched 2023), an invite-only gathering of 150+ top AI founders and researchers. Co-hosts the 'Training Data' podcast with Pat Grady, interviewing leading AI builders and researchers.