1. Snapshot and thesis
Browser wars are happening again! This time with AI fueling the frenzy. AI browsers blend a familiar web browser with built-in AI that can summarize pages, answer questions, auto-navigate, draft content, and increasingly act for the user.
Over the next 24 months, this category will test whether AI-first workflows can carve out share in an entrenched browser market. And whether they can monetize beyond traditional search ads and create new attachments across the cloud and edge stack.
Core thesis: The investable opportunity is attractive if (a) AI browsers can win distribution on mobile and desktop despite defaults, (b) they can lower inference costs using in-browser compute and edge inference, and (c) they can convert intent (and time saved) into measurable revenue (ads, subscriptions, affiliate, or payments).
The upside expands because this category sits on top of and helps drive demand for a number of infra layers: GPUs, WebGPU / ONNX Runtime Web / TensorFlow.js, model APIs (OpenAI, Anthropic, Google), edge inference platforms (Cloudflare, Akamai), and search indexes (Brave Search, Bing, Perplexity’s own).
Why now: Two enabling changes have landed: (1) web-native acceleration (WebGPU, maturing WASM toolchains, early WebNN), so models can run in the browser, and (2) policy/UX shocks that open distribution (EU DMA forcing Apple to allow alternative engines in the EU, Google’s AI Overviews changing search result pages and traffic patterns).
2. What counts as an “AI browser” and who are the players
In this report, “AI browsers” will include any web browser or browser-like app where AI is a native control surface (not just an extension). That spans:
Established browsers adding AI:
Google Chrome: AI Overviews in search; “Help me write,” “Tab Organizer”.
Microsoft Edge: Copilot built-in
Opera: Aria assistant (via its Composer layer) on desktop, Android, and Opera Mini; Neon, an AI-native browser slated for public rollout; and an integrated MiniPay crypto wallet.
Brave: Leo assistant, independent Brave Search, opt-in Brave Ads with 70% revenue share to users
Mozilla Firefox: Exploring on-device and private AI assistants. Firefox remains a distribution gate even as AI integrations evolve. Context on iOS engine policy below.
AI-native challengers:
Perplexity: Comet AI browser. Talks with OEMs for preinstalls. Consumer MAU and revenue run-rate scaled markedly in 2025.
BrowserBase: A web browser for AI agents. They recently teamed up with Cloudflare to build an identity layer for AI agents.
Anthropic: They recently launched Claude for Chrome.
The Browser Company (Arc): AI-heavy “Browse for me” style features and agentic flows.
SigmaOS (workflow browser with AI co-pilot). Felo/Fellou and Strawberry (smaller AI browser entrants).
You.com: AI search + browser-style app. Enterprise agents and a web search API.
DuckDuckGo: DuckAssist summaries and private AI chat embedded in its app.
Regional ecosystems:
Baidu is folding ERNIE models into search and companion apps. Yandex upgraded its Alice assistant with YandexGPT 3 across devices. These ecosystems point to AI browsers that are tightly integrated with local search, messaging, and mini-apps.
3. Market size and trajectory (next 24 months)
Installed base: Browsers are the biggest consumer runtime. Statcounter shows Chrome at ~68%, Safari ~16%, Edge ~5%, with others (Brave, Opera, Firefox, Samsung Internet) making up the balance (12-month window ending July 2025). That reach sits on top of a global internet user base of ~5.5–5.65 billion in 2024–2025.
Penetration starting point: AI answers already reach users through default surfaces (Google AI Overviews, Bing/Copilot milestones such as 100 million DAU in 2023), while dedicated AI browsers are in early innings with meaningful momentum: Perplexity reported ~30 million MAU and a ~$150 million ARR-like run-rate by mid-2025. Brave discloses ~93.8 million MAU. Opera reported accelerating ad/search growth with AI-powered monetization and an AI-native Neon rollout in 2025.
Sizing the near-term “AI browser” revenue pool. This is not a classic TAM. It’s layers:
Ad/search monetization inside the browser: Sponsored answers, native ad formats, and affiliate/commerce. Google’s AI Overviews shift click-through patterns. Publishers report measurable volatility, which implies spend reallocations toward answer surfaces.
Subscriptions: Perplexity’s premium, Brave Leo Premium, VPN bundles, and potential “agent” fees.
Payments/commerce baked into browsers: Opera MiniPay crossing 8–9 million activated wallets and >200–250 million transactions in 2025.
Back-of-envelope scenario (illustrative): If only 2% of the world’s internet users (~110 million) adopt an AI browser subscription at a blended $4/month within 24 months, that’s ~$5.3 billion annualized. If another 300 million users generate $1/month in incremental ad/affiliate yield via AI answers and shopping flows, that’s ~$3.6 billion annualized. These are illustrative assumptions meant to show order of magnitude; actual outcomes hinge on distribution and inference cost curves.
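For readers who want to stress-test the math, here is the same arithmetic as a tiny sketch (all inputs are the illustrative assumptions above, not forecasts):

```typescript
// Illustrative scenario calculator; inputs mirror the assumptions above.

// Subscription layer: ~110M users (≈2% of global internet users) at $4/month blended
const subscribers = 110e6;
const subscriptionRevenue = subscribers * 4 * 12;    // ≈ $5.3B annualized

// Ad/affiliate layer: 300M users generating $1/month in incremental yield
const adUsers = 300e6;
const adRevenue = adUsers * 1 * 12;                  // ≈ $3.6B annualized

console.log(`Subscriptions: ~$${(subscriptionRevenue / 1e9).toFixed(1)}B/yr`);
console.log(`Ads/affiliate: ~$${(adRevenue / 1e9).toFixed(1)}B/yr`);
```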
Growth drivers:
a. Distribution policy changes (EU DMA, choice screens on iOS 17.4 in the EU, and potential antitrust remedies around default search contracts in the US).
b. Lower latency and cost thanks to WebGPU + ONNX Runtime Web/TensorFlow.js and edge inference (Cloudflare, Akamai).
c. New OEM channels for AI browsers on mobile (Perplexity’s Comet preinstall talks).
4. Who uses these and why they care
For consumers, the job to be done is “get answers, not links” plus “summarize this page, plan this trip, buy the thing, and do it fast”. AI browsers like Perplexity and Opera Aria collapse the search-read-copy-paste loop. And in Opera Mini, they do it even on low-end Android devices via lightweight UIs.
For prosumers and teams, the pitch is “do more with fewer tabs”. Arc and SigmaOS focus on workspace-style browsing with AI co-pilots to organize and draft. And You.com pushes enterprise agents and a web search API to wire AI into regulated workflows.
For privacy-sensitive users and publishers, Brave’s independent index and Leo assistant try to keep data local and links credited. DuckDuckGo keeps AI optional and private.
5. Product and technology: how the stack fits together
Client-side acceleration. WebGPU (default in Chrome/Edge since 2023–2024) plus WASM allows meaningful on-device inference for vision, speech, and small-to-mid LLMs. Microsoft’s ONNX Runtime Web added a WebGPU execution provider in Feb 2024. TensorFlow.js continues to expand WebGPU support. Early WebNN bridges to native ML APIs. This reduces cloud spend, improves latency, and eases privacy concerns.
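For a concrete feel, here is a minimal sketch of targeting the WebGPU execution provider via ONNX Runtime Web with a WASM fallback (the model path, input name, and shapes are placeholders, not any specific product’s pipeline):

```typescript
import * as ort from "onnxruntime-web";

// Prefer WebGPU where the browser exposes it; fall back to WASM (CPU) otherwise.
// "/models/summarizer.onnx" and the "input_ids" feed name are placeholders.
async function createSession(): Promise<ort.InferenceSession> {
  return ort.InferenceSession.create("/models/summarizer.onnx", {
    executionProviders: "gpu" in navigator ? ["webgpu", "wasm"] : ["wasm"],
  });
}

async function run(session: ort.InferenceSession, tokenIds: number[]) {
  // int64 inputs are passed as BigInt64Array tensors.
  const input = new ort.Tensor(
    "int64",
    BigInt64Array.from(tokenIds.map(BigInt)),
    [1, tokenIds.length]
  );
  return session.run({ input_ids: input });
}
```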
Model options. Opera’s Aria routes to OpenAI and Google through its Composer layer (and can switch models), while Brave’s Leo supports leading closed and open models. Perplexity uses a blend (its own retrieval stack plus partner models) and its Comet browser aims to bring that into the navigation layer.
Open-source tooling (browser-ready).
WebLLM / MLC-LLM (LLMs compiled for WebGPU, in-browser quantization).
Transformers.js (browser-side transformer inference with JS).
llama.cpp (CPU/GPU-friendly inference, with ports to web via WASM/WebGPU).
ONNX Runtime Web and TensorFlow.js (core runtime layers, increasingly WebGPU-accelerated).
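A minimal sketch of the kind of client-side call these toolchains enable, here with Transformers.js (assuming the @xenova/transformers package; the model choice and the crude page-text extraction are illustrative):

```typescript
import { pipeline } from "@xenova/transformers";

// Download a small summarization model once (cached by the browser afterwards)
// and run inference entirely client-side. Model name is illustrative.
const summarize = await pipeline("summarization", "Xenova/distilbart-cnn-6-6");

const pageText = document.body.innerText.slice(0, 4000); // crude page extraction
const [result] = (await summarize(pageText, {
  max_new_tokens: 120,
})) as Array<{ summary_text: string }>;

console.log(result.summary_text);
```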
Edge inference. Cloudflare’s Workers AI and AI Gateway, and Akamai’s Cloud Inference, push model serving closer to users to cut tail latency and cost. For AI browsers, that shortens round-trips for page-aware actions (summaries, “shopping compare”, “book this”) and creates an infra partnership surface (shared caches, embeddings, and guardrails at the edge).
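To make the edge pattern concrete, a minimal sketch of a summarization endpoint on Cloudflare Workers AI (the binding shape and model slug are assumptions; treat it as the general shape, not a production handler):

```typescript
// Cloudflare Worker: page-aware summaries served from the edge.
// The AI binding is configured in wrangler config; the model slug is an assumption.
export interface Env {
  AI: { run(model: string, input: unknown): Promise<unknown> };
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { text } = (await request.json()) as { text: string };

    const result = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      messages: [
        { role: "system", content: "Summarize the page in three bullet points." },
        { role: "user", content: text.slice(0, 6000) },
      ],
    });

    return Response.json(result);
  },
};
```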
OS hardware tailwinds (indirect but relevant). Copilot+ PCs ship with NPUs. Apple Intelligence targets A17 Pro and M-series devices. While browsers mainly use the GPU via WebGPU, the hardware trend signals more capable local inference and rising user expectations for on-device AI.
6. Distribution and competition
Defaults still matter. Safari and Chrome together exceed 80% market share worldwide. Moving users off defaults is hard. That’s why Perplexity is negotiating preinstalls and why EU choice screens (iOS 17.4) matter. For private companies, OEM deals and regional bundling can be the difference between niche and mainstream.
Search stacks split the field.
Google: AI Overviews change SERP layouts and upstream supply/demand. The company controls both Chrome and Search.
Microsoft: Edge + Copilot + Bing give an integrated alternative. Bing crossed 100 million DAU when Chat launched in 2023, a psychological threshold that keeps the flywheel turning.
Independents: Brave has its own index (plus Brave Search Ads), a differentiator vs meta-search. Perplexity is building index and retrieval infra and layering an agentic browser on top. BrowserBase is building web browsers for agents and applications.
Regional: Baidu (ERNIE in search), Yandex (Alice/YandexGPT) bundle AI across services.
Notable financial traction and signals.
Perplexity: ~30 million MAU and ~$150 million run-rate as of mid-2025 (press reporting).
BrowserBase: 50 million+ browser sessions in 2025, serves 1,000+ companies, and has 20,000+ developers signed up.
Brave: ~93.8 million MAU (company transparency page, 2025).
Opera: Q2 2025 revenue +30% YoY; ad revenue +44% YoY; MiniPay >9 million activated wallets and >250 million transactions; AI-native Neon moving toward rollout.
7. Monetization, unit economics, and paths to profit
Ad and commerce yield. AI answers can capture high-intent queries (e.g. comparisons, local services, products) and monetize via native ads, affiliate, or merchant lead gen. Google’s AI Overviews already alter click flows. Some publishers report big swings, a signal that spend may reallocate toward answer units where the decision happens. For challengers, the question is whether AI answers can command CPC/CPA pricing comparable to today’s SERP ads at scale.
Subscriptions. Perplexity Plus (and Comet tiers), Brave Leo Premium, and bundles (VPN, talk, search premium) provide diversified ARPU. Subscriptions smooth out the inherently cyclical ad budgets and create a budget for inference.
Revenue sharing and tokens. Brave shares 70% of ad revenue from opt-in Brave Ads with users via BAT. It’s one of the few browsers with a transparent user revenue share. For investors, this is a lever for adoption and loyalty, but it shifts margin from company to user.
Inference cost curve. Cloud-only answer generation is expensive. But two trends are bending the curve:
a. In-browser inference offloads smaller and latency-sensitive tasks (summaries, RAG, speech) using WebGPU and WASM. ONNX Runtime Web’s WebGPU provider launched in February 2024.
b. Edge inference (Cloudflare, Akamai) trims tail latency and egress while enabling semantic caching. Fastly’s AI Accelerator (semantic caching) illustrates the caching/gateway layer that can sit in front of expensive LLM calls.
Unit economics (directional). If an AI browser session involves 1–2 short model calls (RAG + summary) that can be handled locally or at the edge for pennies, and premium tasks go to GPT-4o/Claude/Gemini only when needed, then gross margins can look similar to ad-supported browsers with improved attach on premium. The mix of local/edge/cloud will be the dominant driver of gross margin over the next 24 months.
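A minimal sketch of that sensitivity, with invented per-call costs purely to show how the local/edge/cloud mix swings gross margin:

```typescript
// Directional margin model. All prices are invented placeholders, not quotes.
interface CostMix {
  local: number; // share of calls handled in-browser (≈ $0 marginal cost)
  edge: number;  // share served by edge inference
  cloud: number; // share routed to frontier model APIs
}

const COST_PER_CALL = { local: 0.0, edge: 0.002, cloud: 0.02 }; // $ per call, assumed
const CALLS_PER_USER_MONTH = 60;  // ~2 short calls per day, assumed
const ARPU_MONTH = 2.0;           // blended subscriptions + ads, assumed

function grossMargin(mix: CostMix): number {
  const costPerCall =
    mix.local * COST_PER_CALL.local +
    mix.edge * COST_PER_CALL.edge +
    mix.cloud * COST_PER_CALL.cloud;
  const cogs = costPerCall * CALLS_PER_USER_MONTH;
  return (ARPU_MONTH - cogs) / ARPU_MONTH;
}

console.log(grossMargin({ local: 0.6, edge: 0.3, cloud: 0.1 })); // ~0.92
console.log(grossMargin({ local: 0.0, edge: 0.2, cloud: 0.8 })); // ~0.51
```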
8. Dependencies, hidden connections, and infra correlations
To GPUs (and GPU-like APIs). WebGPU exposes the device GPU to the web. Everything from WebLLM to ONNX Runtime Web depends on it. That ties AI browser performance to Chrome/Edge release cadence (and to Metal/Vulkan/DirectX under the hood), raising the value of teams that can squeeze performance from shader kernels and quantization.
To model vendors. Opera Aria’s Composer taps OpenAI and Google. Edge integrates Copilot (OpenAI-family models). Perplexity blends its retrieval with partner models. Contract pricing, rate limits, and safety policies at OpenAI, Google, and Anthropic directly affect the UX and gross margin of many AI browsers.
To search indexes. Independence matters. Brave’s own index reduces reliance on Bing. Perplexity is investing in its own web data and partnerships (Comet + OEM). Contract changes at Bing or Google could whipsaw smaller players.
To edge networks. Cloudflare’s Workers AI/Gateway and Akamai’s Cloud Inference make agentic browsing feel instantaneous and cheaper. Expect deeper commercial tie-ups (shared semantic caches, RAG stores at the edge, abuse prevention) between AI browsers and these networks.
To mobile OEMs and app stores. Perplexity’s preinstall talks hint at a classic “default wars” playbook. DMA-driven choice screens on iOS in the EU open a wedge for challengers who design great first-run flows and import tools. These are leverage points for venture-backed challengers.
Correlations to watch.
Ad market health <> AI answer RPMs. If SERP budgets migrate into AI answer units, browsers that own the answer surface will capture outsized upside.
WebGPU maturity <> local inference share. As WebGPU and WebNN mature, more workload moves on-device, improving margins and privacy.
Policy changes <> install base churn. DMA-style changes and U.S. remedies on default contracts could shift share faster than organic marketing can.
9. Risks and how to read them early
Distribution lock-in. Chrome and Safari dominate. Even great products struggle to overcome defaults and habit. Early warning: OEM deals failing to convert, short-lived spikes post-PR, and low stickiness after first-run.
Traffic and publisher backlash. AI answers that hoover up demand without sending clicks risk regulatory and ecosystem pushback: lawsuits, policy proposals, and an uptick in paywalled content and blocked crawlers. Publishers have already reported volatility post-AI Overviews.
Safety and compliance. Browsers delivering AI answers at scale must handle hallucinations, defamation, and local content controls. Edge caches can repeat bad answers faster. Human-in-the-loop and retrieval quality become key controls.
Inference cost blowouts. If workloads stay cloud-heavy, COGS can swamp subscription ARPU. Watch the ratio of local/edge/cloud calls, and follow releases from ONNX Runtime Web/TensorFlow.js and edge providers that measurably cut costs.
Mobile platform friction. Apple’s EU engine carve-out helps, but outside the EU the WebKit requirement remains. Android OEM deals can be fragile. Track Apple/WebKit changes, iOS adoption of alternative engines (EU-only), and OEM preinstall terms.
Regional complexity. In China and parts of the CIS, local giants (Baidu, Tencent, Yandex) integrate AI into super-apps and search, making it hard for foreign AI browsers to grow. Follow ERNIE, Hunyuan, DeepSeek integrations across search, messaging, and app stores.
10. Company snapshots (what’s differentiated)
Google (Chrome + AI Overviews). Control over the browser and the ad engine with AI Overviews shifting “where decisions get made”. The risk is publisher backlash and regulatory scrutiny if traffic declines persist.
Microsoft (Edge + Copilot). “Assistant in the browser” is clear. Crossing 100 million DAU on Bing in 2023 showed a step-change in engagement once chat arrived. The new Copilot+ PC push raises user expectations for local AI, which can complement WebGPU workloads in Edge.
Opera (Aria, Neon, MiniPay). Clear AI narrative, strong emerging-market exposure via Opera Mini, and a fintech angle through MiniPay (8–9 million wallets, >200–250 million transactions). Q2 2025 had 30% revenue growth, 44% ad growth, and Neon on the horizon. Execution on Neon and MiniPay monetization are the key tells.
Brave (Leo, Search, BAT). A vertically integrated, privacy-centric stack (browser + index + ads) with ~93.8 million MAU and a distinctive user revenue-share model (70% to users for Brave Ads). Watch whether Leo and premium bundles lift ARPU without undercutting ad take.
Perplexity (Comet, OEM). A consumer AI brand moving down-funnel into a browser. MAU and revenue grew fast in 2025. OEM preinstalls could be a distribution unlock. Execution risks are quality at scale, cost containment, and navigating platform politics.
BrowserBase (Director, Stagehand). In June 2025 it closed a $40 million Series B led by Notable Capital and launched Director, a no-code web-automation product, signaling broader demand beyond developers. The company says it has supported 50 million+ browser sessions in 2025, serves 1,000+ companies, and has 20,000+ developers signed up, with 100 million+ usage minutes billed per month, hundreds of paying customers, “millions in revenue in its first year”, and a 17% month-over-month increase in active subscribers after a pricing tweak. The Stagehand TypeScript repo shows ~16.7k GitHub stars (with companion repos like the MCP server at ~2.5k and open-operator at ~1.8k).
You.com / DuckDuckGo / Arc / SigmaOS / Strawberry / Felo. These round out the spectrum from enterprise agents (You.com) to private AI answers (DuckAssist) to workflow-first browsers (Arc, SigmaOS) and niche AI browsers. Traction will hinge on a wedge (enterprise compliance, private AI, or a unique workflow) and a durable acquisition channel.
Regional giants (Baidu, Yandex). Tight integration with search, mini-apps, and super-apps (Alice, ERNIE) produces “AI browsers” that are really gateways into national ecosystems. Good defensive moats locally. Tough export story.
11. Ties to the infra layer and how value flows upstream
GPU and driver stacks. Every time an AI browser runs a local summary with WebGPU, that’s incremental demand for GPU-capable devices and the driver/runtime work behind them (DirectX 12, Vulkan, Metal). When more work happens locally, latency shrinks and conversion improves — creating a measurable ROI story that justifies GPU-capable endpoints.
Model APIs and gateways. AI browsers are high-variance demand generators for OpenAI, Anthropic, and Google APIs. Edge gateways (Fastly AI Accelerator, Cloudflare AI Gateway) smooth demand with semantic caching, hydrate RAG stores, and cap spend bursts — a surprisingly material piece of the puzzle for unit economics.
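A minimal sketch of the semantic-caching idea behind these gateways (the similarity threshold and the embedding/LLM calls are placeholders; real gateways add TTLs, per-tenant isolation, and safety re-checks):

```typescript
// Toy semantic cache: reuse an earlier answer when a new query is "close enough".
type Embed = (text: string) => Promise<number[]>; // provided by an embedding model

interface CacheEntry { embedding: number[]; answer: string }

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] ** 2; nb += b[i] ** 2; }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

async function answer(
  query: string,
  cache: CacheEntry[],
  embed: Embed,
  callLLM: (q: string) => Promise<string>,
  threshold = 0.92 // similarity cutoff, assumed
): Promise<string> {
  const qVec = await embed(query);
  const hit = cache.find((e) => cosine(e.embedding, qVec) >= threshold);
  if (hit) return hit.answer;            // cache hit: no LLM spend
  const fresh = await callLLM(query);    // cache miss: pay for one generation
  cache.push({ embedding: qVec, answer: fresh });
  return fresh;
}
```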
Edge inference/CDN. Akamai’s Cloud Inference (claims of 3x throughput and up to 2.5x lower latency) is built exactly for the “answer right now” patterns AI browsers need. Expect joint solutions, e.g. shared embeddings and abuse/fraud screens.
Search indexes and crawling. Brave’s independent index and Perplexity’s retrieval investments reduce dependence on Bing/Google contracts. If remedies in the U.S. limit default search payments, the browser layer becomes a contested distribution node. This raises the strategic value of owning the index.
Open-source toolchains. WebLLM/MLC-LLM, Transformers.js, llama.cpp, ONNX Runtime Web, and TensorFlow.js are now production-relevant. They compress costs and unlock offline/private modes. Infra vendors that optimize for these (developer tooling, observability, safety filters) will capture spend.
Hidden connection: as AI answers shift intent capture from SERP pages to “in-browser panels”, affiliate and performance marketing networks (and the edge CDNs that carry them) become part of the infra story: rate-limiters, link-resolvers, and fraud filters move closer to the browser. Fastly’s semantic caching is a preview. Expect Cloudflare, Akamai, and even payment processors to offer “AI answer commerce” kits.
12. Regulation and platform dynamics
EU DMA and iOS engines. Apple opened the door to non-WebKit engines in the EU (iOS 17.4), which can shift mobile distribution for AI browsers there. Outside the EU, WebKit remains required, tempering feature parity (e.g. WebGPU capabilities) for iOS users.
U.S. search remedies. Proposed measures target default search contracts and possibly Chrome-search bundling. Outcomes could reshape browser distribution economics. Timing matters for venture pacing.
Publisher relations. As AI answers expand, expect more licensing partnerships (Perplexity’s publisher deals) and more traffic-sharing proposals to defuse ecosystem friction.
13. What to watch in the next 24 months
Catalysts.
Perplexity Comet GA + OEM preinstalls (distribution test).
Opera Neon public rollout and MiniPay monetization progress (ad/search + fintech blend).
Chrome/Edge WebGPU & WebNN releases that materially improve on-device LLM speeds (watch ONNX Runtime Web and TF.js releases).
Google’s AI Overviews dialing (frequency, layout, ad load) and publisher response.
EU/U.S. remedies on defaults and any choice-screen expansions.
Early warning indicators.
COGS/ARPU gaps widening (cloud-heavy inference, no local/edge offload).
Short-session stickiness deteriorating (AI panels opened but used fewer than 2–3 times per day).
Publisher/legal friction rising (blocked crawlers, suits, or higher licensing costs).
What would change the call (positive).
WebGPU/WebNN improvements that enable a standard, fast, small-model local stack across mainstream devices (Chrome, Edge, Safari).
Clear OEM distribution wins (preinstall + retention), validating a non-default route to tens of millions of users.
Proven ad RPMs or conversion data from AI answer units on par with classic SERP ads, with credible attribution.
What would change the call (negative).
Strong legal remedies that limit AI answer units or impose heavy licensing costs per snippet.
Performance ceilings on iOS WebKit that keep AI features lagging, capping mobile growth outside the EU.
A plateau in WebGPU adoption or instability that undermines local inference economics.
14. Where to place venture bets
Applications (consumer and prosumer). Founders with a distribution wedge (OEM, region, or workflow) and a clear cost plan (local + edge + cloud) seem to be in prime position. Perplexity (consumer brand + OEM motion), BrowserBase (web browsers for agents), Arc/SigmaOS (workflow depth), You.com (enterprise agents with a browser-like UX), and privacy-centric stacks (Brave) each represent one of these wedges. The de-risked angle is to fund attachments (research agents, shopping copilots, or vertical modules) that live inside multiple browsers.
Infrastructure products. The biggest returns may accrue to the layers that let AI browsers run cheaply and instantly:
Web runtimes (ONNX Runtime Web, TF.js-compatible optimization services, model-to-WebGPU compilers like MLC).
Edge inference with semantic caching, abuse controls, and per-publisher licensing logic (Cloudflare Workers AI/AIGateway, Akamai Cloud Inference, Fastly AI Accelerator).
Search infra (crawler/index as a service for AI UIs, embeddings storage and freshness pipelines). Brave’s independent path and Perplexity’s moves show the strategic value of index control.
Network products. Invest where answers meet commerce: API gateways that price by semantic similarity, affiliate routers tuned for AI panels, and link-resolver services that run on CDNs. Early evidence: Fastly’s semantic caching and Akamai’s claims on cost/latency improvements.
Open-source leverage. Back maintainers and vendors who package WebLLM/MLC-LLM, llama.cpp, or Transformers.js for enterprise browser deployments with management, policy, and observability (the “Vercel for in-browser AI” slot).
Closing thoughts
AI browsers are credible venture targets if you believe (1) answer-first experiences will capture high-intent moments inside the browser, (2) local/edge inference will make those experiences fast and cheap, and (3) distribution wedges (DMA choice screens, OEM deals, differentiated workflows) can overcome default gravity.
The opportunity is in the apps as well as the “hidden pipes” that let those apps feel instant, safe, and affordable at scale. Over the next 24 months, watch three signals: distribution wins, cost curves, and ad/commerce RPMs. If two of the three break right, this sector can compound. And drag a lot of infra value up with it.
If you are getting value from this newsletter, consider subscribing for free and sharing it with 1 infra-curious friend: