The Connected Stack: How AI Tools Are Learning to Talk

2026 marks a turning point for AI interoperability. The Model Context Protocol has achieved critical mass - 97 million monthly SDK downloads and vendor-neutral governance under the Linux Foundation. AI agents can now work together across tools and platforms. The era of siloed AI assistants is ending.

Early AI tools were islands. Your IDE assistant couldn't talk to your documentation agent. Your code reviewer didn't know about your project management context. Every integration was custom, expensive, and fragile. The Model Context Protocol changes this equation fundamentally.

Protocol ecosystem at a glance: 97M+ monthly SDK downloads, 10,000+ active public servers.

From 100K to 97M in One Year

SDK downloads grew from approximately 100,000 in November 2024 to over 8 million by April 2025 - and then exploded to 97 million by year end. That's not gradual adoption. That's an industry coalescing around a standard.

The protocol has been adopted by ChatGPT, Cursor, Gemini, Microsoft Copilot, Visual Studio Code, and virtually every major AI product. Major deployments now span Block, Bloomberg, Amazon, and hundreds of Fortune 500 companies.

People love MCP and we are excited to add support across our products.

Sam Altman, CEO of OpenAI

In March 2025, OpenAI adopted the protocol across the Agents SDK, Responses API, and ChatGPT desktop. In April, Google DeepMind confirmed support in upcoming Gemini models. At Microsoft Build 2025, Microsoft announced that Windows 11 is embracing the standard. Together, these adoptions turned the protocol from a vendor-led spec into common infrastructure.

Why It Matters

What Interoperability Enables

With a common protocol, agents share a common language. Context flows between tools: the assistant in your editor knows about the ticket in your tracker, which in turn knows about the discussion in your chat.

  • Context portability - Move between tools without losing state or rebuilding context
  • Agent collaboration - Different agents working on the same task, sharing what they learn
  • Tool integration - Seamless connections to databases, APIs, and services without custom adapters
  • Vendor flexibility - Switch providers without rebuilding integrations
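The "common language" behind these capabilities is concrete: the protocol is built on JSON-RPC 2.0, with standardized method names for discovering and invoking tools. Here is a minimal sketch of what a client-side request looks like. The method names (`tools/list`, `tools/call`) follow the published spec; the `search_tickets` tool and its arguments are hypothetical examples, not part of any real server.

```python
import json

def make_request(req_id: int, method: str, params: dict) -> str:
    """Serialize a JSON-RPC 2.0 request as a protocol client would send it."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Ask a server which tools it exposes...
list_req = make_request(1, "tools/list", {})

# ...then invoke one. "search_tickets" is an invented example tool.
call_req = make_request(2, "tools/call", {
    "name": "search_tickets",
    "arguments": {"query": "login bug"},
})

parsed = json.loads(call_req)
print(parsed["method"])          # tools/call
print(parsed["params"]["name"])  # search_tickets
```

Because every client and server speaks this same message shape, an editor assistant, a tracker integration, and a chat agent can exchange context without bespoke adapters - which is exactly what makes the portability and vendor flexibility above possible.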

The End of Lock-In

When integrations are standardized, switching costs drop. You can use the best model for each task, the best tool for each workflow, the best vendor for each capability - all working together through a common protocol.

This is good for users and good for innovation. Competition happens on capabilities, not on who has the stickiest integration. The winners are teams that can compose the best solutions from best-in-class components.

Governance

Linux Foundation Governance

In December 2025, Anthropic donated the protocol to the Agentic AI Foundation, a directed fund under the Linux Foundation. The foundation was co-founded by Anthropic, Block, and OpenAI, with support from Google, Microsoft, AWS, Cloudflare, and Bloomberg.

This vendor-neutral governance ensures the standard evolves based on community needs rather than any single company's roadmap. It's the same model that made HTTP, OAuth, and OpenAPI successful - neutral ground where competitors can collaborate on shared infrastructure.

We are now building much of Warp starting with a prompt these days, which is a bit wild since it's over a million lines of Rust with a custom UI framework.

Zach Lloyd, CEO of Warp

Building for Interoperability

Smart teams are designing for interoperability from the start. They expose context through the protocol. They consume context from other tools. They build agents that collaborate rather than compete for attention.

TELUS, for example, credits the protocol as "the most transformative technology to impact TELUS in decades." It enables Claude to connect previously incompatible business systems, eliminating the complex integration work that would otherwise have put team-built AI solutions out of reach.

  • Expose context - Make your tools available as servers so AI can use them
  • Consume context - Connect to servers to give your agents more capabilities
  • Design for collaboration - Build agents that work with others, not in isolation
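"Exposing context" ultimately means answering the protocol's discovery and invocation requests. The sketch below shows the server side of that exchange using only the standard library - a dispatch loop that mirrors the request/response shape over JSON-RPC 2.0. The `deploy_status` tool is an invented example; a real deployment would use one of the official protocol SDKs rather than hand-rolled dispatch.

```python
import json

# Registry of tools this hypothetical server exposes.
TOOLS = {
    "deploy_status": {
        "description": "Report the status of the latest deploy",
        "handler": lambda args: f"deploy {args['env']}: ok",
    },
}

def handle(raw: str) -> str:
    """Answer a single JSON-RPC 2.0 request with a JSON-RPC 2.0 response."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        # Discovery: advertise available tools so agents can find them.
        result = {"tools": [
            {"name": name, "description": t["description"]}
            for name, t in TOOLS.items()
        ]}
    elif req["method"] == "tools/call":
        # Invocation: run the named tool and wrap its output as content.
        tool = TOOLS[req["params"]["name"]]
        text = tool["handler"](req["params"]["arguments"])
        result = {"content": [{"type": "text", "text": text}]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601,
                                     "message": "unknown method"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

resp = handle(json.dumps({
    "jsonrpc": "2.0", "id": 7, "method": "tools/call",
    "params": {"name": "deploy_status", "arguments": {"env": "prod"}},
}))
print(json.loads(resp)["result"]["content"][0]["text"])  # deploy prod: ok
```

Once a tool is exposed this way, any protocol-aware agent can consume it - no custom adapter per vendor, which is the point of designing for collaboration from the start.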

Sources & Further Reading

Primary sources and recommended reading cited in this briefing.