Model Context Protocol (MCP)

A standard, introduced by Anthropic, for connecting LLMs to external tools, services, and data sources. The pitch in one sentence: MCP is to LLMs what REST APIs are to web services — a shared language so the same agent can talk to any compliant tool without bespoke glue.

This page is the canonical concept page for the protocol that’s referenced across the wiki. For a full non-technical explainer, see Ras Mic on the Greg Isenberg podcast.

Why It Exists

LLMs by themselves are very limited — they predict the next token, full stop. The first evolution gave them tools (web search, function calls, custom APIs), which made them genuinely useful but introduced a scaling problem: every tool speaks a different “language,” and stacking many tools together is brittle. Engineers spend most of their time gluing tools to LLMs and handling the edge cases when a tool’s API changes underneath them.

“Combining these tools, making it work with the LLM is one thing. But then stacking these tools on top of each other, making it cohesive, making it work together — is a nightmare itself. And this is where we’re currently at.” — Ras Mic

MCP standardizes the integration layer so any MCP client can talk to any MCP server using the same protocol.
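Under the hood, MCP runs on JSON-RPC 2.0: every client sends the same request shapes (`initialize`, `tools/list`, `tools/call`, …) and every server answers in the same response shape. A minimal sketch of one round trip, using plain dicts; the tool name and its arguments here are made up for illustration:

```python
import json

# A client asking any MCP server to invoke a tool. The envelope fields
# (jsonrpc, id, method, params.name, params.arguments) follow the MCP spec;
# "search_web" and its argument are hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_web",
        "arguments": {"query": "model context protocol"},
    },
}

# The server's reply: a result carrying a list of content blocks,
# matched to the request by id.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "Top result: modelcontextprotocol.io"}],
    },
}

print(json.dumps(request))
print(json.dumps(response))
```

Because both sides agree on these envelopes, swapping one server for another changes the tool names in `params`, not the integration code.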

The Four Components

┌─────────────┐    ┌──────────┐    ┌─────────────┐    ┌─────────┐
│  MCP Client │◄──►│ Protocol │◄──►│  MCP Server │◄──►│ Service │
│  (LLM-side) │    │ (shared) │    │ (svc-side)  │    │  (DB/   │
└─────────────┘    └──────────┘    └─────────────┘    │  API/   │
                                                       │  CLI)   │
                                                       └─────────┘
| Component | What it is | Examples |
| --- | --- | --- |
| MCP client | The LLM-facing app | claude-code, cursor, augment-agent, Tempo, Windsurf |
| Protocol | Two-way connection between client and server | The MCP spec itself |
| MCP server | Translates a service’s capabilities into the protocol | Built by the service provider; Supabase MCP, Brave Search MCP, filesystem MCP, git MCP |
| Service | The thing being exposed | A database, web search, a CLI tool, a SaaS API |
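The "translation" the MCP server does is essentially a dispatch table: a protocol-level `tools/call` comes in, and the server maps it onto a concrete call into the underlying service. A toy sketch (not the real SDK; `query_database` is a stand-in for whatever the service actually does):

```python
# Hypothetical MCP-server-side dispatch: route a tools/call request
# to a service function and wrap the result as protocol content.

def query_database(sql: str) -> list:
    """Stand-in for the real service, e.g. running SQL against Postgres."""
    return [{"id": 1, "name": "demo"}]

# The server advertises these names via tools/list; clients call them by name.
TOOLS = {"query_database": query_database}

def handle_tools_call(params: dict) -> dict:
    """Translate one protocol request into a service call and back."""
    fn = TOOLS[params["name"]]
    result = fn(**params["arguments"])
    return {"content": [{"type": "text", "text": str(result)}]}

print(handle_tools_call({
    "name": "query_database",
    "arguments": {"sql": "SELECT * FROM users"},
}))
```

The real SDKs (TypeScript, Python) handle the transport and schema advertisement; the service provider mostly writes this mapping layer.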

Anthropic’s 3D Chess

The architectural choice that makes MCP a strategic win for Anthropic: the MCP server lives with the service provider, not the LLM vendor. If you want LLMs to have access to your database, you build the MCP server. Anthropic effectively externalized the integration cost across the entire ecosystem — and in doing so, made every compliant LLM client immediately more capable as new MCP servers ship.

This is why every major service provider has been racing to publish MCP servers — Supabase, Figma, Cloudflare, GitHub, Postgres, Slack, and hundreds more.

What MCP Is Not

  • Not Einstein’s fifth law of physics. It’s a standard, period — no novel ML or capability inside the protocol itself.
  • Not the only standard. OpenAI or another vendor could ship a competing standard tomorrow.
  • Not friction-free yet. Setting up MCP servers in clients still involves downloading files, editing JSON config, and other manual steps — the “polish” is incomplete.
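To make the "editing JSON config" point concrete: in Claude Desktop, for example, servers are registered under an `mcpServers` key in the client's config file. A sketch, assuming the official filesystem server and an illustrative local path (the exact file location and options vary by client):

```
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/projects"]
    }
  }
}
```

Each entry tells the client how to launch one MCP server as a subprocess; there is not yet a universal one-click install across clients.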

Concrete Patterns Across the Wiki

  • Three core MCP servers (Cole Medin’s pattern): filesystem, Brave Search, git — give any AI IDE the file reach, web reach, and version control reach to operate seriously.
  • MCP as growth hack (Nate B Jones): publishing an MCP server is the new “support our API” — gets your tool inside every agent that exists.
  • Memory via MCP (OpenBrain): treat a Postgres database as an MCP server and you have persistent agent memory for ~$0.10/month.
  • Replacing OpenClaw (loop + OpenBrain + MCP tools): the three-Lego-brick model says you can replicate most OpenClaw capabilities by combining MCP tools with native scheduling and memory primitives.
  • Standalone MCP servers as products: e.g. Cole Medin’s Supabase MCP server demo in his AI coding video.

Startup Opportunities (Per Ras Mic)

  • MCP App Store — central place to browse and one-click-deploy MCP servers (Ras Mic claims he registered the domain but hasn’t built it; “please steal this idea”)
  • Standards observation play for non-technical builders — wait for the standard to finalize, then ship integrations on the rails of the winning protocol

See Also