What Is MCP? The Model Context Protocol Explained With a Calendar Example
MCP in one sentence
MCP is a standard that lets AI models call your tools — your calendar, your repo, your CRM — through a single shared interface, instead of needing a custom integration for each app.
The protocol was announced by Anthropic in late 2024 and has since been adopted by OpenAI, Google, and basically every AI client people actually use. The full spec lives at modelcontextprotocol.io. You don’t need to read it.
The n×m integration problem MCP solves
Before MCP, every AI client needed its own bespoke plugin for every tool. Claude needed a Google Calendar plugin. Cursor needed one too. ChatGPT had its own walled garden. If you built a tool and wanted three AI clients to use it, you wrote three integrations. If a new client launched, every existing tool had to add support for it. With n clients and m tools, that’s n×m integrations, a matrix that grows with every launch on either side.
MCP collapses it to n+m. A tool builder writes one MCP server. A client builder writes one MCP host. Any client can talk to any tool. People call it the USB-C of AI tooling, which is annoying but accurate — one connector, every device.
Asking Claude about your booking pages (the running example)
Here is what an MCP call actually looks like from the user’s seat. The Carly MCP server exposes a list_bookings tool. When you ask Claude who has booked through one of your pages, this happens:
You: Who has booked my consult page this week?
Claude: (calls list_bookings with page: design-consult, range: 2026-04-26..2026-05-02)
Carly MCP server: (returns three bookings)
Claude: Three bookings on the consult page — Paul Evans (Tue 10am), Tal Cohen (Wed 1pm), Sam Isner (Thu 3pm).
You see step one and step four. Claude handles the middle. The tool definition the server hands Claude is roughly this:
{
  "name": "list_bookings",
  "description": "List confirmed bookings made through a Carly booking page.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "page": { "type": "string" },
      "range": { "type": "string", "description": "ISO date range" }
    },
    "required": ["page"]
  }
}
That’s it. No SDK, no custom prompt engineering, no glue code on your end. The model reads the description, decides when the tool is relevant, picks arguments, and calls it. The Carly MCP launch post walks through the full setup in three commands — install the CLI, log in, register the server with your client.
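On the wire, that call is an ordinary JSON-RPC 2.0 message; the MCP spec names the method `tools/call`. Here is a sketch of what Claude sends for the booking question above (the `id` is arbitrary; the argument values are the ones from the running example):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "list_bookings",
    "arguments": {
      "page": "design-consult",
      "range": "2026-04-26..2026-05-02"
    }
  }
}
```

The server replies with a `result` whose `content` array carries the bookings as text blocks, which Claude then summarizes into the answer you see in step four.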
How MCP differs from REST APIs and ChatGPT plugins
The three things people confuse with MCP:
- REST APIs are app-to-app. They assume a developer is reading the docs and writing client code. An LLM can technically call a REST endpoint, but it has no standard way to discover what tools exist, what arguments they take, or what the response means without you writing custom scaffolding per API.
- ChatGPT Plugins (deprecated March 2024) were OpenAPI-based, ChatGPT-only, and required listing in OpenAI’s directory. Closed ecosystem, single client.
- MCP is an open standard, designed for LLMs from day one. Any client that speaks MCP can use any server that speaks MCP. Tool descriptions, schemas, and error semantics are built for models to consume directly.
The practical difference: an MCP server you write today works in Claude Desktop, Cursor, and ChatGPT tomorrow with zero changes.
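That portability works because every host runs the same generic loop: read the server's tool registry, then dispatch whatever call the model emits by name. Here is a toy sketch of that loop in plain Python — not the MCP SDK, and `BOOKINGS`, `TOOLS`, and `dispatch` are all invented names for illustration:

```python
import json

# Toy bookings store standing in for a server's backend.
BOOKINGS = [
    {"page": "design-consult", "who": "Paul Evans", "when": "Tue 10am"},
    {"page": "design-consult", "who": "Tal Cohen", "when": "Wed 1pm"},
    {"page": "intro-call", "who": "Sam Isner", "when": "Thu 3pm"},
]

# A server advertises tools by name. The host needs no prior
# knowledge of any tool, only this registry shape.
TOOLS = {
    "list_bookings": {
        "description": "List confirmed bookings for a booking page.",
        "handler": lambda args: [b for b in BOOKINGS if b["page"] == args["page"]],
    },
}

def dispatch(tool_call_json: str):
    """Route a model-emitted tool call (serialized JSON) to its handler."""
    call = json.loads(tool_call_json)
    tool = TOOLS[call["name"]]
    return tool["handler"](call["arguments"])

# The same dispatch loop works unchanged for any tool the server adds.
result = dispatch('{"name": "list_bookings", "arguments": {"page": "design-consult"}}')
```

The point of the sketch: the host code never mentions calendars, repos, or CRMs. Swap in a different registry and nothing else changes — that is the n+m collapse in miniature.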
What you can do with MCP today
A non-exhaustive list of things people are actually doing with MCP servers right now:
- Manage your booking pages with Carly MCP — spin up new booking links, change their availability rules, and pull the list of confirmed bookings.
- Query your codebase with the filesystem MCP server — let Claude read, search, and edit local files.
- Read and write GitHub — open issues, review PRs, ship commits from chat. See the official server registry.
- Search a knowledge base — Notion and Google Drive MCP servers turn your docs into queryable context.
- Run SQL against your database — the Postgres MCP server lets the model write and execute queries against a read-only connection.
Browse the official MCP server registry for the full catalog.
Which AI clients support MCP
The list as of April 2026:
- Claude Desktop and Claude Code — first-class support, Anthropic shipped the protocol.
- Cursor, Windsurf, Zed — native MCP host configs.
- VS Code — via Continue or Cline extensions.
- ChatGPT — Developer Mode and the Apps SDK, shipped in late 2025.
Setup looks slightly different in each one, but the server config is the same JSON block everywhere — a command, some args, and optional environment variables. Step-by-step install lives in how to install MCP. ChatGPT-specific setup is in ChatGPT MCP.
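Whatever the client, that JSON block has the same shape. A sketch (the server name, package, and key below are placeholders, not Carly's real values — the linked guides have those):

```json
{
  "mcpServers": {
    "carly": {
      "command": "npx",
      "args": ["-y", "your-mcp-server-package"],
      "env": { "YOUR_API_KEY": "..." }
    }
  }
}
```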
Where to go next
- Install your first MCP server — start with Carly MCP if you want a five-minute win, or browse how to install MCP for general setup.
- Browse the registry — github.com/modelcontextprotocol/servers lists the official and community servers.
- Read the spec — modelcontextprotocol.io if you want to write your own server.
Ready to automate your busywork?
Carly schedules, researches, and briefs you—so you can focus on what matters.
Get Carly Today →
Or try our Free Group Scheduling Tool or Free Booking Page


