☕️ Today I’m sipping: Cortado
I was going through this lecture and jotting down my notes, and I figured: why not share my learning journey publicly? I won't cover every detail here, but I'll give you clear, concise explanations of the core concepts. I've added a to-do to flesh this out in more detail next time.
Imagine you have a super-smart robot friend that can do many tasks: summarize a long story, fetch the weather, or send a customized email. But to ask your robot friend to do these, you need a simple way to say “Hey, do this!”—that’s exactly what MCP (Model Context Protocol) provides.
Think of MCP as the rules for ordering pizza:
1. Menu Items (Tools): actions the server can perform on request, e.g. calling translate(text, lang) to translate text.
2. Spice Rack (Resources): read-only data the server can hand you for context, like a stock price or a file.
3. Recipe Cards (Prompts): reusable templates that tell the model how to put it all together.
This diagram shows how your client and server pass messages over different transports, invoking tools, resources, and prompts in a simple loop.
MCP defines what your robot friend can do (the Menu Items, Spice Rack, and Recipe Cards) and how to call them (the protocol). It’s language-agnostic and works whether your robot lives on your computer, in the cloud, or both!
In code, these building blocks are declared with Python decorators. Here's a minimal sketch using FastMCP from the official MCP Python SDK (server name and resource URI are illustrative):
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("robot-friend")

@mcp.tool()
def summarize(text: str) -> str:
    """Return a concise summary of the input text."""
    ...

@mcp.resource("stock://{symbol}")  # resources are addressed by a URI template
def stock_price(symbol: str) -> float:
    """Fetch the latest price for a given ticker."""
    ...

@mcp.prompt()
def email_template(name: str) -> str:
    """Generate a personalized email greeting."""
    ...

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport, handy for local servers
Under the hood every message is JSON-RPC 2.0, but conceptually each exchange boils down to a simple JSON request and response:
{
  "id": "1234",
  "kind": "invoke",          // ask to do something
  "tool": "summarize",
  "input": { "text": "Once upon a time..." }
}

// later...
{ "kind": "return", "output": { "summary": "A fairy tale." } }
Your requests can ride different roads (transports such as stdio for a local server or HTTP for a remote one), but every conversation follows the same flow, sketched in code right after this list:
1. Init: handshake and learn the menu.
2. Invoke: order your pizza (call stock_price for “AAPL”).
3. Notify: hear “We’re slicing the dough!” (intermediate logs).
4. Return: pizza arrives (final JSON response).
5. Prompt: use that pizza in your email recipe card for a delightful demo.
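Here's roughly what that flow looks like from the client side, using the MCP Python SDK over stdio (the server.py filename and the tool arguments are placeholders for the server sketched above):

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server from the earlier sketch as a subprocess.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()            # 1. Init: handshake
            tools = await session.list_tools()    #    learn the menu
            result = await session.call_tool(     # 2. Invoke: order the pizza
                "stock_price", arguments={"symbol": "AAPL"}
            )
            print(result)                         # 4. Return: the final response

asyncio.run(main())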
MCP empowers you to craft robust AI apps by combining LLM intelligence with real-world tooling. From schema discovery to error handling, security, and scaling, you now have the basic recipe to build, deploy, and maintain conversational AI pipelines.
Cite this post as:
@article{kargi2025mcp,
  title   = "Build Rich-Context AI Apps with Anthropic: Mastering MCP",
  author  = "Kargi Chauhan",
  journal = "kargisnotebook.github.io",
  year    = "2025",
  month   = "May",
  url     = "https://kargichauhan.github.io/posts/mcp-anthropic.html"
}