Build Rich-Context AI Apps with Anthropic: Mastering MCP

Date: May 17, 2025 • Estimated Reading Time: 15 min • Author: Kargi Chauhan

☕️ Today I’m sipping: Cortado

I was going through this lecture and jotting down notes, and I figured: why not share my learning journey publicly? I won't cover every detail here, but I'll give clear, concise explanations of the core concepts. I've added a to-do to make these notes more detailed next time.

Table of Contents

- Introduction
- What Is MCP?
- Key Building Blocks
- How They Talk
- The Roads They Travel
- A Step-by-Step Journey
- Deploying in the Real World
- Conclusion

Introduction

Imagine you have a super-smart robot friend that can do many tasks: summarize a long story, fetch the weather, or send a customized email. But to ask your robot friend to do these, you need a simple way to say “Hey, do this!”—that’s exactly what MCP (Model Context Protocol) provides.

An Everyday Analogy

Think of MCP as the rules for ordering pizza:

- The Menu Items are Tools: actions you can order, like "summarize this text."
- The Spice Rack is Resources: data the kitchen can reach for, like a stock price.
- The Recipe Cards are Prompts: reusable templates, like a personalized email greeting.

Quick Mind Map

flowchart LR
  subgraph Client
    A[Init] --> B[Invoke Tools/Read Resources]
    B --> C[Handle Notifications & Returns]
  end
  subgraph Server
    D[Receive via stdio/HTTP] --> E[Dispatch to Primitives]
    E --> F[Tools]
    E --> G[Resources]
    E --> H[Prompts]
    F --> I[Execute Logic]
    G --> I
    H --> I
    I --> J[Send Return/Notify]
  end
  C -->|stdio| D
  C -->|HTTP+SSE| D
  C -->|Streamable HTTP| D

This diagram shows how your client and server pass messages over different transports, invoking tools, resources, and prompts in a simple loop.

What Is MCP?

MCP defines what your robot friend can do (the Menu Items, Spice Rack, and Recipe Cards) and how to call them (the protocol). It’s language-agnostic and works whether your robot lives on your computer, in the cloud, or both!

Key Building Blocks

In code, these building blocks — tools, resources, and prompts — are declared with Python decorators:

# Uses the official MCP Python SDK: pip install "mcp[cli]"
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def summarize(text: str) -> str:
    """Return a concise summary of the input text."""
    ...

@mcp.resource("stock://{symbol}")
def stock_price(symbol: str) -> float:
    """Fetch the latest price for a given ticker."""
    ...

@mcp.prompt()
def email_template(name: str) -> str:
    """Generate a personalized email greeting."""
    ...

mcp.run()

How They Talk

Every message is a JSON-RPC 2.0 payload:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",        // ask to do something
  "params": {
    "name": "summarize",
    "arguments": { "text": "Once upon a time..." }
  }
}
// later...
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": { "content": [{ "type": "text", "text": "A fairy tale." }] }
}
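Since MCP's wire format is JSON under the hood, framing a tool call takes only a few lines of standard-library Python. Here `make_tool_call` is a hypothetical helper for illustration, not part of any SDK:

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize a JSON-RPC 2.0 tools/call request for an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

msg = make_tool_call(1, "summarize", {"text": "Once upon a time..."})
decoded = json.loads(msg)  # round-trips back to the same structure
```

In a real client the SDK handles this framing for you; the point is that there is no magic — just structured JSON with an id to match returns to requests.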

The Roads They Travel

Your requests can ride different roads:

- stdio: client and server on the same machine, talking over standard input/output.
- HTTP + SSE: the original remote transport; requests go over HTTP, and server events stream back via Server-Sent Events.
- Streamable HTTP: the newer remote transport, which carries both directions through a single HTTP endpoint.
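On the HTTP+SSE road, intermediate events arrive as `data:` lines in a text stream. Here's a minimal, stdlib-only parser sketch — an illustration of the SSE framing, not the SDK's actual implementation:

```python
import json

def parse_sse(stream_text: str) -> list:
    """Pull JSON payloads out of Server-Sent Events 'data:' lines."""
    events = []
    for line in stream_text.splitlines():
        if line.startswith("data:"):
            events.append(json.loads(line[len("data:"):].strip()))
    return events

raw = ('data: {"kind": "notify", "log": "slicing the dough"}\n'
       '\n'
       'data: {"kind": "return", "output": {"summary": "A fairy tale."}}\n')
events = parse_sse(raw)
```

Blank lines separate events in SSE, which is why notifications can stream in before the final return shows up.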

A Step-by-Step Journey

1. Init: handshake and learn the menu.
2. Invoke: order your pizza (call stock_price for “AAPL”).
3. Notify: hear “We’re slicing the dough!” (intermediate logs).
4. Return: pizza arrives (final JSON response).
5. Prompt: use that pizza in your email recipe card for a delightful demo.
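The journey above can be simulated in-process with a toy server. This is purely an illustration of the message flow — the tool, its price, and the event shapes are made up for the demo, not the real MCP SDK:

```python
class ToyServer:
    """Stand-in for an MCP server illustrating init -> invoke -> notify -> return."""

    def __init__(self):
        # One hypothetical tool; the quote is hard-coded for the demo.
        self.tools = {"stock_price": lambda symbol: 123.45}

    def initialize(self):
        # Step 1 (Init): the handshake tells the client what's on the menu.
        return {"tools": list(self.tools)}

    def invoke(self, name, **kwargs):
        # Step 3 (Notify): an intermediate log while the dough is slicing.
        yield {"kind": "notify", "log": f"running {name}"}
        # Step 4 (Return): the final payload for this request.
        yield {"kind": "return", "output": self.tools[name](**kwargs)}

server = ToyServer()
menu = server.initialize()                                  # Step 1
events = list(server.invoke("stock_price", symbol="AAPL"))  # Steps 2-4
```

Using a generator for `invoke` mirrors the real flow: notifications can arrive any number of times before the single return closes out the request.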

Deploying in the Real World

Conclusion

MCP empowers you to craft robust AI apps by combining LLM intelligence with real-world tooling. From schema discovery to error handling, security, and scaling, you now have the full recipe to build, deploy, and maintain conversational AI pipelines at scale.

Citation

Cite this post as:

@article{kargi2025mcp,
  title   = "Build Rich-Context AI Apps with Anthropic: Mastering MCP",
  author  = "Kargi Chauhan",
  journal = "kargisnotebook.github.io",
  year    = "2025",
  month   = "May",
  url     = "https://kargichauhan.github.io/posts/mcp-anthropic.html"
}

References

  1. DeepLearning.AI & Anthropic, “MCP: Build Rich-Context AI Apps with Anthropic” course videos.
  2. MCP Python SDK documentation: github.com/modelcontextprotocol/python-sdk
  3. Server-Sent Events spec: WHATWG
  4. JSON Schema documentation: json-schema.org