MCP Server
by GobinFan
# Building MCP Server & Client with Python from Scratch
[中文](README.md) | English
## Introduction
An MCP server implements the Model Context Protocol (MCP), which provides a standardized interface for connecting AI models to external data sources and tools such as file systems, databases, and APIs.

### Advantages of MCP
Before MCP, AI tool calling was handled mainly through function calls, which had two main problems:
1. Function-call formats differ across LLM providers
2. Input/output formats differ across API tools, making encapsulation and management cumbersome

MCP acts as the USB-C of AI tooling: it standardizes both the function-call format exposed by different LLM providers and the way tools themselves are encapsulated.
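To make issue 1 concrete, here is roughly how the same tool has to be declared for two different providers. The shapes below are abbreviated sketches of each provider's public function-calling format, not complete schemas; check the current provider docs for exact fields:
```python
# The same hypothetical "get_docs" tool declared in two providers' native
# function-calling formats. Fields abbreviated; see each provider's docs.

openai_style = {
    "type": "function",
    "function": {
        "name": "get_docs",
        "description": "Search framework documentation",
        "parameters": {  # JSON Schema
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}

anthropic_style = {
    "name": "get_docs",
    "description": "Search framework documentation",
    "input_schema": {  # same JSON Schema, different envelope and key name
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}
```
MCP removes this duplication: the tool is declared once on the server, and any MCP-capable client can discover and call it.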
## MCP Transport Protocols
MCP currently supports two main transport protocols:
1. **Stdio Transport**
   - For local use: the client launches the server as a subprocess and talks to it over stdin/stdout
   - Requires the server's command-line tooling to be installed locally
   - Depends on the local environment being set up correctly
2. **SSE (Server-Sent Events) Transport**
   - For cloud or remote deployment
   - Uses long-lived HTTP connections: the server streams events down to the client, and the client posts messages back over HTTP (see the sketch after this list)
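Here is a minimal sketch of a server that can start under either transport. It assumes the installed `mcp` SDK's `FastMCP` accepts both `transport="stdio"` and `transport="sse"` (for SSE, host and port come from FastMCP's settings):
```python
# transport_demo.py -- hypothetical file name; a minimal sketch, assuming the
# installed mcp SDK's FastMCP supports both "stdio" and "sse" transports.
import sys

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo")


@mcp.tool()
def ping() -> str:
    """Trivial tool so the server exposes something."""
    return "pong"


if __name__ == "__main__":
    if "--sse" in sys.argv:
        # Remote: serve over HTTP with Server-Sent Events
        mcp.run(transport="sse")
    else:
        # Local: talk to the parent process over stdin/stdout
        mcp.run(transport="stdio")
```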
## Project Structure
### MCP Server
- Stdio Transport Protocol (Local)
- SSE Transport Protocol (Remote)
### MCP Client
- Custom Client (Python)
- Cursor
- Cline
## Environment Setup
### 1. Install the uv Package Manager
**MacOS/Linux:**
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
**Windows:**
```powershell
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```
### 2. Initialize Project
```bash
# Create project directory
uv init mcp-server
cd mcp-server
# Create and activate virtual environment
uv venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
# Install dependencies
uv add "mcp[cli]" httpx
# Create server implementation file
touch main.py
```
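The server code below reads `SERPER_API_KEY` from the environment, and the SSE client later reads `OPENAI_API_KEY`, `OPENAI_BASE_URL`, and `OPENAI_MODEL`. One way to provide them is a `.env` file in the project root, which `load_dotenv()` picks up. The values below are placeholders, and the model name is only an example:
```bash
# .env -- example values only; replace with your own keys
SERPER_API_KEY=your-serper-api-key         # from https://serper.dev/
OPENAI_API_KEY=your-openai-api-key         # used by the client later on
OPENAI_BASE_URL=https://api.openai.com/v1  # or any OpenAI-compatible endpoint
OPENAI_MODEL=gpt-4o                        # any model that supports tool calling
```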
## Building Tool Functions

To give large language models access to up-to-date documentation for mainstream frameworks, we'll build a tool that runs a Google search (via the Serper API) restricted to a specific documentation domain, then fetches the top results and extracts their text.
### 1. Build Documentation URL Mapping
```python
docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "autogen": "microsoft.github.io/autogen/stable",
    "agno": "docs.agno.com",
    "openai-agents-sdk": "openai.github.io/openai-agents-python",
    "mcp-doc": "modelcontextprotocol.io",
    "camel-ai": "docs.camel-ai.org",
    "crew-ai": "docs.crewai.com",
}
```
### 2. Build MCP Tool
```python
import json
import os

import httpx
from bs4 import BeautifulSoup
from mcp.server.fastmcp import FastMCP

# Tools are registered on a FastMCP server instance via @mcp.tool()
mcp = FastMCP("Agentdocs")

SERPER_URL = "https://google.serper.dev/search"

# docs_urls is the mapping defined in step 1 above


async def search_web(query: str) -> dict | None:
    """Query the Serper search API and return its raw JSON results."""
    payload = json.dumps({"q": query, "num": 3})
    headers = {
        "X-API-KEY": os.getenv("SERPER_API_KEY"),
        "Content-Type": "application/json",
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.post(
                SERPER_URL, headers=headers, data=payload, timeout=30.0
            )
            response.raise_for_status()
            return response.json()
        except httpx.TimeoutException:
            return {"organic": []}


async def fetch_url(url: str):
    """Fetch a page and return its visible text."""
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, timeout=30.0)
            soup = BeautifulSoup(response.text, "html.parser")
            return soup.get_text()
        except httpx.TimeoutException:
            return "Timeout error"


@mcp.tool()
async def get_docs(query: str, library: str):
    """
    Search for the latest documentation for a given query and library.
    Supports langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.

    Parameters:
        query: Query to search for (e.g., "React Agent")
        library: Library to search in (e.g., "agno")

    Returns:
        Text from the documentation
    """
    if library not in docs_urls:
        raise ValueError(f"Library {library} not supported by this tool")
    # Restrict the search to the library's documentation domain
    query = f"site:{docs_urls[library]} {query}"
    results = await search_web(query)
    if len(results["organic"]) == 0:
        return "No results found"
    # Concatenate the text of every result page
    text = ""
    for result in results["organic"]:
        text += await fetch_url(result["link"])
    return text
```
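Before wiring these functions into a server, you can sanity-check them directly. This is a quick sketch with a hypothetical file name; it assumes `SERPER_API_KEY` is set and that the code above lives in `main.py` (as in the next section). In current SDK versions `@mcp.tool()` returns the original function, so `get_docs` remains directly callable:
```python
# test_tools.py -- hypothetical helper for local testing
import asyncio

from main import get_docs  # assumes the tool code above is saved as main.py


async def demo():
    # Search the agno docs for "React Agent" and preview the extracted text
    text = await get_docs(query="React Agent", library="agno")
    print(text[:500])


if __name__ == "__main__":
    asyncio.run(demo())
```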
## Building MCP Server (Stdio Protocol)
### 1. MCP Server (Stdio)
```python
# main.py
import json
import os

import httpx
from bs4 import BeautifulSoup
from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP

load_dotenv()

mcp = FastMCP("Agentdocs")

USER_AGENT = "Agentdocs-app/1.0"
SERPER_URL = "https://google.serper.dev/search"

docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "autogen": "microsoft.github.io/autogen/stable",
    "agno": "docs.agno.com",
    "openai-agents-sdk": "openai.github.io/openai-agents-python",
    "mcp-doc": "modelcontextprotocol.io",
    "camel-ai": "docs.camel-ai.org",
    "crew-ai": "docs.crewai.com",
}


async def search_web(query: str) -> dict | None:
    """Query the Serper search API and return its raw JSON results."""
    payload = json.dumps({"q": query, "num": 2})
    headers = {
        "X-API-KEY": os.getenv("SERPER_API_KEY"),
        "Content-Type": "application/json",
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.post(
                SERPER_URL, headers=headers, data=payload, timeout=30.0
            )
            response.raise_for_status()
            return response.json()
        except httpx.TimeoutException:
            return {"organic": []}


async def fetch_url(url: str):
    """Fetch a page and return its visible text."""
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, timeout=30.0)
            soup = BeautifulSoup(response.text, "html.parser")
            return soup.get_text()
        except httpx.TimeoutException:
            return "Timeout error"


@mcp.tool()
async def get_docs(query: str, library: str):
    """
    Search for the latest documentation for a given query and library.
    Supports langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.

    Parameters:
        query: Query to search for (e.g., "React Agent")
        library: Library to search in (e.g., "agno")

    Returns:
        Text from the documentation
    """
    if library not in docs_urls:
        raise ValueError(f"Library {library} not supported by this tool")
    query = f"site:{docs_urls[library]} {query}"
    results = await search_web(query)
    if len(results["organic"]) == 0:
        return "No results found"
    text = ""
    for result in results["organic"]:
        text += await fetch_url(result["link"])
    return text


if __name__ == "__main__":
    mcp.run(transport="stdio")
```
Launch command:
```bash
uv run main.py
```
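Before attaching a real client, you can poke at the server interactively. The `mcp[cli]` extra installed earlier ships a `dev` command that opens the MCP Inspector (it needs Node.js available for the Inspector UI, and the command shape may vary across SDK versions):
```bash
uv run mcp dev main.py
```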
### 2. Client Configuration
#### 2.1 Using Cline
First, install the Cline plugin in Visual Studio Code, then configure MCP:
```json
{
  "mcpServers": {
    "mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "<your-project-path>",
        "run",
        "main.py"
      ]
    }
  }
}
```

#### 2.2 Using Cursor

Create a `.cursor` folder in the project root and add an `mcp.json` file containing:
```json
{
  "mcpServers": {
    "mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "<your-project-path>",
        "run",
        "main.py"
      ]
    }
  }
}
```
Then enable the MCP server under Cursor's Features settings.


## Building MCP Server (SSE Protocol)
### 1. SSE MCP Server
```python
# main.py
import argparse
import json
import os

import httpx
import uvicorn
from bs4 import BeautifulSoup
from dotenv import load_dotenv
from mcp.server import Server
from mcp.server.fastmcp import FastMCP
from mcp.server.sse import SseServerTransport
from starlette.applications import Starlette
from starlette.requests import Request
from starlette.routing import Mount, Route

load_dotenv()

mcp = FastMCP("Agentdocs")

USER_AGENT = "Agentdocs-app/1.0"
SERPER_URL = "https://google.serper.dev/search"

docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "autogen": "microsoft.github.io/autogen/stable",
    "agno": "docs.agno.com",
    "openai-agents-sdk": "openai.github.io/openai-agents-python",
    "mcp-doc": "modelcontextprotocol.io",
    "camel-ai": "docs.camel-ai.org",
    "crew-ai": "docs.crewai.com",
}


async def search_web(query: str) -> dict | None:
    """Query the Serper search API and return its raw JSON results."""
    payload = json.dumps({"q": query, "num": 2})
    headers = {
        "X-API-KEY": os.getenv("SERPER_API_KEY"),
        "Content-Type": "application/json",
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.post(
                SERPER_URL, headers=headers, data=payload, timeout=30.0
            )
            response.raise_for_status()
            return response.json()
        except httpx.TimeoutException:
            return {"organic": []}


async def fetch_url(url: str):
    """Fetch a page and return its visible text."""
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, timeout=30.0)
            soup = BeautifulSoup(response.text, "html.parser")
            return soup.get_text()
        except httpx.TimeoutException:
            return "Timeout error"


@mcp.tool()
async def get_docs(query: str, library: str):
    """
    Search for the latest documentation for a given query and library.
    Supports langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.

    Parameters:
        query: Query to search for (e.g., "React Agent")
        library: Library to search in (e.g., "agno")

    Returns:
        Text from the documentation
    """
    if library not in docs_urls:
        raise ValueError(f"Library {library} not supported by this tool")
    query = f"site:{docs_urls[library]} {query}"
    results = await search_web(query)
    if len(results["organic"]) == 0:
        return "No results found"
    text = ""
    for result in results["organic"]:
        text += await fetch_url(result["link"])
    return text


def create_starlette_app(mcp_server: Server, *, debug: bool = False) -> Starlette:
    """Create a Starlette application that serves the provided MCP server over SSE."""
    sse = SseServerTransport("/messages/")

    async def handle_sse(request: Request) -> None:
        async with sse.connect_sse(
            request.scope,
            request.receive,
            request._send,  # noqa: SLF001
        ) as (read_stream, write_stream):
            await mcp_server.run(
                read_stream,
                write_stream,
                mcp_server.create_initialization_options(),
            )

    return Starlette(
        debug=debug,
        routes=[
            Route("/sse", endpoint=handle_sse),
            Mount("/messages/", app=sse.handle_post_message),
        ],
    )


if __name__ == "__main__":
    # Access the low-level server wrapped by FastMCP
    mcp_server = mcp._mcp_server

    parser = argparse.ArgumentParser(description="Run MCP SSE-based server")
    parser.add_argument("--host", default="0.0.0.0", help="Host to bind to")
    parser.add_argument("--port", type=int, default=8020, help="Port to listen on")
    args = parser.parse_args()

    # Bind SSE request handling to the MCP server
    starlette_app = create_starlette_app(mcp_server, debug=True)
    uvicorn.run(starlette_app, host=args.host, port=args.port)
```
Launch command:
```bash
uv run main.py --host 0.0.0.0 --port 8020
```
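The Starlette app exposes two routes: `GET /sse` holds open the event stream from server to client, and `POST /messages/` accepts client-to-server messages. A quick smoke test of the stream (it should stay open and emit an initial `endpoint` event pointing at the messages URL):
```bash
curl -N http://localhost:8020/sse
```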
### 2. MCP Client
```python
# client.py
import asyncio
import json
import os
import sys
from contextlib import AsyncExitStack
from typing import Optional

from dotenv import load_dotenv
from mcp import ClientSession
from mcp.client.sse import sse_client
from openai import AsyncOpenAI

load_dotenv()  # load environment variables from .env


class MCPClient:
    def __init__(self):
        # Initialize session and client objects
        self.session: Optional[ClientSession] = None
        self._streams_context = None
        self._session_context = None
        self.exit_stack = AsyncExitStack()
        self.openai = AsyncOpenAI(
            api_key=os.getenv("OPENAI_API_KEY"),
            base_url=os.getenv("OPENAI_BASE_URL"),
        )

    async def connect_to_sse_server(self, server_url: str):
        """Connect to an MCP server running with SSE transport."""
        # Store the context managers so they stay alive
        self._streams_context = sse_client(url=server_url)
        streams = await self._streams_context.__aenter__()

        self._session_context = ClientSession(*streams)
        self.session: ClientSession = await self._session_context.__aenter__()

        # Initialize the MCP session
        await self.session.initialize()

        # List available tools to verify the connection
        print("Initialized SSE client...")
        print("Listing tools...")
        response = await self.session.list_tools()
        tools = response.tools
        print("\nConnected to server with tools:", [tool.name for tool in tools])

    async def cleanup(self):
        """Properly clean up the session and streams."""
        if self._session_context:
            await self._session_context.__aexit__(None, None, None)
        if self._streams_context:
            await self._streams_context.__aexit__(None, None, None)

    async def process_query(self, query: str) -> str:
        """Process a query using the OpenAI API and the available MCP tools."""
        messages = [
            {
                "role": "user",
                "content": query,
            }
        ]

        # Expose the MCP tools to the model in OpenAI function-calling format
        response = await self.session.list_tools()
        available_tools = [{
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.inputSchema,
            },
        } for tool in response.tools]

        # Initial OpenAI API call
        completion = await self.openai.chat.completions.create(
            model=os.getenv("OPENAI_MODEL"),
            max_tokens=1000,
            messages=messages,
            tools=available_tools,
        )

        # Process the response and handle tool calls
        tool_results = []
        final_text = []
        assistant_message = completion.choices[0].message

        if assistant_message.tool_calls:
            for tool_call in assistant_message.tool_calls:
                tool_name = tool_call.function.name
                tool_args = json.loads(tool_call.function.arguments)

                # Execute the tool call against the MCP server
                result = await self.session.call_tool(tool_name, tool_args)
                tool_results.append({"call": tool_name, "result": result})
                final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")

                # Continue the conversation with the tool result
                messages.extend([
                    {
                        "role": "assistant",
                        "content": None,
                        "tool_calls": [tool_call],
                    },
                    {
                        "role": "tool",
                        "tool_call_id": tool_call.id,
                        "content": result.content[0].text,
                    },
                ])
                print(f"Tool {tool_name} returned: {result.content[0].text}")
                print("messages", messages)

                # Get the next response from OpenAI
                completion = await self.openai.chat.completions.create(
                    model=os.getenv("OPENAI_MODEL"),
                    max_tokens=1000,
                    messages=messages,
                )
                content = completion.choices[0].message.content
                if isinstance(content, (dict, list)):
                    final_text.append(str(content))
                else:
                    final_text.append(content)
        else:
            if isinstance(assistant_message.content, (dict, list)):
                final_text.append(str(assistant_message.content))
            else:
                final_text.append(assistant_message.content)

        return "\n".join(final_text)

    async def chat_loop(self):
        """Run an interactive chat loop."""
        print("\nMCP Client Started!")
        print("Type your queries or 'quit' to exit.")
        while True:
            try:
                query = input("\nQuery: ").strip()
                if query.lower() == 'quit':
                    break
                response = await self.process_query(query)
                print("\n" + response)
            except Exception as e:
                print(f"\nError: {str(e)}")


async def main():
    if len(sys.argv) < 2:
        print("Usage: uv run client.py <URL of SSE MCP server (i.e. http://localhost:8080/sse)>")
        sys.exit(1)

    client = MCPClient()
    try:
        await client.connect_to_sse_server(server_url=sys.argv[1])
        await client.chat_loop()
    finally:
        await client.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
```
Launch the client, pointing it at the server's SSE endpoint:
```bash
uv run client.py http://0.0.0.0:8020/sse
```
This completes the tutorial on building an MCP server and client from scratch with Python. If you have questions or suggestions, feel free to join our discussion group.
References:
- https://www.youtube.com/watch?v=Ek8JHgZtmcI
- https://serper.dev/
- https://modelcontextprotocol.io/quickstart/server
- https://modelcontextprotocol.io/quickstart/client
- https://docs.cursor.com/context/model-context-protocol