MCP Weather Server & Travel Agent
A learning project that builds a local Model Context Protocol (MCP) server backed by the free OpenWeatherMap API, a diagnostic MCP client, and an AI-powered travel-planning agent that combines MCP tool execution with OpenAI function calling.
Business Context
The project answers a practical question: How can an LLM fetch live, external data — and act on it — through a standardised protocol?
MCP lets any AI host (Cursor, Claude Desktop, custom agents) discover and invoke server-provided Tools, read Resources, and retrieve reusable Prompt templates without hard-coding integrations. This project wires that idea end-to-end using real-time weather data.
What you can do with it
| Capability | Example |
| --- | --- |
| Ask Cursor for live weather | "What's the weather in Tokyo?" — Cursor calls `get_weather` |
| Generate a travel plan | `uv run travel_agent.py "Tokyo" 5` |
| Explore all three MCP primitives | `uv run client.py` |
Project Structure
MCP2/
├── server.py # MCP server — exposes tools, resources, prompts
├── client.py # Diagnostic MCP client — exercises every primitive
├── travel_agent.py # AI travel agent — MCP + OpenAI function calling
├── main.py # Scaffold entry point (placeholder)
├── .env # API keys (gitignored)
├── .python-version # Pins Python 3.14
├── pyproject.toml # Project metadata & dependencies (managed by uv)
├── uv.lock # Locked dependency graph
├── .gitignore
└── .cursor/
    └── mcp.json      # Cursor IDE MCP server config

Architecture & End-to-End Flows
Component Overview
┌─────────────────────────────────────────────────────────────────┐
│ MCP HOST / CLIENT │
│ │
│ ┌──────────┐ ┌──────────────┐ ┌────────────────────────┐ │
│ │ Cursor │ │ client.py │ │ travel_agent.py │ │
│ │ IDE │ │ (diagnostic) │ │ (AI agent) │ │
│ └────┬─────┘ └──────┬───────┘ └───────────┬────────────┘ │
│ │ │ │ │
│ │ stdio │ stdio │ stdio │
│ └────────┬───────┘────────────────────────┘ │
│ │ │
├────────────────┼────────────────────────────────────────────────┤
│ ▼ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ server.py (FastMCP) │ │
│ │ │ │
│ │ TOOLS RESOURCES PROMPTS │ │
│ │ ───── ───────── ─────── │ │
│ │ get_weather weather://cities weather_report │ │
│ │ get_forecast weather://help travel_advisory │ │
│ │ │ │
│ │ fetch_weather(endpoint, params) │ │
│ └──────────────────────┬────────────────────────────────┘ │
│ │ httpx (async) │
│ ▼ │
│ ┌──────────────────────────────┐ │
│ │ OpenWeatherMap REST API │ │
│ │ /weather /forecast │ │
│ └──────────────────────────────┘ │
└─────────────────────────────────────────────────────────────────┘

Flow 1 — Cursor IDE (interactive usage)
User types in Cursor:
"What's the weather in Bangalore?"
1. Cursor reads .cursor/mcp.json
2. Spawns `uv run server.py` as a child process (stdio transport)
3. MCP handshake: Cursor sends `initialize`, server responds with capabilities
4. Cursor discovers tools via `tools/list` → learns about get_weather, get_forecast
5. The LLM decides get_weather(city="Bangalore") is needed
6. Cursor sends `tools/call` → server receives request
7. server.py → fetch_weather("weather", {"q": "Bangalore"})
→ httpx GET https://api.openweathermap.org/data/2.5/weather?q=Bangalore&appid=…&units=metric
8. OpenWeatherMap returns JSON → server formats a string → returns via MCP
9. Cursor displays the result to the user

Flow 2 — Diagnostic Client (uv run client.py)
The client runs a single, linear script that exercises every MCP primitive:
1. Launch server.py as subprocess via stdio_client(StdioServerParameters)
2. ClientSession handshake → session.initialize()
┌──────────────────────────────────────────────────┐
│ TOOLS │
│ a. list_tools() → enumerate all tools │
│ b. call_tool("get_weather", {city: "Toronto"}) │
│ c. call_tool("get_forecast", {city, days: 3}) │
├──────────────────────────────────────────────────┤
│ RESOURCES │
│ a. list_resources() → enumerate all resources │
│ b. read_resource("weather://cities") │
│ c. read_resource("weather://help") │
├──────────────────────────────────────────────────┤
│ PROMPTS │
│ a. list_prompts() → enumerate all prompts │
│ b. get_prompt("weather_report", {city: "Tokyo"}) │
│ c. get_prompt("travel_advisory", {city, days}) │
└──────────────────────────────────────────────────┘
3. Results printed to terminal; process exits

Flow 3 — Travel Planner Agent (uv run travel_agent.py "Tokyo" 5)
This is the most complex flow. It implements an agentic loop where GPT decides which MCP tools to call and when to stop.
1. Parse CLI args (city, days)
2. Connect to Weather MCP server via stdio (same as client.py)
3. MCP handshake → session.initialize()
4. Tool discovery:
a. session.list_tools() → get MCP tool definitions
b. mcp_tools_to_openai_functions() → translate MCP inputSchema
into OpenAI function-calling format
5. Resource prefetch:
a. session.list_resources()
b. Read all resources (cities list, help text) → inject as system context
6. Build initial message list:
┌────────────────────────────────────────────────────────┐
│ system: SYSTEM_PROMPT (travel planner personality) │
│ system: resource context (cities list, help text) │
│ user: "Plan a 5-day trip to Tokyo. Use the │
│ weather tools to check conditions..." │
└────────────────────────────────────────────────────────┘
7. Agent reasoning loop (max 4 iterations):
┌───────────────────────────────────────────────────────────┐
│ LOOP START │
│ │
│ a. Send messages + tool definitions to OpenAI API │
│ → openai_client.chat.completions.create( │
│ model, messages, tools, tool_choice="auto") │
│ │
│ b. IF response contains tool_calls: │
│ ┌─────────────────────────────────────────────────┐ │
│ │ For each tool_call: │ │
│ │ - Parse function name + arguments │ │
│ │ - Execute via MCP: │ │
│ │ session.call_tool(name, args) │ │
│ │ - Extract text from MCP result │ │
│ │ - Append tool result to messages │ │
│ └─────────────────────────────────────────────────┘ │
│ → Continue loop (LLM sees tool results next round) │
│ │
│ c. ELSE IF response contains text content: │
│ → This is the FINAL answer. Print travel plan. BREAK │
│ │
│ d. ELSE (empty response): │
│ → Retry │
│ │
│ LOOP END │
└───────────────────────────────────────────────────────────┘
8. Print travel plan with iteration count

Typical execution: 3 LLM calls — (1) the LLM requests get_weather, (2) the LLM requests get_forecast, (3) the LLM produces the final travel plan using both results.
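The MCP → OpenAI translation in step 4b can be sketched as follows. This is a minimal illustrative version — the real `mcp_tools_to_openai_functions()` in `travel_agent.py` may differ in details, and tool definitions are shown here as plain dicts rather than SDK objects:

```python
def mcp_tools_to_openai_functions(mcp_tools: list[dict]) -> list[dict]:
    """Translate MCP tool definitions into OpenAI function-calling format.

    Each MCP tool carries a JSON Schema in `inputSchema`, which maps
    almost directly onto OpenAI's `tools[].function.parameters`.
    """
    openai_tools = []
    for tool in mcp_tools:
        openai_tools.append({
            "type": "function",
            "function": {
                "name": tool["name"],
                "description": tool.get("description", ""),
                # Fall back to an empty object schema if the tool takes no input
                "parameters": tool.get("inputSchema",
                                       {"type": "object", "properties": {}}),
            },
        })
    return openai_tools
```

The near 1:1 mapping is why the translation layer stays so small: both sides speak JSON Schema for parameters.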
MCP Primitives Reference
Tools (callable functions)
| Tool | Parameters | Returns |
| --- | --- | --- |
| `get_weather` | `city` | Current temp, feels-like, humidity, wind, description |
| `get_forecast` | `city`, `days` | One line per day with temp and description |
Both call fetch_weather() internally, which appends the API key and units=metric,
then makes an async httpx.GET to the OpenWeatherMap endpoint. Errors are caught
and returned as user-friendly strings (not exceptions).
Resources (read-only context)
| URI | Description |
| --- | --- |
| `weather://cities` | JSON array of 10 example city names |
| `weather://help` | Human-readable help text listing all capabilities |
Resources are synchronous and return static/computed strings.
Prompts (reusable templates)
| Prompt | Parameters | What it generates |
| --- | --- | --- |
| `weather_report` | `city` | Instruction for the LLM to fetch weather + forecast and summarise |
| `travel_advisory` | `city`, `days` | Instruction for the LLM to build a packing/activity/warning list |
Prompts return instruction strings; they do not call tools themselves.
Note: MCP prompt arguments are `dict[str, str]` on the wire. When calling `get_prompt(...)` from the Python SDK, pass all values as strings (e.g. `{"days": "5"}`, not `{"days": 5}`).
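A small client-side illustration of that rule. The coercion helper is hypothetical (not part of this project); the commented call mirrors how `client.py` drives the session:

```python
def stringify_prompt_args(args: dict) -> dict:
    """MCP prompt arguments travel as dict[str, str], so coerce every value."""
    return {key: str(value) for key, value in args.items()}


# Inside an active ClientSession (as in client.py), one would then call:
#   await session.get_prompt(
#       "travel_advisory",
#       arguments=stringify_prompt_args({"city": "Tokyo", "days": 5}),
#   )
```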
Tech Stack

| Layer | Technology |
| --- | --- |
| Language | Python 3.14 |
| Package manager | `uv` |
| MCP SDK | `mcp` (FastMCP) |
| HTTP client | `httpx` |
| LLM (travel agent) | OpenAI |
| Config loading | `python-dotenv` (`.env`) |
| External API | OpenWeatherMap 2.5 (free tier) |
Prerequisites
Python >= 3.14
uv installed and on your PATH
OpenWeatherMap API key — free at https://openweathermap.org/appid
OpenAI API key (only needed for `travel_agent.py`)
Setup
Install dependencies:
uv sync

Create `.env` in the project root:
OPENWEATHER_API_KEY=your_openweather_key
OPENWEATHER_BASE_URL=https://api.openweathermap.org/data/2.5
OPENAI_API_KEY=sk-... # only needed for travel_agent.py
`.env` is gitignored. Never commit API keys.
Running
MCP Server (standalone / via Cursor)
uv run server.py

The server starts on stdio and waits for MCP messages. You don't interact with it directly in the terminal — it's designed to be driven by an MCP host (Cursor, the client, or the travel agent).
Diagnostic Client
uv run client.py

Runs through tools, resources, and prompts sequentially and prints results.
Travel Planner Agent
uv run travel_agent.py "Tokyo" 5
uv run travel_agent.py "Paris" 3
uv run travel_agent.py "Toronto"    # defaults to 3 days

Connects to the MCP server, lets GPT reason with real weather data, and outputs a day-by-day travel plan.
MCP Inspector (interactive debugging)
npx @modelcontextprotocol/inspector uv run server.py

Opens a web UI to manually invoke tools, read resources, and test prompts.
Cursor Integration
Project-level config lives at .cursor/mcp.json:
{
"mcpServers": {
"weather": {
"type": "stdio",
"command": "uv",
"args": ["run", "${workspaceFolder}/server.py"],
"envFile": "${workspaceFolder}/.env"
}
}
}

`${workspaceFolder}` is resolved by Cursor to the directory containing `.cursor/mcp.json`. `envFile` injects `.env` variables into the spawned server process. After editing this file, reload MCP servers or restart Cursor.
Once loaded, Cursor's agent can call get_weather and get_forecast directly
when you ask weather-related questions in chat.
Key Technical Details
Transport: All three clients (Cursor, `client.py`, `travel_agent.py`) connect over stdio. The server is launched as a subprocess; MCP messages flow over stdin/stdout as JSON-RPC.

Async throughout: `server.py` uses `async def` for tool handlers and `httpx.AsyncClient` for non-blocking HTTP. `client.py` and `travel_agent.py` run inside `asyncio.run()`.

MCP → OpenAI schema translation: `travel_agent.py` converts MCP `inputSchema` (JSON Schema) to OpenAI's `tools[].function.parameters` format. The schemas are nearly identical by design.

Agent loop safety: The travel agent caps the reasoning loop at 4 iterations to prevent runaway API calls. Typical runs complete in 2–3 iterations.

Error handling: Tool handlers catch `httpx.HTTPStatusError` and generic exceptions, returning error strings instead of raising. This keeps the MCP session alive even if the upstream API fails.

Forecast de-duplication: OpenWeatherMap's `/forecast` returns 3-hour intervals. `get_forecast` deduplicates by date, picking the first entry per calendar day.
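The de-duplication step can be sketched as a pure function. This is illustrative only; the field name follows OpenWeatherMap's `/forecast` response, where each entry carries a `dt_txt` timestamp like `"2024-05-01 12:00:00"`:

```python
def first_entry_per_day(forecast_entries: list[dict]) -> list[dict]:
    """Keep the first 3-hour entry for each calendar day, in order."""
    seen_dates = set()
    daily = []
    for entry in forecast_entries:
        date = entry["dt_txt"].split(" ")[0]  # the "YYYY-MM-DD" portion
        if date not in seen_dates:
            seen_dates.add(date)
            daily.append(entry)
    return daily
```

Collapsing eight 3-hour slots per day down to one keeps the tool's output short enough to feed back into the LLM's context.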
Gotchas
`get_prompt(...)` arguments must be strings in the Python MCP SDK (e.g. `{"days": "5"}`, not `{"days": 5}`).

The free OpenWeatherMap tier has rate limits (~60 calls/min). The travel agent makes 2 API calls per run.

`travel_agent.py` requires `OPENAI_API_KEY` in `.env`. The server and diagnostic client do not.
License
No license specified.