
MCP Tool Server

A lightweight Model Context Protocol (MCP) Tool Server built in Python that exposes real-world tools (weather, stock data, and internet search) over STDIO transport. The server is designed to be discovered and invoked by MCP-compatible clients and future AI agents, and is validated using the official MCP Inspector.


🚀 Overview

This project demonstrates how to build a correct and production-aligned MCP server that:

  • Exposes reusable tools via MCP

  • Integrates real external APIs

  • Uses clear tool schemas and contracts

  • Separates protocol logic from backend logic

  • Can be directly consumed by AI agents in the future

The focus of this project is the tool layer, not agent reasoning. It intentionally stops at the MCP boundary.


🧠 Architecture & Approach

The design follows a clean separation of responsibilities:

MCP Client / Inspector
        │
        │ (STDIO)
        ▼
MCP Server (server/main.py)
        │
        ├── Tool Definitions (server/tools/)
        │      ├── Weather Tool
        │      ├── Stock Price Tool
        │      └── Web Search Tool
        │
        └── Backend Logic (server/backend/data_store.py)
               ├── OpenWeather API
               ├── Stooq Market Data
               └── Google Custom Search

  • MCP Server handles protocol wiring and tool registration

  • Tools define schemas and execution boundaries

  • Backend layer contains all external API logic

  • No agent logic is included (by design)

This mirrors how real AI platforms expose tools internally.
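As a rough illustration of this layering, the entry point could look something like the sketch below, using the official Python SDK's FastMCP helper. The module paths, the server name, and the fetch_weather helper are illustrative assumptions, not the repository's actual source.

# server/main.py (sketch only; the real implementation may differ)
from mcp.server.fastmcp import FastMCP

from server.backend import data_store   # backend layer owns all external API calls

mcp = FastMCP("mcp-tool-server")

@mcp.tool()
def get_weather(city: str) -> dict:
    """Current weather for a city; delegates straight to the backend layer."""
    return data_store.fetch_weather(city)

if __name__ == "__main__":
    mcp.run(transport="stdio")   # STDIO transport, discoverable by the Inspector and MCP clients

The tool function carries only the schema (its signature and docstring); everything that touches the network lives in the backend module.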


🛠️ Tools Implemented

🌦️ Weather Tool (get_weather)

  • Fetches real-time weather data by city

  • Powered by OpenWeatherMap

  • Returns structured, agent-friendly JSON (see the backend sketch below)
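A minimal sketch of the backend call behind get_weather, assuming the standard OpenWeatherMap current-weather endpoint with metric units; the fetch_weather name and the exact field mapping are assumptions, not the repository's actual code.

# server/backend/data_store.py (weather sketch)
import os
import requests

def fetch_weather(city: str) -> dict:
    """Fetch current weather from OpenWeatherMap and return agent-friendly JSON."""
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"q": city, "appid": os.environ["OPENWEATHER_API_KEY"], "units": "metric"},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    return {
        "city": city,
        "temperature": data["main"]["temp"],
        "conditions": data["weather"][0]["description"].capitalize(),
        "humidity": data["main"]["humidity"],
        "wind_speed": data["wind"]["speed"],
    }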

📈 Stock Price Tool (get_stock_price)

  • Retrieves stock market data for a given symbol

  • Uses Stooq public market data (no API key required)

  • Automatically normalizes symbols (e.g. AAPL → aapl.us); see the lookup sketch below
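A hedged sketch of the Stooq lookup, assuming its public CSV quote endpoint; the fetch_stock_price name is illustrative, and the real tool returns additional fields (currency, change, change percent) that this sketch omits.

# server/backend/data_store.py (stock quote sketch; Stooq needs no API key)
import csv
import io
import requests

def fetch_stock_price(symbol: str) -> dict:
    """Fetch a delayed quote from Stooq, normalizing bare US tickers to the .us suffix."""
    normalized = symbol.lower()
    if "." not in normalized:
        normalized += ".us"                      # e.g. AAPL -> aapl.us
    url = f"https://stooq.com/q/l/?s={normalized}&f=sd2t2ohlcv&h&e=csv"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    row = next(csv.DictReader(io.StringIO(resp.text)))
    return {
        "symbol": symbol.upper(),
        "price": float(row["Close"]),
        "timestamp": f'{row["Date"]}T{row["Time"]}',
    }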

🌐 Web Search Tool (web_search)

  • Performs internet search using Google Custom Search (request sketched after this list)

  • Uses official Google APIs (no scraping)

  • Returns clean search results with title, snippet, and link

  • Result count is relevance-based and API-controlled
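A sketch of the underlying call, assuming the Google Custom Search JSON API; the run_web_search name and the parameter handling (such as capping num at the API's per-request limit of 10) are assumptions about the implementation.

# server/backend/data_store.py (web search sketch; Google Custom Search JSON API)
import os
import requests

def run_web_search(query: str, num_results: int = 5) -> dict:
    """Run a Custom Search query and return title / link / snippet triples."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={
            "key": os.environ["GOOGLE_API_KEY"],
            "cx": os.environ["GOOGLE_CSE_ID"],
            "q": query,
            "num": min(num_results, 10),         # the API allows at most 10 results per request
        },
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    return {
        "results": [
            {"title": i["title"], "link": i["link"], "snippet": i.get("snippet", "")}
            for i in items
        ]
    }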


📂 Project Structure

TASK1-MCP-SERVER
│
├── client/
│   └── mcp_client.py
│
├── server/
│   ├── main.py                 # MCP server entry point
│   │
│   ├── backend/
│   │   └── data_store.py       # External API integrations
│   │
│   └── tools/
│       ├── get_weather.py
│       ├── get_stock_price.py
│       └── web_search.py
│
├── .env                        # API keys (not committed)
├── requirements.txt
└── README.md

⚙️ Prerequisites

  • Python 3.10+

  • Node.js (for MCP Inspector)

  • OpenWeatherMap API key

  • Google Custom Search API key + CSE ID


📦 Installation

# Clone repository
git clone <your-repo-url>
cd TASK1-MCP-SERVER

# Create virtual environment
python -m venv venv
source venv/bin/activate      # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

🔐 Environment Configuration

Create a .env file in the project root:

OPENWEATHER_API_KEY=your_openweather_api_key
GOOGLE_API_KEY=your_google_api_key
GOOGLE_CSE_ID=your_custom_search_engine_id

Best practices:

  • .env is excluded from version control

  • No secrets are hardcoded

  • Server fails safely if keys are missing (see the startup check sketched below)
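One way to realize that fail-safe behavior is to validate the keys at startup; this sketch assumes python-dotenv is among the dependencies in requirements.txt.

# Startup check sketch: load .env and fail fast if a key is missing
import os
from dotenv import load_dotenv

load_dotenv()                                    # reads .env from the project root, if present

REQUIRED_KEYS = ("OPENWEATHER_API_KEY", "GOOGLE_API_KEY", "GOOGLE_CSE_ID")
missing = [key for key in REQUIRED_KEYS if not os.getenv(key)]
if missing:
    raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")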


▶️ Running the MCP Server

Start the server using the MCP Inspector:

npx @modelcontextprotocol/inspector python server/main.py

The server runs over STDIO and exposes all tools automatically.
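The repository also ships a client/mcp_client.py; a minimal programmatic equivalent using the official Python SDK's STDIO client could look like the sketch below. The tool name and arguments match this server; everything else is illustrative.

# client/mcp_client.py (sketch): connect over STDIO, list tools, call one
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(command="python", args=["server/main.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()   # discover the registered tools
            print([tool.name for tool in tools.tools])
            result = await session.call_tool("get_weather", {"city": "San Francisco"})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())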


🧪 Testing with MCP Inspector

Using the Inspector UI:

  1. Select STDIO transport

  2. Point to server/main.py

  3. Start the server

  4. Invoke tools interactively

Example Tool Invocations

🌐 Web Search Tool

Input:

{
  "query": "Model Context Protocol MCP",
  "num_results": 5
}

Output:

{ "results": [ { "title": "Model Context Protocol Documentation", "link": "https://modelcontextprotocol.io/", "snippet": "The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs..." }, { "title": "Introducing the Model Context Protocol", "link": "https://anthropic.com/news/model-context-protocol", "snippet": "Today, we're introducing the Model Context Protocol (MCP), a new standard for connecting AI assistants..." } ] }

🌦️ Weather Tool

Input:

{ "city": "San Francisco" }

Output:

{ "city": "San Francisco", "temperature": 18.5, "conditions": "Clear sky", "humidity": 65, "wind_speed": 3.5 }

📈 Stock Price Tool

Input:

{ "symbol": "AAPL" }

Output:

{ "symbol": "AAPL", "price": 195.89, "currency": "USD", "timestamp": "2024-01-15T16:00:00", "change": "+2.34", "change_percent": "+1.21%" }

🚧 Why STDIO Transport?

STDIO was chosen because it provides the simplest, most reliable path for tool development and validation.

What STDIO Gives You

  • Zero configuration: No ports, no networking, no HTTP servers to manage

  • Perfect for inspection: MCP Inspector works flawlessly with STDIO

  • Deterministic lifecycle: Process starts when called, exits when done

  • Secure by default: No network endpoints are exposed, so there is nothing extra to lock down

  • Easy debugging: Direct input/output makes testing and troubleshooting straightforward

Why Not HTTP/SSE?

While HTTP and Server-Sent Events (SSE) transports are valid MCP options, they introduce unnecessary complexity for a tool server:

HTTP Transport Issues:

  • Requires managing a persistent web server alongside MCP logic

  • Adds lifecycle complexity (when to start/stop, connection pooling, etc.)

  • Makes local testing harder: you need an HTTP client, port management, and CORS handling

  • Overkill for simple tool execution that doesn't need persistent connections

SSE Transport Issues:

  • Designed for streaming real-time updates, not one-shot tool calls

  • Requires long-lived connections and complex client-side stream handling

  • Harder to debug tool execution due to streaming semantics

  • More complex error recovery and retry logic

  • Inspector support is less mature

When to Use Other Transports

  • HTTP: When you need remote deployment or multiple clients calling the server simultaneously

  • SSE: When building streaming AI agents that need real-time, progressive responses

For a foundational tool server focused on correctness and reliability, STDIO is the right choice. You can always add HTTP transport later without changing any tool implementations.
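To make that last point concrete, switching transports in the FastMCP sketch from earlier would be a one-line change, assuming an SDK version that supports SSE or streamable HTTP; the tool functions stay untouched.

# server/main.py (sketch): same tools, different transport
import sys

if __name__ == "__main__":
    # "stdio" for local Inspector runs; "sse" (or "streamable-http" on newer SDKs) for remote clients
    transport = "sse" if "--remote" in sys.argv else "stdio"
    mcp.run(transport=transport)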


💡 What This Project Intentionally Excludes

  • AI agent logic

  • LangChain / LangGraph workflows

  • RAG pipelines

  • Memory or planning systems

Those layers are meant to sit on top of this server, not inside it.


🔮 Future Extensions

This server can be extended with:

  • AI agents that dynamically discover and call tools

  • LangChain or LangGraph integration

  • RAG pipelines grounded in web search

  • Stateful or memory-based agents

  • HTTP transport for remote deployment

No changes to existing tools are required.


✅ Key Takeaways

  • Correct MCP server implementation using STDIO transport

  • Real external integrations (weather, stocks, search)

  • Clean tool contracts with clear input/output schemas

  • Production-style separation of concerns

  • Agent-ready foundation that can scale to complex workflows

