
MCP Python Server — API Wrapper

by cdryampi


This project implements an MCP server in Python that exposes a tool for querying an external API. It is compatible with clients such as Claude Desktop and ChatGPT Desktop that support the Model Context Protocol (MCP).

✨ Features

  • Exposes a tool via MCP

  • Makes HTTP requests to an external API

  • Direct integration with Claude Desktop via claude.json


🚀 Requirements

  • Python 3.9+

  • mcp[cli] (installable via pip or uv)

  • Claude Desktop or ChatGPT Desktop (with MCP support)


📁 Project structure

.
├── servidores/profile.py   # MCP server with tools for interacting with my resume backend.
├── server.py               # MCP server with the "consultar_api" tool.
├── .env                    # Optional variables for auth/API.
├── claude.json             # MCP config for direct integration.
└── README.md               # This document.

⚙️ Installation

With pip

pip install "mcp[cli]"

With uv (recommended)

uv init mcp-api-server
cd mcp-api-server
uv add "mcp[cli]"

Install the MCP server

mcp install mi_script.py

Installation with .env

mcp install mi_script.py -f .env

Installation of dependencies

pip install -r requirements.txt

Environment variables

Create a .env file in the project root to define optional environment variables:

# .env
API_KEY=mi_api_key
API_URL=https://miapi.com/consulta
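If the server needs these values at runtime, a minimal sketch of reading them could look like the following (loading them with os.getenv; the python-dotenv import is an optional extra that this project does not declare):

import os

# Optionally load .env during local development (requires the extra
# package python-dotenv, which is an assumption, not a project dependency).
try:
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass

API_URL = os.getenv("API_URL", "https://miapi.com/consulta")
API_KEY = os.getenv("API_KEY")  # may be None when no auth is needed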

👷 Quickstart

Create the server (server.py)

from mcp.server.fastmcp import FastMCP
import httpx

mcp = FastMCP("API Wrapper")

@mcp.tool(description="Queries an external API")
async def consultar_api(param: str) -> str:
    """Query an external API with a parameter and return the response."""
    async with httpx.AsyncClient() as client:
        r = await client.get(f"https://miapi.com/consulta?param={param}")
        return r.text
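As a possible refinement (not part of the original server), you could let httpx handle URL encoding via its params argument, add a timeout, and raise on HTTP errors:

@mcp.tool(description="Queries an external API")
async def consultar_api(param: str) -> str:
    """Query the external API, letting httpx encode the query string."""
    async with httpx.AsyncClient(timeout=10.0) as client:
        r = await client.get("https://miapi.com/consulta", params={"param": param})
        r.raise_for_status()  # fail loudly instead of returning an error body
        return r.text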

Run locally in dev mode

mcp dev server.py

Run in production mode

mcp run server.py

Or with uv:

uv run --with "mcp[cli]" mcp run server.py

🚀 Integration with Claude Desktop

Locate the Claude Desktop configuration file (claude_desktop_config.json):

  • On Windows: %APPDATA%\Claude\claude_desktop_config.json

  • On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

Example:

{ "mcpServers": { "filesystem": { "command": "npx", "args": [ "-y", "@modelcontextprotocol/server-filesystem", "/codigo/backend-curso-inkor/proyectos_memes" ] }, "Demo": { "command": "uv", "args": [ "run", "--with", "mcp[cli]", "mcp", "run", "C:\\codigo\\backend-curso-inkor\\MCP\\server.py" ] } } }

🤖 Use within Claude Desktop

You can ask the model:

Use the consultar_api tool with the "ping" parameter

And the model will use your MCP server to make a real-time HTTP call.


🎁 Bonus: Tools Extension

@mcp.tool()
async def traducir(texto: str, lang: str) -> str:
    return f"Traducido: {texto} → {lang}"
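Besides tools, FastMCP can also expose read-only resources that clients can fetch directly; a minimal sketch (the config://app URI is only an illustration):

@mcp.resource("config://app")
def app_config() -> str:
    """Static configuration exposed as a readable MCP resource."""
    return "API Wrapper v1.0"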

🔍 Resources


✅ Made with love and httpx 🚀
