Simple HTTP MCP Server Implementation

This project provides a lightweight server implementation for the Model Context Protocol (MCP) over HTTP. It allows you to expose Python functions as tools and prompts that can be discovered and executed remotely via a JSON-RPC interface. It is intended to be used with a Starlette or FastAPI application (see demo).
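Under the hood, clients talk to the server with plain JSON-RPC 2.0 messages sent as HTTP POST bodies. As a rough sketch (the `tools/list` method name comes from the MCP specification; the `/mcp` mount path matches the examples later in this document), a discovery request body looks like this:

```python
import json

# A minimal JSON-RPC 2.0 request body for listing tools, as defined by the
# MCP specification. A client would POST this JSON to the mounted endpoint
# (e.g. /mcp in the examples below).
tools_list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

body = json.dumps(tools_list_request)
print(body)
```

The server replies with a JSON-RPC response whose result contains the tool names, descriptions, and JSON schemas derived from the Pydantic models.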

A complete example server for this project can be found in the tests/app/ folder.

Features

  • MCP Protocol Compliant: Implements the MCP specification for tool and prompt discovery and execution. Notifications are not supported.
  • HTTP and STDIO Transport: Uses HTTP (POST requests) or STDIO for communication.
  • Async Support: Built on Starlette or FastAPI for asynchronous request handling.
  • Type-Safe: Leverages Pydantic for robust data validation and serialization.
  • Server State Management: Access shared state through the lifespan context using the get_state_key method.
  • Request Access: Access the incoming request object from your tools and prompts.

Server Architecture

The library provides a single MCPServer class that uses lifespan to manage shared state across the entire application lifecycle.

MCPServer

The MCPServer is designed to work with Starlette's lifespan system for managing shared server state.

Key Characteristics:

  • Lifespan Based: Uses Starlette's lifespan events to initialize and manage shared server state
  • Application-Level State: State persists across the entire application lifecycle, not per-request
  • Flexible: Can be used with any custom context class stored in the lifespan state
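The lifespan pattern itself is plain Python: an async context manager yields the shared state once at startup, and the same object is visible to every request until shutdown. A framework-free sketch of the idea (the names here are illustrative, not part of the library):

```python
import asyncio
import contextlib
from collections.abc import AsyncIterator


@contextlib.asynccontextmanager
async def lifespan() -> AsyncIterator[dict]:
    # Runs once at startup; the yielded dict is the shared state.
    state = {"call_count": 0}
    yield state
    # Runs once at shutdown (close connections, flush buffers, ...).


async def main() -> list[int]:
    counts = []
    async with lifespan() as state:
        # Every "request" sees and mutates the same state object.
        for _ in range(3):
            state["call_count"] += 1
            counts.append(state["call_count"])
    return counts


print(asyncio.run(main()))  # -> [1, 2, 3]
```

Starlette drives exactly this kind of context manager around the application's lifetime, which is why state initialized in `lifespan` persists across requests instead of being rebuilt per call.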

Example Usage:

```python
import contextlib
from collections.abc import AsyncIterator
from dataclasses import dataclass, field
from typing import TypedDict

from starlette.applications import Starlette

from http_mcp.server import MCPServer


@dataclass
class Context:
    call_count: int = 0
    user_preferences: dict = field(default_factory=dict)


class State(TypedDict):
    context: Context


@contextlib.asynccontextmanager
async def lifespan(_app: Starlette) -> AsyncIterator[State]:
    yield {"context": Context()}


mcp_server = MCPServer(
    name="my-server",
    version="1.0.0",
    tools=my_tools,
    prompts=my_prompts,
)

app = Starlette(lifespan=lifespan)
app.mount("/mcp", mcp_server.app)
```

Tools

Tools are the functions that can be called by the client.

Example:

  1. Define the arguments and output for the tools:

```python
# app/tools/models.py
from pydantic import BaseModel, Field


class GreetInput(BaseModel):
    question: str = Field(description="The question to answer")


class GreetOutput(BaseModel):
    answer: str = Field(description="The answer to the question")

# Note: the description on Field will be passed when listing the tools.
# Having a description is optional, but it's recommended to provide one.
```

  2. Define the tools:

```python
# app/tools/tools.py
from http_mcp.types import Arguments

from app.tools.models import GreetInput, GreetOutput


def greet(args: Arguments[GreetInput]) -> GreetOutput:
    return GreetOutput(answer=f"Hello, {args.inputs.question}!")
```

```python
# app/tools/__init__.py
from http_mcp.types import Tool

from app.tools.models import GreetInput, GreetOutput
from app.tools.tools import greet

TOOLS = (
    Tool(
        func=greet,
        inputs=GreetInput,
        output=GreetOutput,
    ),
)

__all__ = ["TOOLS"]
```

  3. Instantiate the server:

```python
# app/main.py
from starlette.applications import Starlette

from http_mcp.server import MCPServer

from app.tools import TOOLS

mcp_server = MCPServer(tools=TOOLS, name="test", version="1.0.0")

app = Starlette()
app.mount(
    "/mcp",
    mcp_server.app,
)
```
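Once mounted, the greet tool can be invoked remotely with a `tools/call` request. A sketch of the JSON-RPC body a client would POST to /mcp (the method name and params shape follow the MCP specification; the argument keys mirror the GreetInput model above):

```python
import json

# JSON-RPC 2.0 body for invoking the "greet" tool defined above.
# "tools/call" and its params shape come from the MCP specification.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "greet",
        "arguments": {"question": "world"},
    },
}

body = json.dumps(call_request)
print(body)
```

The server validates `arguments` against the tool's Pydantic input model before calling the function, and serializes the returned output model into the JSON-RPC result.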

Server State Management

The server uses Starlette's lifespan system to manage shared state across the entire application lifecycle. State is initialized when the application starts and persists until it shuts down. Context is accessed through the get_state_key method on the Arguments object.

Example:

  1. Define a context class:

```python
# app/context.py
from dataclasses import dataclass, field


@dataclass
class Context:
    called_tools: list[str] = field(default_factory=list)

    def get_called_tools(self) -> list[str]:
        return self.called_tools

    def add_called_tool(self, tool_name: str) -> None:
        self.called_tools.append(tool_name)
```

  2. Set up the application with lifespan:

```python
import contextlib
from collections.abc import AsyncIterator
from typing import TypedDict

from starlette.applications import Starlette

from app.context import Context
from app.tools import TOOLS
from http_mcp.server import MCPServer


class State(TypedDict):
    context: Context


@contextlib.asynccontextmanager
async def lifespan(_app: Starlette) -> AsyncIterator[State]:
    yield {"context": Context(called_tools=[])}


mcp_server = MCPServer(
    tools=TOOLS,
    name="test",
    version="1.0.0",
)

app = Starlette(lifespan=lifespan)
app.mount("/mcp", mcp_server.app)
```

  3. Access the context in your tools:

```python
from pydantic import BaseModel, Field

from app.context import Context
from http_mcp.types import Arguments


class MyToolArguments(BaseModel):
    question: str = Field(description="The question to answer")


class MyToolOutput(BaseModel):
    answer: str = Field(description="The answer to the question")


async def my_tool(args: Arguments[MyToolArguments]) -> MyToolOutput:
    # Access the context from lifespan state
    context = args.get_state_key("context", Context)
    context.add_called_tool("my_tool")
    ...
    return MyToolOutput(answer=f"Hello, {args.inputs.question}!")
```
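To make the lookup concrete, here is a hypothetical sketch of what a `get_state_key`-style helper does: fetch a key from the lifespan state mapping and verify it has the expected type before handing it back. This is illustrative only, not the library's actual implementation:

```python
from dataclasses import dataclass, field
from typing import TypeVar

T = TypeVar("T")


def get_state_key(state: dict, key: str, expected_type: type[T]) -> T:
    # Hypothetical helper: look up a lifespan-state entry and type-check it,
    # so tools get a well-typed object instead of a bare dict value.
    value = state[key]
    if not isinstance(value, expected_type):
        raise TypeError(
            f"state[{key!r}] is {type(value).__name__}, "
            f"expected {expected_type.__name__}"
        )
    return value


@dataclass
class Context:
    called_tools: list[str] = field(default_factory=list)


state = {"context": Context()}
ctx = get_state_key(state, "context", Context)
ctx.called_tools.append("my_tool")
print(ctx.called_tools)  # -> ['my_tool']
```

Passing the expected type explicitly keeps the call site type-safe: a stale or misconfigured state entry fails loudly with a TypeError rather than surfacing as an attribute error deep inside a tool.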

Request Access

You can access the incoming request object from your tools. The request object is passed to each tool call and can be used to access headers, cookies, and other request data (e.g. request.state, request.scope).

```python
from pydantic import BaseModel, Field

from http_mcp.types import Arguments


class MyToolArguments(BaseModel):
    question: str = Field(description="The question to answer")


class MyToolOutput(BaseModel):
    answer: str = Field(description="The answer to the question")


async def my_tool(args: Arguments[MyToolArguments]) -> MyToolOutput:
    # Access the request
    auth_header = args.request.headers.get("Authorization")
    ...
    return MyToolOutput(answer=f"Hello, {args.inputs.question}!")
```

```python
# Use MCPServer:
from http_mcp.server import MCPServer
from http_mcp.types import Tool

mcp_server = MCPServer(
    name="my-server",
    version="1.0.0",
    tools=(Tool(func=my_tool, inputs=MyToolArguments, output=MyToolOutput),),
)
```

Prompts

Prompts are interactive templates that the user chooses to invoke. Like tools, prompts can access lifespan state.

  1. Define the arguments for the prompts:

```python
from pydantic import BaseModel, Field

from http_mcp.mcp_types.content import TextContent
from http_mcp.mcp_types.prompts import PromptMessage
from http_mcp.types import Arguments, Prompt


class GetAdvice(BaseModel):
    topic: str = Field(description="The topic to get advice on")
    include_actionable_steps: bool = Field(
        description="Whether to include actionable steps in the advice",
        default=False,
    )


def get_advice(args: Arguments[GetAdvice]) -> tuple[PromptMessage, ...]:
    """Get advice on a topic."""
    template = """
    You are a helpful assistant that can give advice on {topic}.
    """
    if args.inputs.include_actionable_steps:
        template += """
        The advice should include actionable steps.
        """
    return (
        PromptMessage(
            role="user",
            content=TextContent(
                text=template.format(topic=args.inputs.topic)
            ),
        ),
    )


PROMPTS = (
    Prompt(
        func=get_advice,
        arguments_type=GetAdvice,
    ),
)
```

  2. Use prompts with lifespan state:

```python
from pydantic import BaseModel, Field

from app.context import Context
from http_mcp.mcp_types.content import TextContent
from http_mcp.mcp_types.prompts import PromptMessage
from http_mcp.types import Arguments, Prompt


class GetAdvice(BaseModel):
    topic: str = Field(description="The topic to get advice on")


def get_advice_with_context(args: Arguments[GetAdvice]) -> tuple[PromptMessage, ...]:
    """Get advice on a topic with context awareness."""
    # Access the context from lifespan state
    context = args.get_state_key("context", Context)
    called_tools = context.get_called_tools()
    template = """
    You are a helpful assistant that can give advice on {topic}.
    Previously called tools: {tools}
    """
    return (
        PromptMessage(
            role="user",
            content=TextContent(
                text=template.format(
                    topic=args.inputs.topic,
                    tools=", ".join(called_tools) if called_tools else "none",
                )
            ),
        ),
    )


PROMPTS_WITH_CONTEXT = (
    Prompt(
        func=get_advice_with_context,
        arguments_type=GetAdvice,
    ),
)
```

  3. Instantiate the server:

```python
from starlette.applications import Starlette

from app.prompts import PROMPTS
from http_mcp.server import MCPServer

app = Starlette()
mcp_server = MCPServer(tools=(), prompts=PROMPTS, name="test", version="1.0.0")
app.mount(
    "/mcp",
    mcp_server.app,
)
```
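As with tools, clients retrieve a rendered prompt over JSON-RPC. A hedged sketch of the request body for the get_advice prompt above (the `prompts/get` method and params shape follow the MCP specification, which passes prompt argument values as strings):

```python
import json

# JSON-RPC 2.0 body for rendering the "get_advice" prompt defined above.
# "prompts/get" and its params shape come from the MCP specification.
get_prompt_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "prompts/get",
    "params": {
        "name": "get_advice",
        "arguments": {"topic": "testing"},
    },
}

body = json.dumps(get_prompt_request)
print(body)
```

The result contains the tuple of PromptMessage objects produced by the prompt function, serialized as MCP message content.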