We provide all the information about MCP servers via our MCP API.
curl -X GET 'https://glama.ai/api/mcp/v1/servers/trailofbits/mcp-context-protector'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
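For reference, the same endpoint can be queried from Python. This is a minimal sketch using only the standard library; the response is printed as raw JSON because its exact schema is not shown here.

import json
import urllib.request

# Glama MCP directory endpoint for this server (same URL as the curl example above)
URL = "https://glama.ai/api/mcp/v1/servers/trailofbits/mcp-context-protector"

with urllib.request.urlopen(URL) as response:
    # Pretty-print whatever JSON the API returns
    print(json.dumps(json.load(response), indent=2))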
simple_downstream_server.py (781 B)
"""
Simple downstream MCP server with an echo tool.
Uses fastmcp from the official Python SDK for MCP.
"""
from typing import Any
from mcp.server.fastmcp import FastMCP
# Echo handler function
async def echo_handler(message: str) -> dict[str, Any]:
"""
Echo handler function that returns the input message.
Args:
params: A dictionary containing the parameters from the request.
Expected to have a 'message' key with a string value.
Returns:
A dictionary with the 'echo_message' key containing the input message.
"""
return {"echo_message": message}
# Create the server
app = FastMCP()
# Register the tool
app.add_tool(echo_handler, "echo")
# Run the server if executed directly
if __name__ == "__main__":
app.run()
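To try the echo tool, the server can be launched over stdio and called with the client classes from the same MCP Python SDK. This is a minimal sketch, assuming the file is saved as simple_downstream_server.py and started with a local python interpreter; the "hello" message is just a placeholder.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the downstream server as a stdio subprocess (command and path are assumptions)
server_params = StdioServerParameters(
    command="python",
    args=["simple_downstream_server.py"],
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Invoke the "echo" tool registered above with a sample message
            result = await session.call_tool("echo", {"message": "hello"})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())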