
exec_mcp_tool

Executes a specified tool on a target MCP server, given the server name, tool name, and parameters. Enables remote tool execution and service discovery through the MCP Router.

Instructions

Execute a tool on a target MCP server.

Args:
    target_server_name: Name of the target server
    target_tool_name: Name of the tool to execute
    parameters: Parameters for the tool

Input Schema

Name                 Required  Description                  Default
parameters           Yes       Parameters for the tool
target_server_name   Yes       Name of the target server
target_tool_name     Yes       Name of the tool to execute
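
To make the schema concrete, the sketch below shows one way a client could invoke exec_mcp_tool with a JSON-RPC "tools/call" request over HTTP, mirroring the _execute_http_tool helper under Implementation Reference. The router endpoint, target server name, target tool name, and parameters are illustrative assumptions, not values shipped with this project.

    # Hedged sketch: calling exec_mcp_tool on the router via JSON-RPC.
    # ROUTER_ENDPOINT and the argument values are placeholders.
    import asyncio
    import json

    import httpx

    ROUTER_ENDPOINT = "http://localhost:8000/mcp"  # placeholder router URL

    async def call_exec_mcp_tool() -> dict:
        request = {
            "jsonrpc": "2.0",
            "method": "tools/call",
            "params": {
                "name": "exec_mcp_tool",
                "arguments": {
                    "target_server_name": "weather",     # placeholder server
                    "target_tool_name": "get_forecast",  # placeholder tool
                    "parameters": {"city": "Beijing"},   # placeholder parameters
                },
            },
            "id": 1,
        }
        async with httpx.AsyncClient() as client:
            response = await client.post(
                ROUTER_ENDPOINT,
                json=request,
                headers={"Content-Type": "application/json"},
            )
            return response.json()

    if __name__ == "__main__":
        print(json.dumps(asyncio.run(call_exec_mcp_tool()), ensure_ascii=False, indent=2))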

Implementation Reference

  • The core handler function for exec_mcp_tool, registered via the @mcp.tool() decorator. It retrieves the target server's endpoint from the discovery service and dispatches to _execute_sse_tool or _execute_http_tool depending on the endpoint type. A hedged sketch of the discovery-service interface it assumes appears after this list.
    @mcp.tool()
    async def exec_mcp_tool(
        target_server_name: str,
        target_tool_name: str,
        parameters: Dict[str, Any]
    ) -> str:
        """
        Execute a tool on a target MCP server.

        Args:
            target_server_name: Name of the target server
            target_tool_name: Name of the tool to execute
            parameters: Parameters for the tool
        """
        try:
            # Get target server endpoint
            target_endpoint = discovery_service.get_server_endpoint(target_server_name)

            # Check if this is an SSE service (like AMap)
            if "sse" in target_endpoint.lower() and HAVE_SSE_SUPPORT:
                # Use SSE connection for services that require it
                return await _execute_sse_tool(target_endpoint, target_tool_name, parameters)
            else:
                # Use HTTP POST for standard MCP services
                return await _execute_http_tool(target_endpoint, target_tool_name, parameters)
        except Exception as e:
            return json.dumps({"error": str(e)}, ensure_ascii=False)
  • Pydantic BaseModel defining the input schema (parameters) for the exec_mcp_tool tool, matching the function signature. A short validation example follows this list.
    class ExecToolRequest(BaseModel):
        target_server_name: str
        target_tool_name: str
        parameters: Dict[str, Any]
  • Helper function for executing tools on SSE-based MCP servers using ClientSession and sse_client. An equivalent variant using nested async with blocks is sketched after this list.
    async def _execute_sse_tool(endpoint: str, tool_name: str, parameters: Dict[str, Any]) -> str:
        """
        Execute a tool on a target MCP server using SSE connection.

        Args:
            endpoint: The SSE endpoint URL
            tool_name: Name of the tool to execute
            parameters: Parameters for the tool
        """
        exit_stack = AsyncExitStack()
        try:
            # Create SSE client
            sse_cm = sse_client(endpoint)
            streams = await exit_stack.enter_async_context(sse_cm)

            # Create session
            session_cm = ClientSession(streams[0], streams[1])
            session = await exit_stack.enter_async_context(session_cm)

            # Initialize session
            await session.initialize()

            # Execute tool
            result = await session.call_tool(tool_name, parameters)

            # Convert result to dict if it's a CallToolResult object
            if hasattr(result, '_asdict'):
                result = result._asdict()
            elif hasattr(result, '__dict__'):
                result = result.__dict__

            # Return result as JSON
            return json.dumps(result, ensure_ascii=False, default=str)
        finally:
            # Clean up
            await exit_stack.aclose()
  • Helper function for executing tools on standard MCP servers via HTTP POST using the JSON-RPC 'tools/call' method.
    async def _execute_http_tool(endpoint: str, tool_name: str, parameters: Dict[str, Any]) -> str:
        """
        Execute a tool on a target MCP server using HTTP POST.

        Args:
            endpoint: The HTTP endpoint URL
            tool_name: Name of the tool to execute
            parameters: Parameters for the tool
        """
        # Construct JSON-RPC request
        json_rpc_request = {
            "jsonrpc": "2.0",
            "method": "tools/call",
            "params": {
                "name": tool_name,
                "arguments": parameters
            },
            "id": 1
        }

        # Send request to target server
        async with httpx.AsyncClient() as client:
            response = await client.post(
                endpoint,
                json=json_rpc_request,
                headers={"Content-Type": "application/json"}
            )

            # Return the response from the target server
            response_data = response.json()
            return json.dumps(response_data, ensure_ascii=False)
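
The exec_mcp_tool handler calls discovery_service.get_server_endpoint, but the discovery service itself is not reproduced on this page. As a hedged illustration of the minimal interface the handler assumes (all class, server, and URL names below are placeholders), a dict-backed lookup could look like this:

    # Hypothetical sketch only: the router's real discovery service is not
    # shown here. This illustrates the interface exec_mcp_tool relies on:
    # get_server_endpoint(name) returns an endpoint URL and raises for
    # unknown servers. All names and URLs are placeholders.
    from typing import Dict

    class StaticDiscoveryService:
        def __init__(self, endpoints: Dict[str, str]) -> None:
            # Registered server name -> endpoint URL.
            self._endpoints = endpoints

        def get_server_endpoint(self, server_name: str) -> str:
            try:
                return self._endpoints[server_name]
            except KeyError:
                raise ValueError(f"Unknown MCP server: {server_name}")

    discovery_service = StaticDiscoveryService({
        "weather": "http://localhost:9001/mcp",       # placeholder HTTP server
        "amap-sse": "https://example.com/amap/sse",   # placeholder SSE server
    })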
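
As a quick usage note for the ExecToolRequest model shown above, the snippet below validates a raw payload before dispatch; the payload values are illustrative placeholders, not servers or tools defined by this project.

    # Sketch: validating a raw request body with ExecToolRequest.
    # Placeholder values throughout.
    raw_payload = {
        "target_server_name": "weather",
        "target_tool_name": "get_forecast",
        "parameters": {"city": "Beijing"},
    }

    request = ExecToolRequest(**raw_payload)  # raises pydantic.ValidationError if fields are missing or mistyped
    print(request.target_server_name, request.target_tool_name, request.parameters)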
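
For comparison with the AsyncExitStack pattern in _execute_sse_tool, here is a hedged sketch of the same SSE flow written with nested async with blocks. It assumes the mcp Python SDK (ClientSession, sse_client) used above and is intended to behave the same way, with cleanup handled implicitly.

    # Hedged sketch: same SSE flow as _execute_sse_tool, but using nested
    # "async with" blocks instead of an explicit AsyncExitStack.
    import json
    from typing import Any, Dict

    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async def _execute_sse_tool_alt(endpoint: str, tool_name: str, parameters: Dict[str, Any]) -> str:
        async with sse_client(endpoint) as (read_stream, write_stream):
            async with ClientSession(read_stream, write_stream) as session:
                await session.initialize()
                result = await session.call_tool(tool_name, parameters)
                payload = result.__dict__ if hasattr(result, "__dict__") else result
                return json.dumps(payload, ensure_ascii=False, default=str)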

