
invoke_agent

Execute complex queries requiring multi-tool reasoning or conversational responses by invoking the full agent with natural language prompts.

Instructions

Invoke the full strands-mcp-cli agent with a natural language prompt. Use this for complex queries that require reasoning across multiple tools or when you need a conversational response from the agent.

Input Schema

Name   | Required | Description                              | Default
prompt | Yes      | The prompt or query to send to the agent | —
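As a sketch of how a client satisfies this schema (the prompt text and request `id` are illustrative, not from the server's docs), a conforming MCP `tools/call` request payload looks like:

```python
import json

# Hypothetical example: the JSON-RPC payload an MCP client would send to
# call this tool. The only required argument per the inputSchema is 'prompt'.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "invoke_agent",
        "arguments": {
            "prompt": "Summarize today's open issues and suggest next steps."
        },
    },
}

payload = json.dumps(request)
print(payload)
```

Any additional properties beyond `prompt` would simply be ignored or rejected by a schema-validating server, since `prompt` is the sole declared (and required) property.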

Implementation Reference

  • Core handler logic for the 'invoke_agent' MCP tool. Creates a fresh Agent instance mirroring the parent agent's configuration (model, tools, system prompt, etc.) with empty message history, invokes it with the user prompt, and returns the stringified response as MCP TextContent.
    if name == "invoke_agent" and expose_agent:
        prompt = arguments.get("prompt")
        if not prompt:
            return [
                types.TextContent(
                    type="text",
                    text="❌ Error: 'prompt' parameter is required",
                )
            ]
        logger.debug(f"Invoking agent with prompt: {prompt[:100]}...")

        # Get the parent agent's configuration
        # Access tools directly from registry dictionary
        tools_for_invocation = [
            agent.tool_registry.registry[tool_name]
            for tool_name in agent_tools.keys()
            if tool_name in agent.tool_registry.registry
        ]

        # Prepare extra kwargs for observability and callbacks
        extra_kwargs = {}
        if hasattr(agent, "callback_handler") and agent.callback_handler:
            extra_kwargs["callback_handler"] = agent.callback_handler

        # Create fresh agent with same configuration but clean message history
        # Inherits: model, tools, trace_attributes, callback_handler
        fresh_agent = Agent(
            name=f"{agent.name}-invocation",
            model=agent.model,
            messages=[],  # Empty message history (clean state)
            tools=tools_for_invocation,
            system_prompt=agent.system_prompt
            if hasattr(agent, "system_prompt")
            else None,
            trace_attributes=agent.trace_attributes
            if hasattr(agent, "trace_attributes")
            else {},
            **extra_kwargs,
        )

        # Call the fresh agent
        result = fresh_agent(prompt)

        # Extract text response from agent result
        response_text = str(result)
        logger.debug(f"Agent invocation complete, response length: {len(response_text)}")
        return [types.TextContent(type="text", text=response_text)]
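The key design choice in this handler — spawning a fresh agent that mirrors the parent's configuration but starts with an empty message history — can be sketched with stand-in classes (`FakeAgent` below is hypothetical, not the real strands `Agent`):

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for the strands Agent, just to illustrate the
# clone-config-but-reset-history pattern used by the handler.
@dataclass
class FakeAgent:
    name: str
    system_prompt: str
    messages: list = field(default_factory=list)

    def __call__(self, prompt: str) -> str:
        # Each invocation appends to this agent's own history only.
        self.messages.append({"role": "user", "content": prompt})
        return f"[{self.name}] echo: {prompt}"

parent = FakeAgent(name="strands-mcp-cli", system_prompt="Be helpful.")
parent("earlier conversation turn")  # parent accumulates history over time

# Mirror the handler: same configuration, fresh (empty) message history.
fresh = FakeAgent(
    name=f"{parent.name}-invocation",
    system_prompt=parent.system_prompt,
    messages=[],
)
response = fresh("What tools do you have?")
print(len(parent.messages), len(fresh.messages))
```

Because each `invoke_agent` call gets a clean-state clone, concurrent or repeated tool calls cannot leak conversation context into the parent agent or into each other.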
  • Schema definition for the 'invoke_agent' tool, including name, description, and inputSchema requiring a 'prompt' string.
    agent_invoke_tool = types.Tool(
        name="invoke_agent",
        description=(
            f"Invoke the full {agent.name} agent with a natural language prompt. "
            "Use this for complex queries that require reasoning across multiple tools "
            "or when you need a conversational response from the agent."
        ),
        inputSchema={
            "type": "object",
            "properties": {
                "prompt": {
                    "type": "string",
                    "description": "The prompt or query to send to the agent",
                }
            },
            "required": ["prompt"],
        },
    )
  • Conditional registration of the 'invoke_agent' tool into the mcp_tools list (returned by list_tools handler) when expose_agent=True. Also captures expose_agent variable for use in call_tool handler closure.
    if expose_agent:
        agent_invoke_tool = types.Tool(
            name="invoke_agent",
            description=(
                f"Invoke the full {agent.name} agent with a natural language prompt. "
                "Use this for complex queries that require reasoning across multiple tools "
                "or when you need a conversational response from the agent."
            ),
            inputSchema={
                "type": "object",
                "properties": {
                    "prompt": {
                        "type": "string",
                        "description": "The prompt or query to send to the agent",
                    }
                },
                "required": ["prompt"],
            },
        )
        mcp_tools.append(agent_invoke_tool)
  • MCP list_tools handler registration, which exposes the mcp_tools list (including 'invoke_agent' if added) to MCP clients.
    @server.list_tools()
    async def list_tools() -> list[types.Tool]:
        """Return list of available MCP tools.

        This handler is called when MCP clients request the available tools.
        It returns the pre-built list of MCP Tool objects converted from
        Strands agent tools.
        """
        logger.debug(f"list_tools called, returning {len(mcp_tools)} tools")
        return mcp_tools


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cagataycali/strands-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.