# expert_model
Communicate with specialized experts to solve complex problems through collaborative workflows in multi-agent systems.
## Instructions

Use this tool to communicate with an expert.

Args:
- `name`: The name of the expert to communicate with. Required.
- `instructions`: The instructions to send to the expert. Required.
- `output`: The answer from the expert based on the instructions. Required.
- `iteration`: The number of experts you have consulted so far. Start with 1.
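As a sketch of what the arguments for a call to this tool might look like, the snippet below assembles them as a plain dictionary. The expert name and instruction text are hypothetical examples, not values taken from the server itself:

```python
# Hypothetical arguments for an expert_model call; the expert name and
# instruction text are illustrative only.
call_args = {
    "name": "security-reviewer",  # which expert persona to consult
    "instructions": "Review this auth flow for common OAuth pitfalls.",
    "output": "",                 # prior expert answer, if any
    "iteration": 1,               # this is the first expert consulted
}

assert set(call_args) == {"name", "instructions", "output", "iteration"}
```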
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | The name of the expert to communicate with. | |
| instructions | Yes | The instructions to send to the expert. | |
| output | Yes | The answer from the expert based on the instructions. | |
| iteration | Yes | The number of experts you have consulted so far. Start with 1. | |
## Implementation Reference
- `src/meta_prompt/server.py:118-139` (handler) — The core handler implementation for the `expert_model` MCP tool. This async function uses the provided instructions to sample a response from the model context (`ctx.sample`), simulating consultation with a named expert, and appends a next-action prompt.

```python
@mcp.tool()
async def expert_model(
    name: str, instructions: str, output: str, iteration: int, ctx: Context
) -> str:
    """
    Use this tool to communicate with an expert.

    Args:
        name: The name of the expert to communicate with. Required.
        instructions: The instructions to send to the expert. Required.
        output: The answer from the expert based on the instructions. Required.
        iteration: The number of experts you have consulted so far. Start with 1.
    """
    next_action = (
        "Based on the information given, what are the most logical next steps "
        "or conclusions? Please make sure that the solution is accurate, "
        "directly answers the original question, and follows all given "
        "constraints. Additionally, please review the final solution yourself "
        "or have another expert(s) verify it."
    )
    try:
        # Sample a completion from the connected client, simulating the expert.
        result = await ctx.sample(instructions)
        return f"{result}\n\n{next_action}"
    except Exception:
        # Client doesn't support sampling; fall back to the caller-supplied output.
        print("Client doesn't support sampling, using the output directly")
        return f"{output}\n\n{next_action}"
```
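The try/except flow in the handler can be exercised without a running MCP server. The sketch below is an assumption-laden illustration: the two stub context classes stand in for the real `Context`, and the shortened prompt text is not the server's actual string.

```python
import asyncio

# Hypothetical stand-ins for the real MCP Context, for illustration only.
class SamplingContext:
    async def sample(self, instructions: str) -> str:
        return f"Expert response to: {instructions}"

class NoSamplingContext:
    async def sample(self, instructions: str) -> str:
        raise RuntimeError("sampling not supported")

# Abbreviated next-action prompt (the real one is longer).
NEXT_ACTION = "Based on the information given, what are the most logical next steps?"

async def expert_model_sketch(name, instructions, output, iteration, ctx):
    # Mirrors the handler's flow: try to sample a response from the client;
    # if sampling is unsupported, fall back to the caller-supplied output.
    try:
        result = await ctx.sample(instructions)
        return f"{result}\n\n{NEXT_ACTION}"
    except Exception:
        return f"{output}\n\n{NEXT_ACTION}"

sampled = asyncio.run(
    expert_model_sketch("reviewer", "Check the proof.", "", 1, SamplingContext())
)
fallback = asyncio.run(
    expert_model_sketch("reviewer", "Check the proof.", "prior answer", 1, NoSamplingContext())
)
```

Here `sampled` carries the stub expert's reply and `fallback` carries the caller-supplied output, each followed by the next-action prompt.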