# code_fix
Fix code errors by providing the code and optionally the error message for improved accuracy. Supports multiple languages.
## Instructions

Fix errors in the given code; the error message can optionally be provided to improve accuracy.
## Input Schema

| Name | Required | Description | Default |
|---|---|---|---|
| code | Yes | The code to fix. | |
| error_message | No | The error message, if available. | `""` |
| language | No | The code's programming language. | `""` |
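For illustration, a minimal sketch of a client call with hypothetical argument values, mirroring the prompt assembly shown in the Implementation Reference below:

```python
# Hypothetical arguments a client might send to the code_fix tool.
a = {
    "code": "print(1 +)",                            # required
    "error_message": "SyntaxError: invalid syntax",  # optional
    "language": "python",                            # optional
}

# Prompt assembly as performed by the server's handler:
err_part = (
    f"\n\n错误信息:\n```\n{a['error_message']}\n```"
    if a.get("error_message")
    else ""
)
lang = a.get("language", "") or ""
prompt = (
    f"请修复以下 {lang} 代码中的错误。\n"
    f"要求:输出修复后的完整代码,并在代码前简要说明修复了什么问题。\n\n"
    f"```\n{a['code']}\n```{err_part}"
)
print("SyntaxError" in prompt)  # → True
```

Both the code and the error message end up embedded in the prompt; omitting `error_message` simply drops the trailing error section.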
## Implementation Reference
- Handler logic for the `code_fix` tool: builds a prompt to fix code errors and delegates to `llm_call`.

```python
elif name == "code_fix":
    err_part = (
        f"\n\n错误信息:\n```\n{a['error_message']}\n```"
        if a.get("error_message")
        else ""
    )
    lang = a.get("language", "") or ""
    prompt = (
        f"请修复以下 {lang} 代码中的错误。\n"
        f"要求:输出修复后的完整代码,并在代码前简要说明修复了什么问题。\n\n"
        f"```\n{a['code']}\n```{err_part}"
    )
```

- Schema definition for `code_fix`: requires `code`; `error_message` and `language` are optional.

```python
types.Tool(
    name="code_fix",
    description="修复代码中的错误,可提供错误信息以提高准确性。",
    inputSchema={
        "type": "object",
        "properties": {
            "code": {"type": "string"},
            "error_message": {"type": "string", "default": ""},
            "language": {"type": "string", "default": ""},
        },
        "required": ["code"],
    },
),
```

- Registration (`src/onion_mcp_server/server.py:52-53`): maps `code_fix` (via `CODE_TOOLS`) to `handle_code` in the server's routing table.

```python
for _t in CODE_TOOLS:
    _HANDLERS[_t.name] = handle_code
```

- Helper function `llm_call`, used by `handle_code` to send the assembled prompt to the LLM.

```python
async def llm_call(
    prompt: str,
    system: Optional[str] = None,
    temperature: float = 0.7,
) -> str:
    """单轮调用 (single-turn call)"""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return await llm_chat(messages, temperature=temperature)
```
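The `llm_chat` helper that `llm_call` delegates to is not shown in this reference. As a minimal sketch, assuming `llm_chat` accepts a message list and returns the model's text, `llm_call`'s message assembly can be exercised with a stub in place of the real backend:

```python
import asyncio
from typing import Optional

# Stub standing in for the real llm_chat (not shown in this reference):
# it echoes the roles so we can inspect what llm_call assembled.
async def llm_chat(messages, temperature: float = 0.7) -> str:
    return " ".join(m["role"] for m in messages)

async def llm_call(
    prompt: str,
    system: Optional[str] = None,
    temperature: float = 0.7,
) -> str:
    """Single-turn call, as defined above."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return await llm_chat(messages, temperature=temperature)

result = asyncio.run(llm_call("fix this", system="You are a code fixer."))
print(result)  # → system user
```

With a `system` argument the message list holds a system message followed by the user prompt; without it, only the user message is sent.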