code_review
Analyze code to detect bugs, security vulnerabilities, performance issues, and style problems. Optionally specify programming language and focus areas.
Instructions
Review code to find bugs, security vulnerabilities, performance issues, and code style problems.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| code | Yes | The code to review | |
| language | No | Programming language | `""` |
| focus | No | Areas to focus on: an array of `bug`, `security`, `performance`, `style`, `all` | `["all"]` |
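A minimal sketch of how a call's arguments map onto the handler's focus string. The `arguments` values here are hypothetical; the `focus` expansion mirrors the handler logic in the Implementation Reference below:

```python
# Hypothetical arguments for a code_review call; key names match the
# input schema above.
arguments = {
    "code": "def div(a, b):\n    return a / b  # no zero check",
    "language": "python",          # optional, defaults to ""
    "focus": ["bug", "security"],  # optional, defaults to ["all"]
}

# The handler joins the focus list for the prompt; ["all"] expands to a
# Chinese phrase meaning "all aspects (bugs, security, performance, style)".
focus = arguments.get("focus", ["all"])
focus_str = (
    "所有方面(Bug、安全、性能、代码风格)"
    if "all" in focus
    else "、".join(focus)
)
print(focus_str)  # → bug、security
```

Note that `focus` is joined with the Chinese enumeration comma `、`, so a multi-item focus list renders naturally inside the Chinese prompt template.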
Implementation Reference
- Handler for the code_review tool: builds a prompt instructing the LLM to review the code focusing on bug, security, performance, and/or style issues, then calls llm_call.
````python
elif name == "code_review":
    focus = a.get("focus", ["all"])
    # "all" expands to a phrase meaning "all aspects (bugs, security,
    # performance, style)"; otherwise the focus items are joined directly.
    focus_str = "所有方面(Bug、安全、性能、代码风格)" if "all" in focus else "、".join(focus)
    lang = a.get("language", "") or ""
    # Prompt (in Chinese): review the {lang} code below, focusing on {focus_str};
    # output an issue list (severity + description + suggested fix), an overall
    # 1-10 score, and improved code if necessary.
    prompt = (
        f"请对以下 {lang} 代码进行代码审查,重点关注:{focus_str}。\n"
        f"请按以下格式输出:\n"
        f"1. 问题列表(严重程度 + 描述 + 修复建议)\n"
        f"2. 整体评分(1-10)\n"
        f"3. 改进后的代码(如有必要)\n\n"
        f"```\n{a['code']}\n```"
    )
````
- Schema definition for the code_review tool: requires `code`; optional `language` and `focus` (an array whose items are one of `bug`, `security`, `performance`, `style`, `all`).
```python
types.Tool(
    name="code_review",
    # Description (Chinese): review code for bugs, security vulnerabilities,
    # performance issues, and code style problems.
    description="审查代码,找出 Bug、安全漏洞、性能问题和代码风格问题。",
    inputSchema={
        "type": "object",
        "properties": {
            "code": {"type": "string", "description": "要审查的代码"},  # the code to review
            "language": {"type": "string", "description": "编程语言", "default": ""},  # programming language
            "focus": {
                "type": "array",
                "items": {"type": "string", "enum": ["bug", "security", "performance", "style", "all"]},
                "default": ["all"],
            },
        },
        "required": ["code"],
    },
),
```
- src/onion_mcp_server/server.py:52-53 (registration): code_review (as part of CODE_TOOLS) is mapped to the handle_code handler function in the _HANDLERS routing table.
```python
for _t in CODE_TOOLS:
    _HANDLERS[_t.name] = handle_code
```
- Helper: the llm_call utility, invoked by handle_code to send the constructed prompt to the LLM and return the response.
```python
async def llm_call(
    prompt: str,
    system: Optional[str] = None,
    temperature: float = 0.7,
) -> str:
    """单轮调用 (single-turn call)."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return await llm_chat(messages, temperature=temperature)
```
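To illustrate how `llm_call` assembles the chat message list, here is a self-contained sketch with a stubbed `llm_chat`. The stub is purely hypothetical; the real `llm_chat` sends the messages to the configured LLM backend:

```python
import asyncio
from typing import Optional

# Hypothetical stub standing in for the real backend call, so the wrapper
# can be exercised without an LLM. It just reports what it received.
async def llm_chat(messages, temperature: float = 0.7) -> str:
    return f"{len(messages)} message(s) at temperature {temperature}"

async def llm_call(
    prompt: str,
    system: Optional[str] = None,
    temperature: float = 0.7,
) -> str:
    """Single-turn call: wrap the prompt (and optional system message)
    into a chat-style message list and forward it to llm_chat."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return await llm_chat(messages, temperature=temperature)

result = asyncio.run(llm_call("Review this code", system="You are a strict reviewer"))
print(result)  # → 2 message(s) at temperature 0.7
```

With no `system` argument, the list contains only the single user message, so every review prompt built by the handler reaches the model as a one-turn conversation.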