code_generate
Converts natural language descriptions into functional code. Supports multiple languages and styles for tailored output.
Instructions
Generates code from a natural-language description.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| description | Yes | Description of the functionality to implement | |
| language | No | Target programming language | Python |
| style | No | `simple` or `production` | simple |
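The required-field and enum constraints in this table can be expressed as a small hand-rolled validator. This is an illustrative sketch, not part of the server; a real MCP client would validate against the tool's JSON Schema directly.

```python
def check_args(args: dict) -> bool:
    """Minimal validation mirroring the table above (illustrative sketch)."""
    if not isinstance(args.get("description"), str):
        return False  # 'description' is required and must be a string
    if "language" in args and not isinstance(args["language"], str):
        return False
    if "style" in args and args["style"] not in ("simple", "production"):
        return False
    return True

# Example: a well-formed call overriding both optional parameters.
args = {
    "description": "Read a CSV file and print the row count",
    "language": "Go",
    "style": "production",
}
```

Here `check_args(args)` returns `True`; omitting `description` or passing an unknown `style` returns `False`.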
Implementation Reference
- `src/onion_mcp_server/server.py:52-53` (registration): registers the `code_generate` tool name to the `handle_code` handler via the `_HANDLERS` routing table.
  ```python
  for _t in CODE_TOOLS:
      _HANDLERS[_t.name] = handle_code
  ```
- Tool schema: defines the `code_generate` tool's name, description, and `inputSchema` with the `description`, `language`, and `style` parameters.
  ```python
  types.Tool(
      name="code_generate",
      description="根据自然语言描述生成代码。",
      inputSchema={
          "type": "object",
          "properties": {
              "description": {"type": "string", "description": "功能描述"},
              "language": {"type": "string", "default": "Python"},
              "style": {
                  "type": "string",
                  "enum": ["simple", "production"],
                  "default": "simple",
              },
          },
          "required": ["description"],
      },
  ),
  ```
- Handler logic: builds the prompt from the arguments (`description`, `language`, `style`) and calls `llm_call` to generate the code.
  ```python
  elif name == "code_generate":
      style_map = {
          "simple": "简洁易读",
          "production": "生产级别(含完整错误处理、日志、类型注解)",
      }
      style_str = style_map.get(a.get("style", "simple"), "简洁易读")
      lang = a.get("language", "Python")
      prompt = (
          f"请用 {lang} 编写以下功能的代码,风格要求:{style_str}。\n"
          f"只输出代码和必要的注释,不需要额外解释。\n\n"
          f"需求:{a['description']}"
      )
  ```
- `llm_call` helper: used by `handle_code` to send the prompt to the LLM and return the generated text.
  ```python
  async def llm_call(
      prompt: str,
      system: Optional[str] = None,
      temperature: float = 0.7,
  ) -> str:
      """单轮调用"""
      messages = []
      if system:
          messages.append({"role": "system", "content": system})
      messages.append({"role": "user", "content": prompt})
      return await llm_chat(messages, temperature=temperature)
  ```
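To make the data flow concrete, the handler's prompt construction and `llm_call`'s message assembly can be re-stated as two pure helpers. `build_prompt` and `build_messages` are hypothetical names introduced here for illustration; in the real code this logic lives inline in the handler and in `llm_call`.

```python
from typing import Optional


def build_prompt(a: dict) -> str:
    # Mirrors the handler's style_map lookup and f-string prompt assembly.
    style_map = {
        "simple": "简洁易读",
        "production": "生产级别(含完整错误处理、日志、类型注解)",
    }
    style_str = style_map.get(a.get("style", "simple"), "简洁易读")
    lang = a.get("language", "Python")
    return (
        f"请用 {lang} 编写以下功能的代码,风格要求:{style_str}。\n"
        f"只输出代码和必要的注释,不需要额外解释。\n\n"
        f"需求:{a['description']}"
    )


def build_messages(prompt: str, system: Optional[str] = None) -> list:
    # Same message-list construction as llm_call, minus the LLM round trip.
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return messages
```

An unknown `style` falls back to the "simple" wording via `style_map.get`, and with no `system` argument the message list contains a single user message.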