# ai_summarize
Summarize long texts into bullet points, a paragraph, or a single sentence. Choose the output mode and number of points.
## Instructions
Summarizes long text. Three modes are supported: a bullet-point list, a paragraph summary, or a one-sentence summary.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| text | Yes | The text to summarize | |
| mode | No | Summary mode: `bullets` (bullet-point list) / `paragraph` (paragraph) / `one_line` (single sentence) | bullets |
| max_points | No | Number of bullet points (only used in `bullets` mode) | 5 |
| language | No | Output language | same as the source text |
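To make the defaults in the table concrete, here is a minimal client-side sketch of how an argument payload resolves under this schema. The `validate_args` helper below is hypothetical, written only for illustration; it is not part of the server code.

```python
def validate_args(args: dict) -> dict:
    """Hypothetical helper mirroring the ai_summarize input schema."""
    if "text" not in args:
        raise ValueError("'text' is required")
    mode = args.get("mode", "bullets")
    if mode not in ("bullets", "paragraph", "one_line"):
        raise ValueError(f"unknown mode: {mode}")
    # Apply the schema defaults for any omitted optional fields.
    return {
        "text": args["text"],
        "mode": mode,
        "max_points": int(args.get("max_points", 5)),
        "language": args.get("language", ""),
    }

# Supplying only the required 'text' field lets the defaults fill the rest.
resolved = validate_args({"text": "a long article..."})
```

Here an empty `language` string stands for "same as the source text", matching the schema's `"default": ""`.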
## Implementation Reference
- Tool schema registration for `ai_summarize`: the `inputSchema` defines a required `text` field and optional `mode` (`bullets`/`paragraph`/`one_line`), `max_points` (integer), and `language` fields.

```python
types.Tool(
    name="ai_summarize",
    description="对长文本进行摘要,支持要点列表、段落摘要、一句话摘要三种模式。",
    inputSchema={
        "type": "object",
        "properties": {
            "text": {
                "type": "string",
                "description": "要摘要的文本",
            },
            "mode": {
                "type": "string",
                "description": "摘要模式: bullets(要点列表)/ paragraph(段落)/ one_line(一句话)",
                "enum": ["bullets", "paragraph", "one_line"],
                "default": "bullets",
            },
            "max_points": {
                "type": "integer",
                "description": "要点数量(bullets 模式有效,默认 5)",
                "default": 5,
            },
            "language": {
                "type": "string",
                "description": "输出语言(默认与原文相同)",
                "default": "",
            },
        },
        "required": ["text"],
    },
),
```

- `src/onion_mcp_server/tools/ai.py:234-251` (handler): the handler function (`handle_ai`) that executes the `ai_summarize` logic. It reads the `mode`/`max_points`/`language` params, constructs the LLM prompt with the appropriate instruction, calls `llm_call`, and returns a `TextContent` result.
```python
elif name == "ai_summarize":
    mode = a.get("mode", "bullets")
    max_points = int(a.get("max_points", 5))
    lang_str = f",用{a['language']}输出" if a.get("language") else ""
    if mode == "bullets":
        instruction = f"提炼出最重要的 {max_points} 个要点,用 Markdown 列表格式输出{lang_str}"
    elif mode == "paragraph":
        instruction = f"写成一段连贯的摘要段落{lang_str}"
    else:  # one_line
        instruction = f"用一句话概括核心内容{lang_str}"
    prompt = (
        f"请对以下文本进行摘要,{instruction}。\n\n"
        f"---\n{a['text']}\n---"
    )
    reply = await llm_call(prompt)
    return [types.TextContent(type="text", text=reply)]
```

- `src/onion_mcp_server/server.py:48-51` (registration): server registration. `ai_summarize` (as part of `AI_TOOLS`) is mapped to the `handle_ai` handler via the `_HANDLERS` routing table.
```python
# ── 路由表 ────────────────────────────────────────────────────
_HANDLERS: dict = {}
for _t in AI_TOOLS:
    _HANDLERS[_t.name] = handle_ai
```

- The `llm_call` helper function called by the `ai_summarize` handler to make the actual LLM API request.
```python
async def llm_call(
    prompt: str,
    system: Optional[str] = None,
    temperature: float = 0.7,
) -> str:
    """单轮调用"""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return await llm_chat(messages, temperature=temperature)
```
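The mode dispatch inside the handler can be sketched as a standalone function for testing the instruction text in isolation. This is a simplified extraction of the logic shown above (the function name `build_instruction` is introduced here for illustration; it does not exist in the server code):

```python
def build_instruction(mode: str = "bullets", max_points: int = 5,
                      language: str = "") -> str:
    """Sketch of the handler's instruction selection, not the server code."""
    # Optional output-language suffix, mirroring lang_str in the handler.
    lang_str = f",用{language}输出" if language else ""
    if mode == "bullets":
        return f"提炼出最重要的 {max_points} 个要点,用 Markdown 列表格式输出{lang_str}"
    elif mode == "paragraph":
        return f"写成一段连贯的摘要段落{lang_str}"
    # Any other value falls through to one_line, as in the handler's else branch.
    return f"用一句话概括核心内容{lang_str}"
```

Note that, like the handler, this treats any unrecognized `mode` value as `one_line`; in practice the schema's `enum` constraint should reject such values before the handler runs.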