topic_based_summary

Generate concise, topic-focused summaries from provided content using a targeted query. Ideal for extracting key insights and relevant information efficiently.

Instructions

Topic-based summarization: given source material and a query topic, returns a summary of the most relevant content (within 2,000 characters).

Args:
    content: the source material
    query: the topic or question to query

Returns:
    A topic-focused summary of the relevant content

Input Schema

Name     Required  Description                       Default
content  Yes       The source material to summarize  —
query    Yes       The topic or question to query    —
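Per the MCP specification, a client invokes this tool with a JSON-RPC `tools/call` request. The sketch below shows the shape of such a request; the `content` and `query` values are made-up placeholders, not from the server.

```python
import json

# Hypothetical MCP "tools/call" request invoking topic_based_summary.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "topic_based_summary",
        "arguments": {
            "content": "Q3 revenue grew 12% year over year, driven by cloud services...",
            "query": "What drove revenue growth?",
        },
    },
}

print(json.dumps(request, indent=2))
```

Both `content` and `query` are required; there are no optional arguments or defaults.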

Implementation Reference

  • The primary handler for the 'topic_based_summary' tool, registered via @mcp.tool() decorator. It wraps the RAGProcessor's topic_summary method, handling context logging and errors.
    @mcp.tool()
    async def topic_based_summary(content: str, query: str, ctx: Context) -> str:
        """
        Topic-based summarization: given source material and a query topic,
        return a summary of the most relevant content (within 2,000 characters).

        Args:
            content: the source material
            query: the topic or question to query

        Returns:
            A topic-focused summary of the relevant content
        """
        try:
            # Context.info is a coroutine in the MCP Python SDK, so await it.
            await ctx.info(f"Starting topic summary, query: {query}")

            summary = await rag_processor.topic_summary(content, query)

            await ctx.info("Topic summary complete")
            return summary

        except Exception as e:
            logger.error(f"Topic summary failed: {e}")
            return f"Topic summary failed: {str(e)}"
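The handler reports failures by returning an error string rather than re-raising, so the MCP client always receives a text result. A minimal stdlib sketch of that pattern (the names `safe_tool` and `boom` are illustrative, not part of the server):

```python
import asyncio

async def safe_tool(fn, *args) -> str:
    """Run an async tool body; convert any exception into an error string."""
    try:
        return await fn(*args)
    except Exception as e:
        return f"Topic summary failed: {e}"

async def boom(_content: str, _query: str) -> str:
    # Simulate an upstream failure (e.g. the LLM call raising).
    raise ValueError("upstream model error")

result = asyncio.run(safe_tool(boom, "some content", "some query"))
print(result)  # → Topic summary failed: upstream model error
```

The trade-off is that callers cannot distinguish an error from a legitimate summary except by inspecting the returned text.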
  • Core helper function in RAGProcessor class that performs the topic-based summarization by crafting a specialized prompt for the LLM and generating the response.
        async def topic_summary(self, content: str, query: str) -> str:
            """
            Topic-based content summarization.

            Args:
                content: the source material
                query: the topic or question to query

            Returns:
                A summary of the relevant content (within 2,000 characters)
            """
            try:
                # Build a RAG-style prompt.
                prompt = f"""Based on the following source material, produce a summary focused on the query topic:

Query topic: {query}

Source material:
{content}

Extract the information most relevant to the query topic and summarize it in no more than 2,000 characters. Requirements:
1. Focus on content related to the query topic
2. Keep the information accurate and logically organized
3. If the material contains no relevant information, say so explicitly
4. Include concrete details and key points"""

                # Note: this is a synchronous OpenAI client call inside an async
                # method, so it blocks the event loop while the request runs.
                response = self.summarizer.client.chat.completions.create(
                    model=OPENAI_MODEL,
                    messages=[
                        {"role": "system", "content": "You are a professional information analyst, skilled at extracting topic-specific information from large bodies of material."},
                        {"role": "user", "content": prompt}
                    ],
                    max_tokens=min(MAX_OUTPUT_TOKENS, 120000),  # cap at roughly 120k tokens
                    temperature=0.1
                )

                result = response.choices[0].message.content.strip()

                return result

            except Exception as e:
                logger.error(f"Topic summary failed: {e}")
                return f"Topic summary failed: {str(e)}"
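Note that the 2,000-character limit is only requested in the prompt; the model may still exceed it. A caller could enforce the cap client-side with a small helper like the one below (hypothetical, not part of the server):

```python
def enforce_char_limit(summary: str, limit: int = 2000) -> str:
    """Hard-truncate a summary to `limit` characters, marking the cut."""
    if len(summary) <= limit:
        return summary
    # Reserve one character for the truncation marker.
    return summary[: limit - 1] + "…"

short = enforce_char_limit("abc")
capped = enforce_char_limit("x" * 2500)
```

A hard cut can split a sentence mid-word; truncating at the last sentence boundary before the limit would read better at the cost of a slightly shorter result.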