
Aurai Advisor (Senior Advisor MCP)

by LZMW

get_status

Retrieve current conversation status including iteration count, configuration details, and AI provider information for programming problem-solving sessions.

Instructions

Get current status

Returns the current conversation state, iteration count, configuration, and related information.


Returns: conversation_history_count (number of conversation-history entries), max_iterations (maximum number of iterations), max_history (maximum number of history entries), provider (AI provider), model (model name)
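A call to get_status takes no arguments and returns a flat JSON object with the five fields above. The values below are illustrative examples based on the documented defaults, not live server output:

```python
# Illustrative get_status response; actual values depend on the running server.
sample_status = {
    "conversation_history_count": 3,  # entries currently held in history
    "max_iterations": 10,             # AuraiConfig.max_iterations (default 10)
    "max_history": 50,                # ServerConfig.max_history (default 50)
    "provider": "custom",             # fixed OpenAI-compatible provider
    "model": "gpt-4o",                # AURAI_MODEL, defaulting to gpt-4o
}
```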

Input Schema


No arguments

Implementation Reference

  • The get_status tool handler function - an async function decorated with @mcp.tool() that returns current conversation status including history count, max iterations, max history, provider, and model information.
    @mcp.tool()
    async def get_status() -> dict[str, Any]:
        """
        获取当前状态
    
        返回当前对话状态、迭代次数、配置信息等。
    
        ---
        **返回内容**:conversation_history_count(对话历史数量)、max_iterations(最大迭代次数)、max_history(最大历史条数)、provider(AI提供商)、model(模型名称)
        """
        return {
            "conversation_history_count": len(_conversation_history),
            "max_iterations": get_aurai_config().max_iterations,
            "max_history": server_config.max_history,
            "provider": get_aurai_config().provider,
            "model": get_aurai_config().model,
        }
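The handler is a plain coroutine, so it can be exercised directly once its module-level dependencies are in place. A minimal sketch, where `_conversation_history`, `get_aurai_config`, and `server_config` are hypothetical stand-ins for the server's real module-level state:

```python
import asyncio
from typing import Any

# Hypothetical stand-ins for the server's module-level state (illustration only).
_conversation_history: list[dict] = [{"role": "user", "content": "hi"}]

class _Cfg:
    max_iterations = 10
    provider = "custom"
    model = "gpt-4o"

def get_aurai_config() -> _Cfg:
    return _Cfg()

class _ServerCfg:
    max_history = 50

server_config = _ServerCfg()

async def get_status() -> dict[str, Any]:
    # Mirrors the handler body above, minus the @mcp.tool() decorator.
    return {
        "conversation_history_count": len(_conversation_history),
        "max_iterations": get_aurai_config().max_iterations,
        "max_history": server_config.max_history,
        "provider": get_aurai_config().provider,
        "model": get_aurai_config().model,
    }

result = asyncio.run(get_status())
```

Because the handler only reads state and performs no I/O, calling it this way in a unit test is cheap and deterministic.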
  • The @mcp.tool() decorator that registers the get_status function as an MCP tool with the FastMCP framework.
    @mcp.tool()
  • AuraiConfig Pydantic model that defines the schema for AI provider configuration (max_iterations, provider, model) returned by get_status.
    class AuraiConfig(BaseModel):
        """Senior AI configuration."""

        # API provider (fixed to "custom"; uses an OpenAI-compatible API)
        provider: Literal["custom"] = Field(
            default="custom",
            description="AI service provider (fixed to a custom OpenAI-compatible API)"
        )

        # API key
        api_key: str = Field(
            default_factory=lambda: os.getenv("AURAI_API_KEY", ""),
            description="API key"
        )

        # API base URL (optional; for proxies or custom endpoints)
        base_url: str | None = Field(
            default_factory=lambda: os.getenv("AURAI_BASE_URL"),
            description="API base URL"
        )

        # Model name
        model: str = Field(
            default_factory=lambda: os.getenv("AURAI_MODEL", "gpt-4o"),
            description="Model name"
        )

        # Context window size (tokens); default based on GLM-4.7, overridable via environment variable
        context_window: int = Field(
            default_factory=lambda: int(os.getenv("AURAI_CONTEXT_WINDOW", str(DEFAULT_CONTEXT_WINDOW))),
            ge=1,
            description="Model context window size (default: 200,000, based on GLM-4.7)"
        )

        # Maximum tokens per message; default based on GLM-4.7, overridable via environment variable
        max_message_tokens: int = Field(
            default_factory=lambda: int(os.getenv("AURAI_MAX_MESSAGE_TOKENS", str(DEFAULT_MAX_MESSAGE_TOKENS))),
            ge=1,
            description="Maximum tokens per message (default: 150,000, tuned for GLM-4.7)"
        )

        # Maximum number of iterations
        max_iterations: int = Field(
            default=10,
            description="Maximum number of iterations"
        )

        # Temperature
        temperature: float = Field(
            default=0.7,
            ge=0.0,
            le=2.0,
            description="Sampling temperature"
        )

        # Maximum generated tokens; default based on GLM-4.7, overridable via environment variable
        max_tokens: int = Field(
            default_factory=lambda: int(os.getenv("AURAI_MAX_TOKENS", str(DEFAULT_MAX_TOKENS))),
            ge=1,
            description="Maximum generated tokens (default: 32,000, tuned for GLM-4.7)"
        )

        @field_validator('api_key')
        @classmethod
        def validate_api_key(cls, v: str) -> str:
            """Validate the API key format."""
            if not v or not v.strip():
                raise ValueError("API key must not be empty")

            v = v.strip()

            # Basic length check (most API keys are at least 20 characters)
            if len(v) < 10:
                raise ValueError("API key must be at least 10 characters long")

            # Basic format check (no whitespace or control characters)
            if re.search(r'[\s\n\r\t]', v):
                raise ValueError("API key must not contain whitespace or control characters")

            return v
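The api_key validator's rules can be exercised in isolation. A minimal stdlib-only sketch of the same checks (the real model attaches them to a Pydantic field via @field_validator):

```python
import re

def validate_api_key(v: str) -> str:
    """Mirror of AuraiConfig.validate_api_key: non-empty, stripped,
    at least 10 characters, no whitespace or control characters."""
    if not v or not v.strip():
        raise ValueError("API key must not be empty")
    v = v.strip()
    if len(v) < 10:
        raise ValueError("API key must be at least 10 characters long")
    if re.search(r"[\s\n\r\t]", v):
        raise ValueError("API key must not contain whitespace or control characters")
    return v

# Leading/trailing whitespace is stripped before the checks run.
cleaned = validate_api_key("  sk-test-1234567890  ")
```

Note that the strip happens before the whitespace check, so keys padded with spaces are accepted after trimming, while keys with interior whitespace are rejected.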
  • ServerConfig Pydantic model that defines the schema for server configuration (max_history) returned by get_status.
    class ServerConfig(BaseModel):
        """Server configuration."""

        # Server name
        name: str = "Aurai Advisor"

        # Log level
        log_level: Literal["DEBUG", "INFO", "WARNING", "ERROR"] = "INFO"

        # Maximum number of conversation-history entries to keep
        max_history: int = Field(
            default_factory=lambda: int(os.getenv("AURAI_MAX_HISTORY", "50")),
            ge=1,
            le=200,
            description="Maximum number of conversation-history entries to keep"
        )

        # Enable conversation-history persistence
        enable_persistence: bool = Field(
            default_factory=lambda: os.getenv("AURAI_ENABLE_PERSISTENCE", "true").lower() == "true",
            description="Whether to persist conversation history to a file"
        )

        # Conversation-history file path (fixed under the user's home directory)
        history_path: str = Field(
            default_factory=lambda: os.getenv(
                "AURAI_HISTORY_PATH",
                str(Path.home() / ".mcp-aurai" / "history.json")
            ),
            description="Conversation-history file path"
        )
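The max_history value that get_status reports is resolved from the AURAI_MAX_HISTORY environment variable with the bounds ge=1/le=200. A sketch of that resolution with the Pydantic constraint re-expressed as an explicit range check (the dict argument stands in for os.environ):

```python
def resolve_max_history(env: dict[str, str]) -> int:
    """Resolve max_history from AURAI_MAX_HISTORY, defaulting to 50,
    with the model's ge=1 / le=200 bounds as an explicit check."""
    value = int(env.get("AURAI_MAX_HISTORY", "50"))
    if not (1 <= value <= 200):
        raise ValueError("max_history must be between 1 and 200")
    return value
```

In the real model, an out-of-range value raises a Pydantic ValidationError at construction time rather than a plain ValueError.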

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/LZMW/mcp-aurai-server'
