
stateful_chat

Maintains conversation context across multiple interactions with Grok AI models to enable coherent, continuous dialogue and follow-up responses.

Input Schema

Name           Required  Description                                        Default
prompt         Yes       The user message to send to the model.             —
model          No        Grok model to use for the chat.                    grok-4-1-fast-reasoning
response_id    No        ID of a prior response to continue from.           —
system_prompt  No        System prompt, applied on the first turn only.     —
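For illustration, a first call and a stateful follow-up might pass arguments like the following (the `response_id` value is a hypothetical placeholder; in practice you pass back the id returned by the previous call):

```json
{
  "prompt": "What is the capital of France?",
  "model": "grok-4-1-fast-reasoning",
  "system_prompt": "You are a concise geography tutor."
}
```

```json
{
  "prompt": "And roughly how large is its population?",
  "response_id": "resp_abc123"
}
```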

Implementation Reference

  • The implementation of the `stateful_chat` MCP tool, which initializes an xAI client, configures chat parameters, and samples a response. Imports, the API-key lookup, and the return value are shown for completeness.
    import os
    from typing import Optional

    from xai_sdk import Client
    from xai_sdk.chat import system, user

    # API key is read from the environment
    XAI_API_KEY = os.environ["XAI_API_KEY"]

    @mcp.tool()
    async def stateful_chat(
        prompt: str,
        model: str = "grok-4-1-fast-reasoning",
        response_id: Optional[str] = None,
        system_prompt: Optional[str] = None
    ):
        client = Client(api_key=XAI_API_KEY)

        # store_messages asks the xAI API to persist the conversation server-side
        chat_params = {"model": model, "store_messages": True}
        if response_id:
            # Follow-up turn: continue from the stored previous response
            chat_params["previous_response_id"] = response_id

        chat = client.chat.create(**chat_params)
        # The system prompt only applies on the first turn of a conversation
        if system_prompt and not response_id:
            chat.append(system(system_prompt))
        chat.append(user(prompt))

        response = chat.sample()
        client.close()
        # Return the reply along with the id the caller can pass back
        # as response_id on the next turn to keep the conversation stateful.
        return {"response": response.content, "response_id": response.id}

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.