
chat

Process prompts through the Grok API to generate responses, manage sessions, and configure models for AI-powered conversations.

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| prompt | Yes | The user prompt to send to the model | — |
| session | No | Session name; when set, chat history is loaded before the call and persisted after it | — |
| model | No | Grok model to use for the completion | grok-4-1-fast-reasoning |
| system_prompt | No | System prompt prepended to the conversation | — |
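For reference, a tool call might pass arguments shaped like the following (the session name and prompt text are illustrative; only `prompt` is required):

```python
# Example arguments for the 'chat' tool. Only 'prompt' is required;
# the values below are illustrative, and the defaults match the schema.
chat_arguments = {
    "prompt": "Summarize the latest xAI model lineup.",
    "session": "demo-session",                 # optional: enables history persistence
    "model": "grok-4-1-fast-reasoning",        # optional: default shown
    "system_prompt": "You are a concise assistant.",  # optional
}
```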

Implementation Reference

  • The `chat` tool implementation. It loads any prior chat history, replays it through the xAI client so the model has full context, samples a response, and persists the updated history when a session is provided.
    # Imports needed by this tool. The FastMCP instance `mcp` and the
    # load_history/save_history helpers are defined elsewhere in the server.
    import os
    from datetime import datetime
    from typing import Optional
    
    from xai_sdk import Client
    from xai_sdk.chat import assistant, system, user
    
    XAI_API_KEY = os.environ["XAI_API_KEY"]  # assumed: key read from the environment
    
    @mcp.tool()
    async def chat(
        prompt: str,
        session: Optional[str] = None,
        model: str = "grok-4-1-fast-reasoning",
        system_prompt: Optional[str] = None,
    ) -> str:
        # Load prior messages for this session, if any.
        history = load_history(session) if session else []
    
        client = Client(api_key=XAI_API_KEY)
        grok = client.chat.create(model=model)
        if system_prompt:
            grok.append(system(system_prompt))
    
        # Replay the saved conversation so the model has full context.
        for message in history:
            if message["role"] == "user":
                grok.append(user(message["content"]))
            elif message["role"] == "assistant":
                grok.append(assistant(message["content"]))
    
        # Send the new prompt and sample a response.
        grok.append(user(prompt))
        response = grok.sample()
        client.close()
    
        # Persist the updated history when a session name was provided.
        if session:
            timestamp = datetime.now().strftime("%d.%m.%Y %H:%M:%S")
            history.append({"role": "user", "content": prompt, "time": timestamp})
            history.append({"role": "assistant", "content": response.content, "time": timestamp})
            save_history(session, history)
    
        return response.content
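The `load_history` and `save_history` helpers referenced above are not shown. A minimal sketch, assuming history is stored as one JSON file per session (the `chat_history` directory name is an assumption, not part of the server's documented layout):

```python
import json
from pathlib import Path

HISTORY_DIR = Path("chat_history")  # assumed storage location, one JSON file per session

def load_history(session: str) -> list:
    """Return the saved message list for a session, or [] if none exists."""
    path = HISTORY_DIR / f"{session}.json"
    if path.exists():
        return json.loads(path.read_text(encoding="utf-8"))
    return []

def save_history(session: str, history: list) -> None:
    """Write the session's message list to disk as pretty-printed JSON."""
    HISTORY_DIR.mkdir(parents=True, exist_ok=True)
    path = HISTORY_DIR / f"{session}.json"
    path.write_text(json.dumps(history, ensure_ascii=False, indent=2), encoding="utf-8")
```

Storing each session as a separate file keeps lookups simple and avoids any locking between sessions; any other persistence layer with the same load/save signatures would work.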

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP'
