Glama

chat

Process user prompts through Grok AI to generate responses, manage conversation sessions, and customize interactions with system prompts and model selection.

Input Schema

Name           Required   Description   Default
prompt         Yes
session        No
model          No                       grok-4
system_prompt  No

Implementation Reference

  • Main handler for the 'chat' tool. It takes a prompt plus optional session, model, and system_prompt parameters. It loads any stored conversation history, creates a Grok chat instance, replays prior messages into the conversation, appends the new prompt, samples a response, saves the updated history when a session is given, and returns the response content.
    async def chat(
        prompt: str,
        session: Optional[str] = None,
        model: str = "grok-4",
        system_prompt: Optional[str] = None,
    ):
        history = load_history(session) if session else []
    
        client = Client(api_key=XAI_API_KEY)
        grok = client.chat.create(model=model)
        if system_prompt:
            grok.append(system(system_prompt))
    
        for message in history:
            if message["role"] == "user":
                grok.append(user(message["content"]))
            elif message["role"] == "assistant":
                grok.append(assistant(message["content"]))
    
        grok.append(user(prompt))
        response = grok.sample()
        client.close()
    
        if session:
            history.append({"role": "user", "content": prompt, "time": datetime.now().strftime("%d.%m.%Y %H:%M:%S")})
            history.append({"role": "assistant", "content": response.content, "time": datetime.now().strftime("%d.%m.%Y %H:%M:%S")})
            save_history(session, history)
    
        return response.content
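The heart of the handler is rebuilding the conversation from stored history before appending the new prompt. A standalone sketch of that reconstruction step, with plain dicts standing in for the SDK's `system`/`user`/`assistant` message objects (the helper name `build_messages` is illustrative, not part of the source):

```python
from typing import Optional

def build_messages(history: list, prompt: str,
                   system_prompt: Optional[str] = None) -> list:
    """Rebuild the message list the way the chat handler does:
    optional system prompt first, then stored turns, then the new prompt."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    for message in history:
        # Replay only user/assistant turns; extra keys like "time" are dropped.
        if message["role"] in ("user", "assistant"):
            messages.append({"role": message["role"],
                             "content": message["content"]})
    messages.append({"role": "user", "content": prompt})
    return messages

history = [
    {"role": "user", "content": "Hi", "time": "01.01.2025 12:00:00"},
    {"role": "assistant", "content": "Hello!", "time": "01.01.2025 12:00:01"},
]
msgs = build_messages(history, "How are you?", system_prompt="Be brief.")
# msgs starts with the system message and ends with the new user prompt.
```

Because history is replayed on every call, the token cost of a session grows with its length; long-running sessions would eventually need truncation or summarization.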
  • src/server.py:17-17 (registration)
    The 'chat' tool is registered with FastMCP's @mcp.tool() decorator, which exposes the async function as an MCP tool endpoint.
    @mcp.tool()
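`@mcp.tool()` is a decorator factory: calling it returns a decorator that records the function in the server's registry and derives its input schema from the signature. A toy registry showing the shape of that pattern (an illustration only, not FastMCP's actual implementation):

```python
class ToyMCP:
    """Minimal stand-in for FastMCP's tool registration (illustration only)."""

    def __init__(self):
        self.tools = {}

    def tool(self):
        # Decorator factory: @mcp.tool() registers the function by name.
        def register(fn):
            self.tools[fn.__name__] = fn
            return fn  # leave the function itself unchanged
        return register

mcp = ToyMCP()

@mcp.tool()
async def chat(prompt: str):
    return prompt

# "chat" is now discoverable by name in the registry.
```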
  • Helper function load_history() used by the chat tool to load previous conversation history from a JSON file for the given session.
    def load_history(session: str):
        path = Path("chats") / f"{session}.json"
        if path.exists():
            return json.loads(path.read_text())
        return []
  • Helper function save_history() used by the chat tool to persist conversation history to a JSON file for the given session.
    def save_history(session: str, history: list):
        Path("chats").mkdir(exist_ok=True)
        (Path("chats") / f"{session}.json").write_text(json.dumps(history, indent=2, ensure_ascii=False))
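Together the two helpers form a file-per-session JSON store under chats/. A round-trip sketch of the same logic, parameterized on a base directory so it can run against a temporary folder instead of the real chats/ directory:

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory

def load_history(base: Path, session: str) -> list:
    # Missing file means a fresh session with empty history.
    path = base / f"{session}.json"
    return json.loads(path.read_text()) if path.exists() else []

def save_history(base: Path, session: str, history: list) -> None:
    base.mkdir(exist_ok=True)
    (base / f"{session}.json").write_text(
        json.dumps(history, indent=2, ensure_ascii=False)
    )

with TemporaryDirectory() as tmp:
    base = Path(tmp) / "chats"
    assert load_history(base, "demo") == []  # no file yet -> empty history
    save_history(base, "demo", [{"role": "user", "content": "hi"}])
    roundtrip = load_history(base, "demo")
```

Note that `ensure_ascii=False` keeps non-ASCII message content readable in the stored JSON rather than escaping it to `\uXXXX` sequences.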
  • Input schema/parameters for the chat tool: prompt (required string), session (optional string), model (string with default 'grok-4'), and system_prompt (optional string). These define the tool's input validation and type definitions.
        prompt: str,
        session: Optional[str] = None,
        model: str = "grok-4",
        system_prompt: Optional[str] = None,
    ):
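The Required/Default columns in the table above follow directly from the function signature: parameters without defaults are required, the rest optional. A stdlib-only approximation of that derivation (FastMCP builds its schema via Pydantic; this sketch only mirrors the required/default logic):

```python
import inspect
from typing import Optional

async def chat(
    prompt: str,
    session: Optional[str] = None,
    model: str = "grok-4",
    system_prompt: Optional[str] = None,
):
    ...

def describe_params(fn):
    """Map each parameter to its required flag and default, read off the signature."""
    out = {}
    for name, p in inspect.signature(fn).parameters.items():
        required = p.default is inspect.Parameter.empty
        out[name] = {"required": required,
                     "default": None if required else p.default}
    return out

schema = describe_params(chat)
# prompt has no default, so it is required; model defaults to "grok-4".
```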


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.