
chat

Process user prompts through Grok's AI models to generate responses, supporting custom system prompts and message storage options.

Input Schema

Name            Required    Default
prompt          Yes         —
model           No          grok-4
system_prompt   No          —
store_messages  No          —
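
For illustration, a minimal sketch of an arguments payload matching this schema. Only `prompt` is required; the field names come from the table above, while the values shown here are hypothetical:

```python
import json

# Example arguments for the 'chat' tool, matching the input schema above.
# 'model' and 'store_messages' are shown at their documented defaults;
# omitting them has the same effect.
args = {
    "prompt": "Summarize the latest commit in one sentence.",
    "model": "grok-4",
    "system_prompt": "You are a concise assistant.",
    "store_messages": False,
}

print(json.dumps(args, indent=2))
```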

Implementation Reference

  • The 'chat' tool handler: an async function decorated with @mcp.tool() that opens a chat session with the xAI API, optionally prepends a system prompt, and returns the model's response content.
    @mcp.tool()
    async def chat(
        prompt: str,
        model: str = "grok-4",
        system_prompt: Optional[str] = None,
        store_messages: bool = False
    ):
        client = Client(api_key=XAI_API_KEY)
        chat = client.chat.create(model=model, store_messages=store_messages)
        if system_prompt:
            chat.append(system(system_prompt))
        chat.append(user(prompt))
        response = chat.sample()
        client.close()
        return response.content
  • src/server.py:153-153 (registration)
    Tool registration using the @mcp.tool() decorator, which automatically registers the 'chat' function as an available MCP tool.
    @mcp.tool()
  • Essential imports for the chat functionality, including the xAI Client and the user/system message builders from the xai_sdk.chat module.
    from pathlib import Path
    from typing import List, Optional
    from datetime import datetime
    from mcp.server.fastmcp import FastMCP
    from xai_sdk import Client
    from xai_sdk.chat import user, system, image, file
    from xai_sdk.tools import web_search as xai_web_search, x_search as xai_x_search, code_execution
    from .utils import encode_image_to_base64, encode_video_to_base64, build_params, XAI_API_KEY
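
The handler's control flow (create session, optionally append a system message, append the user message, sample a response) can be sketched with a stand-in client in place of the real xai_sdk Client. The FakeChat class below is a hypothetical stub for illustration, not the SDK's API:

```python
from typing import Optional

class FakeChat:
    """Stand-in for an xai_sdk chat session: collects messages, echoes a reply."""
    def __init__(self, model: str):
        self.model = model
        self.messages = []

    def append(self, message: dict) -> None:
        self.messages.append(message)

    def sample(self):
        # The real session would call the xAI API here; this stub
        # just echoes the most recently appended message.
        last = self.messages[-1]["content"]
        return type("Response", (), {"content": f"[{self.model}] echo: {last}"})()

def chat(prompt: str, model: str = "grok-4",
         system_prompt: Optional[str] = None) -> str:
    # Mirrors the handler's flow: session -> optional system -> user -> sample.
    session = FakeChat(model)
    if system_prompt:
        session.append({"role": "system", "content": system_prompt})
    session.append({"role": "user", "content": prompt})
    return session.sample().content

print(chat("hello", system_prompt="Be brief."))
```

The real handler additionally closes the client after sampling; that cleanup is omitted here since the stub holds no connection.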
