chat
Process prompts through the Grok API to generate responses, manage sessions, and configure models for AI-powered conversations.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | The user message to send to the model. | |
| session | No | Session name; when set, prior chat history is loaded before the call and the updated history is persisted after it. | |
| model | No | Grok model to use for the completion. | grok-4-1-fast-reasoning |
| system_prompt | No | Optional system prompt prepended to the conversation. | |
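Under the schema above, a tool call might pass arguments like the following. The prompt text, session name, and system prompt are illustrative values, not taken from the source:

```python
# Example arguments for the 'chat' tool, matching the input schema above.
# Values here are illustrative; only 'prompt' is required.
arguments = {
    "prompt": "Summarize the last release notes.",
    "session": "release-review",          # optional; enables history persistence
    "model": "grok-4-1-fast-reasoning",   # optional; this is the default
    "system_prompt": "You are a concise assistant.",  # optional
}
```

Omitting `session` makes the call stateless: no history is loaded or saved.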
Implementation Reference
- src/server.py:19-48 (handler) — the `chat` tool implementation. It loads any existing chat history, builds the conversation for the xAI client, samples a response from the selected model, and persists the updated history when a session is provided.
```python
@mcp.tool()
async def chat(
    prompt: str,
    session: Optional[str] = None,
    model: str = "grok-4-1-fast-reasoning",
    system_prompt: Optional[str] = None,
):
    # Load prior messages only when a session is given.
    history = load_history(session) if session else []
    client = Client(api_key=XAI_API_KEY)
    grok = client.chat.create(model=model)
    if system_prompt:
        grok.append(system(system_prompt))
    # Replay stored history into the conversation.
    for message in history:
        if message["role"] == "user":
            grok.append(user(message["content"]))
        elif message["role"] == "assistant":
            grok.append(assistant(message["content"]))
    grok.append(user(prompt))
    response = grok.sample()
    client.close()
    # Persist the new exchange, with timestamps, for session-based calls.
    if session:
        history.append({"role": "user", "content": prompt,
                        "time": datetime.now().strftime("%d.%m.%Y %H:%M:%S")})
        history.append({"role": "assistant", "content": response.content,
                        "time": datetime.now().strftime("%d.%m.%Y %H:%M:%S")})
        save_history(session, history)
    return response.content
```
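The handler relies on `load_history` and `save_history`, which are not shown in this section. A minimal sketch of what they might look like, assuming each session is stored as a JSON file under a `sessions/` directory (the directory name and file layout are assumptions, not taken from the source):

```python
import json
from pathlib import Path

# Hypothetical persistence helpers for the handler above.
# Assumption: one JSON file per session under a local 'sessions/' directory.
SESSIONS_DIR = Path("sessions")

def load_history(session: str) -> list[dict]:
    """Return the stored message list for a session, or [] if none exists."""
    path = SESSIONS_DIR / f"{session}.json"
    if path.exists():
        return json.loads(path.read_text(encoding="utf-8"))
    return []

def save_history(session: str, history: list[dict]) -> None:
    """Write the message list back to the session's JSON file."""
    SESSIONS_DIR.mkdir(parents=True, exist_ok=True)
    path = SESSIONS_DIR / f"{session}.json"
    path.write_text(json.dumps(history, ensure_ascii=False, indent=2),
                    encoding="utf-8")
```

Any store with the same load/save contract (a database, a key-value cache) would slot in without changing the handler.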