Grok MCP

chat

Get AI assistance for general questions, creative writing, coding help, and text tasks using Grok models with optional conversation history and customizable parameters.

Instructions

Basic chat completion with Grok models - your standard conversational AI tool. Use it for general questions, creative writing, coding help, or any text task. You can optionally keep conversation history to maintain context across multiple exchanges. For reasoning models, use the reasoning_effort parameter; for other models, you have more control with penalties and stop sequences.

Args:
    prompt: What you want to ask or have the AI do
    model: Which Grok model to use (default is grok-4-fast)
    system_prompt: Instructions for how the AI should behave (only used at start)
    use_conversation_history: Keep context between messages (default False)
    temperature: Creativity level 0-2 (higher = more creative)
    max_tokens: Maximum length of response
    top_p: Alternative to temperature for controlling randomness
    presence_penalty: Penalize talking about the same topics (-2.0 to 2.0)
    frequency_penalty: Penalize repeating the same words (-2.0 to 2.0)
    stop: List of sequences where the AI should stop generating
    reasoning_effort: "low" or "high" for reasoning models only (grok-3-mini)

Returns the AI's response as a string.
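For example, an MCP client can invoke this tool with a JSON arguments object matching the parameters above. Below is a minimal Python sketch using the MCP Python SDK over stdio; the launch command ("python", "grok_mcp_server.py") is illustrative and depends on how you run Grok-MCP locally.

# Minimal sketch: calling the "chat" tool from an MCP client over stdio.
# The launch command below is hypothetical; point it at your actual Grok-MCP entry point.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(command="python", args=["grok_mcp_server.py"])  # hypothetical entry point
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Only "prompt" is required; the other arguments mirror the parameters above.
            result = await session.call_tool(
                "chat",
                arguments={
                    "prompt": "Summarize the Model Context Protocol in two sentences.",
                    "model": "grok-4-fast",
                    "temperature": 0.7,
                    "max_tokens": 200,
                },
            )
            print(result.content)


asyncio.run(main())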

Input Schema

Name | Required | Description | Default
frequency_penalty | No | Penalize repeating the same words (-2.0 to 2.0) |
max_tokens | No | Maximum length of response |
model | No | Which Grok model to use | grok-4-fast
presence_penalty | No | Penalize talking about the same topics (-2.0 to 2.0) |
prompt | Yes | What you want to ask or have the AI do |
reasoning_effort | No | "low" or "high" for reasoning models only |
stop | No | List of sequences where the AI should stop generating |
system_prompt | No | Instructions for how the AI should behave (only used at start) |
temperature | No | Creativity level 0-2 (higher = more creative) |
top_p | No | Alternative to temperature for controlling randomness |
use_conversation_history | No | Keep context between messages | false

Input Schema (JSON Schema)

{ "properties": { "frequency_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Frequency Penalty" }, "max_tokens": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Tokens" }, "model": { "default": "grok-4-fast", "title": "Model", "type": "string" }, "presence_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Presence Penalty" }, "prompt": { "title": "Prompt", "type": "string" }, "reasoning_effort": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Reasoning Effort" }, "stop": { "anyOf": [ { "items": { "type": "string" }, "type": "array" }, { "type": "null" } ], "default": null, "title": "Stop" }, "system_prompt": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "System Prompt" }, "temperature": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Temperature" }, "top_p": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Top P" }, "use_conversation_history": { "default": false, "title": "Use Conversation History", "type": "boolean" } }, "required": [ "prompt" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP'
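The same endpoint can also be queried from Python; the response is JSON, though this snippet makes no assumptions about its exact fields.

# Fetch this server's directory entry from the Glama MCP API.
# Requires the third-party requests package.
import requests

resp = requests.get("https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP")
resp.raise_for_status()
print(resp.json())  # server metadata as returned by the directory API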

If you have feedback or need assistance with the MCP directory API, please join our Discord server.