
Grok MCP

chat

Processes a user prompt through the Grok MCP server and returns the model's response. Supports choosing the model, tuning sampling parameters (temperature, top_p, penalties, max_tokens, stop sequences), setting a system prompt, and optionally including prior conversation history.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| frequency_penalty | No | Penalizes tokens in proportion to how often they have already appeared, reducing repetition. | null |
| max_tokens | No | Maximum number of tokens to generate in the response. | null |
| model | No | Grok model to use for the completion. | grok-4-fast |
| presence_penalty | No | Penalizes tokens that have already appeared, encouraging the model to introduce new topics. | null |
| prompt | Yes | The user prompt to send to the model. | — |
| reasoning_effort | No | Reasoning effort level for reasoning-capable models. | null |
| stop | No | List of sequences at which generation stops. | null |
| system_prompt | No | System prompt that sets the model's behavior and context. | null |
| temperature | No | Sampling temperature; higher values yield more varied output. | null |
| top_p | No | Nucleus (top-p) sampling threshold. | null |
| use_conversation_history | No | Whether to include prior conversation turns in the request. | false |

Input Schema (JSON Schema)

```json
{
  "properties": {
    "frequency_penalty": {
      "anyOf": [{ "type": "number" }, { "type": "null" }],
      "default": null,
      "title": "Frequency Penalty"
    },
    "max_tokens": {
      "anyOf": [{ "type": "integer" }, { "type": "null" }],
      "default": null,
      "title": "Max Tokens"
    },
    "model": {
      "default": "grok-4-fast",
      "title": "Model",
      "type": "string"
    },
    "presence_penalty": {
      "anyOf": [{ "type": "number" }, { "type": "null" }],
      "default": null,
      "title": "Presence Penalty"
    },
    "prompt": {
      "title": "Prompt",
      "type": "string"
    },
    "reasoning_effort": {
      "anyOf": [{ "type": "string" }, { "type": "null" }],
      "default": null,
      "title": "Reasoning Effort"
    },
    "stop": {
      "anyOf": [{ "items": { "type": "string" }, "type": "array" }, { "type": "null" }],
      "default": null,
      "title": "Stop"
    },
    "system_prompt": {
      "anyOf": [{ "type": "string" }, { "type": "null" }],
      "default": null,
      "title": "System Prompt"
    },
    "temperature": {
      "anyOf": [{ "type": "number" }, { "type": "null" }],
      "default": null,
      "title": "Temperature"
    },
    "top_p": {
      "anyOf": [{ "type": "number" }, { "type": "null" }],
      "default": null,
      "title": "Top P"
    },
    "use_conversation_history": {
      "default": false,
      "title": "Use Conversation History",
      "type": "boolean"
    }
  },
  "required": ["prompt"],
  "type": "object"
}
```
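As an illustration, a minimal Python MCP client could invoke the `chat` tool as sketched below. This uses the official `mcp` Python SDK; the server launch command (`python server.py`) and the `XAI_API_KEY` environment variable are assumptions — adjust them to match how your Grok MCP server is actually started and configured.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumption: the Grok MCP server is launched locally with "python server.py"
# and reads the xAI API key from the XAI_API_KEY environment variable.
server = StdioServerParameters(
    command="python",
    args=["server.py"],
    env={"XAI_API_KEY": os.environ.get("XAI_API_KEY", "")},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Only "prompt" is required; the other arguments mirror the input schema above.
            result = await session.call_tool(
                "chat",
                arguments={
                    "prompt": "Summarize the Model Context Protocol in two sentences.",
                    "model": "grok-4-fast",
                    "temperature": 0.7,
                    "max_tokens": 256,
                    "use_conversation_history": False,
                },
            )
            # Tool results come back as a list of content blocks; print any text parts.
            for block in result.content:
                if getattr(block, "text", None):
                    print(block.text)

if __name__ == "__main__":
    asyncio.run(main())
```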

MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP'
```
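The same endpoint can be queried from Python; below is a minimal sketch using the `requests` library. The response is printed as-is, since its exact shape is not documented here.

```python
import requests

# Query the Glama MCP directory for this server's metadata.
url = "https://glama.ai/api/mcp/v1/servers/merterbak/Grok-MCP"
response = requests.get(url, timeout=10)
response.raise_for_status()

# The endpoint returns JSON; print it without assuming a particular structure.
print(response.json())
```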

If you have feedback or need assistance with the MCP directory API, please join our Discord server.