Grok MCP Server
MCP Server for the Grok API, enabling chat completions, text completions, embeddings, and model operations with Grok AI. It is implemented with FastMCP for quick setup and tool registration. By default, the server exposes an HTTP streaming endpoint on port 8080.
Features
Multiple Operation Types: Support for chat completions, text completions, embeddings, and model management
Comprehensive Error Handling: Clear error messages for common issues
Streaming Support: Real-time streaming responses for chat and completions
Multi-modal Inputs: Support for both text and image inputs in chat conversations
VSCode Integration: Seamless integration with Visual Studio Code
Tools
list_models
List available models for the API
Returns: Array of available models with details
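For illustration, here is a minimal TypeScript sketch of a client that connects to the server's streaming endpoint (see the Build section below for the URL) and calls list_models. The SDK imports and the Streamable HTTP transport are assumptions based on the standard MCP TypeScript SDK; this project does not document a specific client.

```typescript
// Minimal sketch using the official MCP TypeScript SDK (an assumption; this README
// does not prescribe a client). The endpoint URL comes from the Build section below.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const client = new Client({ name: "grok-mcp-example", version: "0.1.0" });
const transport = new StreamableHTTPClientTransport(new URL("http://localhost:8080/stream"));

await client.connect(transport);

// list_models takes no arguments.
const models = await client.callTool({ name: "list_models", arguments: {} });
console.log(models);

await client.close();
```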
get_model
Get information about a specific model
Inputs:
model_id (string): The ID of the model to retrieve
Returns: Model details
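Using a connected client like the one in the list_models example above, get_model takes a single model_id argument. A small sketch; the model ID below is a placeholder, not a documented default:

```typescript
// Assumes a connected MCP client such as the one created in the list_models example above.
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function fetchModel(client: Client, modelId: string) {
  // "model_id" matches the input name documented above; pass an ID returned by list_models.
  return client.callTool({ name: "get_model", arguments: { model_id: modelId } });
}
```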
create_chat_completion
Create a chat completion with Grok
Inputs:
model (string): ID of the model to use
messages (array): Chat messages, each with role and content
temperature (optional number): Sampling temperature
top_p (optional number): Nucleus sampling parameter
n (optional number): Number of completions to generate
max_tokens (optional number): Maximum tokens to generate
stream (optional boolean): Whether to stream responses
logit_bias (optional object): Map of token IDs to bias scores
response_format (optional object): { type: "json_object" | "text" }
seed (optional number): Seed for deterministic sampling
Returns: Generated chat completion response
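As a concrete shape for the inputs above, here is a hedged sketch of a chat completion call through an MCP client. The tool and parameter names come from this README; the model ID, message content, and SDK usage are assumptions.

```typescript
// Sketch of a create_chat_completion call; argument names mirror the inputs listed above.
// Assumes a connected MCP client (see the list_models example); "grok-beta" is a placeholder.
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function chat(client: Client, prompt: string) {
  return client.callTool({
    name: "create_chat_completion",
    arguments: {
      model: "grok-beta",                // placeholder; pick an ID returned by list_models
      messages: [
        { role: "system", content: "You are a concise assistant." },
        { role: "user", content: prompt },
      ],
      temperature: 0.7,
      max_tokens: 256,
      stream: false,                     // set true only if your client handles streamed output
    },
  });
}
```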
create_completion
Create a text completion with Grok
Inputs:
model (string): ID of the model to use
prompt (string): Text prompt to complete
temperature (optional number): Sampling temperature
max_tokens (optional number): Maximum tokens to generate
stream (optional boolean): Whether to stream responses
logit_bias (optional object): Map of token IDs to bias scores
seed (optional number): Seed for deterministic sampling
Returns: Generated text completion response
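A corresponding sketch for plain text completions, under the same assumptions as the chat example (connected client, placeholder model ID):

```typescript
// Sketch of a create_completion call using the inputs listed above.
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function complete(client: Client, prompt: string) {
  return client.callTool({
    name: "create_completion",
    arguments: {
      model: "grok-beta",   // placeholder model ID
      prompt,
      max_tokens: 128,
      temperature: 0.2,
      seed: 42,             // optional: deterministic sampling, per the inputs above
    },
  });
}
```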
create_embeddings
Create embeddings from input text
Inputs:
model (string): ID of the model to use
input (string or array): Text to embed
encoding_format (optional string): Format of the embeddings
Returns: Vector embeddings of the input text
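And a sketch for embeddings; the input may be a single string or an array of strings. The model ID and encoding_format value are assumptions, since the README only describes the parameter:

```typescript
// Sketch of a create_embeddings call; assumes a connected MCP client.
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function embed(client: Client, texts: string[]) {
  return client.callTool({
    name: "create_embeddings",
    arguments: {
      model: "grok-embedding",   // placeholder; check list_models for the real ID
      input: texts,
      encoding_format: "float",  // assumed value; the README only says "Format of the embeddings"
    },
  });
}
```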
Setup
Grok API Key
To use this server, you'll need a Grok API key:
Obtain a Grok API key from x.ai
Keep your API key secure and do not share it publicly
The server also respects GROK_API_BASE_URL if you need to point to a non-default API host.
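This README does not name the environment variable that holds the key; GROK_API_KEY below is an assumption following common convention, and the default base URL is likewise assumed. A minimal sketch of environment-based configuration:

```typescript
// Sketch of environment-based configuration. GROK_API_BASE_URL is mentioned above;
// GROK_API_KEY and the default host are assumptions, not confirmed by this README.
const apiKey = process.env.GROK_API_KEY;
const baseUrl = process.env.GROK_API_BASE_URL ?? "https://api.x.ai/v1";

if (!apiKey) {
  throw new Error("Set your Grok API key (assumed variable: GROK_API_KEY) before starting the server.");
}

console.log(`Grok API host: ${baseUrl}`);
```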
Build
Building the project from source is optional and only needed if you want compiled JavaScript output; npm start runs the server directly with ts-node.
The HTTP server listens on http://localhost:8080/stream.
Development
For development, the project supports automatic rebuilding on file changes.
License
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.