Grok-MCP
An MCP server for xAI's Grok API, providing access to capabilities including image understanding, image generation, live web search, and reasoning models.
Features
Multiple Grok Models: Access to Grok-4, Grok-4-Fast, Grok-3-Mini, and more
Image Generation: Create images using Grok's image generation models
Vision Capabilities: Analyze images with Grok's vision models
Live Web Search: Real-time web search with citations from news, web, X, and RSS feeds
Reasoning Models: Advanced reasoning with extended thinking models (Grok-3-Mini, Grok-4)
Stateful Conversations: Use this newly released feature to maintain conversation context across multiple requests via server-stored response IDs
Conversation History: Built-in support for multi-turn conversations
Prerequisites
Python 3.11 or higher
xAI API key
uv package manager
Installation
Clone the repository:
Install dependencies using uv:
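For example (the repository URL placeholder and project layout are assumptions; substitute the actual clone URL):

```shell
git clone <repository-url>
cd Grok-MCP
uv sync
```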
Configuration
Claude Desktop Integration
Add this to your Claude Desktop configuration file:
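A sketch of the entry (the command, arguments, paths, and server name here are assumptions; adjust them to your install location and supply your own API key):

```json
{
  "mcpServers": {
    "grok-mcp": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/Grok-MCP", "grok-mcp"],
      "env": {
        "XAI_API_KEY": "your-api-key"
      }
    }
  }
}
```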
Usage
For stdio:
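For example (the entry-point name is an assumption; check the project's pyproject.toml for the actual script name):

```shell
XAI_API_KEY="your-api-key" uv run grok-mcp
```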
Available Tools
1. list_models
List all available Grok models with creation dates and ownership information.
2. chat
Standard chat completion with extensive customization options.
Parameters:
prompt (required): Your message
model: Model to use (default: "grok-4-fast")
system_prompt: Optional system instruction
use_conversation_history: Enable multi-turn conversations
temperature, max_tokens, top_p: Generation parameters
presence_penalty, frequency_penalty, stop: Advanced control
reasoning_effort: For reasoning models ("low" or "high")
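The chat tool's parameters map naturally onto an OpenAI-style chat completions request body. A minimal sketch of how such a payload could be assembled (the helper function is illustrative, not part of this repository; field names follow the common chat completions convention):

```python
def build_chat_payload(prompt, model="grok-4-fast", system_prompt=None,
                       temperature=None, max_tokens=None, reasoning_effort=None):
    """Assemble a chat completions request body from the tool's parameters."""
    messages = []
    if system_prompt:
        # The optional system instruction leads the message list.
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})

    payload = {"model": model, "messages": messages}
    # Optional generation parameters are included only when explicitly set.
    for key, value in [("temperature", temperature),
                       ("max_tokens", max_tokens),
                       ("reasoning_effort", reasoning_effort)]:
        if value is not None:
            payload[key] = value
    return payload

payload = build_chat_payload("Hello, Grok!", system_prompt="Be concise",
                             temperature=0.7)
```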
3. chat_with_reasoning
Get detailed reasoning along with the response.
Parameters:
prompt (required): Your question or task
model: "grok-4", "grok-3-mini", or "grok-3-mini-fast"
reasoning_effort: "low" or "high" (not for grok-4)
system_prompt, temperature, max_tokens, top_p
Returns: Content, reasoning content, and usage statistics
4. chat_with_vision
Analyze images with natural language queries.
Parameters:
prompt (required): Your question about the image(s)
image_paths: List of local image file paths
image_urls: List of image URLs
detail: "auto", "low", or "high"
model: Vision-capable model (default: "grok-4-0709")
Supported formats: JPG, JPEG, PNG
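Local image paths are typically sent to vision APIs as base64 data URLs. A sketch of that conversion (the helper name is illustrative; the MIME check mirrors the JPG/JPEG/PNG restriction above):

```python
import base64
import mimetypes

def image_path_to_data_url(path):
    """Read a local JPG/JPEG/PNG file and encode it as a base64 data URL."""
    mime, _ = mimetypes.guess_type(path)
    if mime not in ("image/jpeg", "image/png"):
        raise ValueError(f"Unsupported image format: {path}")
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return f"data:{mime};base64,{encoded}"
```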
5. generate_image
Create images from text descriptions.
Parameters:
prompt (required): Image description
n: Number of images to generate (default: 1)
response_format: "url" or "b64_json"
model: Image generation model (default: "grok-2-image-1212")
Returns: Generated images and revised prompt
6. live_search
Search the web in real-time with source citations.
Parameters:
prompt (required): Your search query
model: Model to use (default: "grok-4")
mode: "on" or "off"
return_citations: Include source citations (default: true)
from_date, to_date: Date range (YYYY-MM-DD)
max_search_results: Max results to fetch (default: 20)
country: Country code for localized search
rss_links: List of RSS feed URLs to search
sources: Custom source configuration
Returns: Content, citations, usage stats, and number of sources used
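The search options above travel together as a single parameters object. A sketch of assembling one from the tool's arguments (the helper is illustrative; field names simply echo the parameter list above):

```python
def build_search_parameters(mode="on", return_citations=True,
                            from_date=None, to_date=None,
                            max_search_results=20, country=None,
                            rss_links=None):
    """Collect live-search options into one request parameters dict."""
    params = {
        "mode": mode,
        "return_citations": return_citations,
        "max_search_results": max_search_results,
    }
    # Optional filters are attached only when provided.
    for key, value in [("from_date", from_date), ("to_date", to_date),
                       ("country", country), ("rss_links", rss_links)]:
        if value is not None:
            params[key] = value
    return params

params = build_search_parameters(from_date="2024-01-01", country="US")
```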
7. stateful_chat
Maintain conversation state across multiple requests on xAI servers.
Parameters:
prompt (required): Your message
response_id: Previous response ID to continue the conversation
model: Model to use (default: "grok-4")
system_prompt: System instruction (only for new conversations)
include_reasoning: Include reasoning summary
temperature, max_tokens
Returns: Response with ID for continuing the conversation (stored for 30 days)
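The stateful tools follow a simple lifecycle: create a response, continue it by passing its ID, then retrieve or delete the stored record. A toy in-memory model of that flow (the ResponseStore class is purely illustrative, not the server's implementation; real responses live on xAI's servers and expire after 30 days):

```python
import uuid

class ResponseStore:
    """Toy model of server-side response storage and ID chaining."""

    def __init__(self):
        self._responses = {}

    def create(self, prompt, previous_response_id=None):
        # Passing the previous response ID chains the conversation context.
        history = []
        if previous_response_id is not None:
            history = self._responses[previous_response_id]["history"]
        response_id = str(uuid.uuid4())
        self._responses[response_id] = {"history": history + [prompt]}
        return response_id

    def retrieve(self, response_id):
        return self._responses[response_id]

    def delete(self, response_id):
        del self._responses[response_id]

store = ResponseStore()
first = store.create("What is MCP?")
second = store.create("Give an example.", previous_response_id=first)
```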
8. retrieve_stateful_response
Retrieve a previously stored conversation response.
Parameters:
response_id (required): The response ID to retrieve
9. delete_stateful_response
Delete a stored conversation from xAI servers.
Parameters:
response_id (required): The response ID to delete
Roadmap
Add Docker support
Fix the chat vision model tool
License
This project is open source and available under the MIT License.