mcp-stockfish 🐟
A Model Context Protocol server that lets your AI talk to Stockfish. Because apparently we needed to make chess engines even more accessible to our silicon overlords.
🧠⚡🖥️ Your LLM thinks, Stockfish calculates, you pretend you understand the resulting 15-move tactical sequence.
What is this?
This creates a bridge between AI systems and the Stockfish chess engine via the MCP protocol. It handles multiple concurrent sessions because your AI probably wants to analyze seventeen positions simultaneously while you're still figuring out why your knight is hanging.
Built on mark3labs/mcp-go. Because reinventing wheels is for people with too much time.
Features
- 🔄 Concurrent Sessions: Run multiple Stockfish instances without your CPU crying
- ⚡ Full UCI Support: All the commands you need, none of the ones you don't
- 🎯 Actually Works: Unlike your last side project, this one has proper error handling
- 📊 JSON Everything: Because apparently we can't just use plain text anymore
- 🐳 Docker Ready: Containerized for when you inevitably break your local setup
Supported UCI Commands ♟️
| Command | Description |
|---|---|
| `uci` | Initializes the engine in UCI mode |
| `isready` | Checks if the engine is ready; returns `readyok` |
| `position startpos` | Sets up the board in the starting position |
| `position fen [FEN]` | Sets up a position from FEN notation |
| `go` | Starts the engine search for the best move |
| `go depth [n]` | Searches `n` plies deep, e.g. `go depth 10` |
| `go movetime [ms]` | Thinks for a fixed time in milliseconds, e.g. `go movetime 1000` |
| `stop` | Stops the current search |
| `quit` | Closes the session |
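A typical analysis sequence sent through the tool looks like this; after `go`, the engine streams `info` lines and finishes with a `bestmove`:

```
uci
isready
position startpos
go depth 10
```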
Quick Start
Installation
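To build from source you'll need Go and a Stockfish binary on your PATH. A rough sketch, with a placeholder repository URL (swap in the project's actual location):

```bash
# clone the repository (placeholder URL, use the real one)
git clone https://github.com/your-org/mcp-stockfish.git
cd mcp-stockfish

# build the server binary
go build -o mcp-stockfish .

# Stockfish itself is a separate install, e.g. on Debian/Ubuntu:
sudo apt-get install stockfish
```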
Usage
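A minimal way to run it, assuming the binary built above. In stdio mode the server speaks MCP over stdin/stdout; in HTTP mode it listens on the configured host and port:

```bash
# stdio mode (default), for clients that spawn the server as a subprocess
./mcp-stockfish

# HTTP mode on a custom port
MCP_STOCKFISH_SERVER_MODE=http MCP_STOCKFISH_HTTP_PORT=8080 ./mcp-stockfish
```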
Configuration ⚙️
Environment Variables
Server Configuration
- `MCP_STOCKFISH_SERVER_MODE`: "stdio" or "http" (default: "stdio")
- `MCP_STOCKFISH_HTTP_HOST`: HTTP host (default: "localhost")
- `MCP_STOCKFISH_HTTP_PORT`: HTTP port (default: 8080)
Stockfish 🐟 Configuration
- `MCP_STOCKFISH_PATH`: Path to the Stockfish binary (default: "stockfish")
- `MCP_STOCKFISH_MAX_SESSIONS`: Max concurrent sessions (default: 10)
- `MCP_STOCKFISH_SESSION_TIMEOUT`: Session timeout (default: "30m")
- `MCP_STOCKFISH_COMMAND_TIMEOUT`: Command timeout (default: "30s")
Logging
- `MCP_STOCKFISH_LOG_LEVEL`: debug, info, warn, error, fatal
- `MCP_STOCKFISH_LOG_FORMAT`: json, console
- `MCP_STOCKFISH_LOG_OUTPUT`: stdout, stderr
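Putting a few of these together (the values are purely illustrative):

```bash
export MCP_STOCKFISH_PATH=/usr/local/bin/stockfish
export MCP_STOCKFISH_MAX_SESSIONS=5
export MCP_STOCKFISH_LOG_LEVEL=debug
export MCP_STOCKFISH_LOG_FORMAT=console
./mcp-stockfish
```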
Tool Parameters
- `command`: UCI command to execute
- `session_id`: Session ID (optional, we'll make one up if you don't)
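As a sketch of what an MCP client actually sends, here is a `tools/call` request carrying both parameters. The tool name `stockfish` and the session ID are placeholders; check the server's tool listing for the real name:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "stockfish",
    "arguments": {
      "command": "go depth 10",
      "session_id": "analysis-1"
    }
  }
}
```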
Response Format
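The exact schema comes from the server, so treat this as an illustrative guess at the shape rather than the authoritative format; every field name here is a placeholder:

```json
{
  "status": "success",
  "session_id": "analysis-1",
  "command": "go depth 10",
  "response": [
    "info depth 10 score cp 35 pv e2e4 e7e5",
    "bestmove e2e4 ponder e7e5"
  ]
}
```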
Session Management
Sessions do what you'd expect:
- Spawn Stockfish processes on demand
- Keep UCI state between commands
- Clean up when you're done (or when they time out)
- Enforce limits so you don't fork-bomb yourself
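To keep UCI state between commands, reuse the same `session_id` across calls. Two consecutive tool-call argument payloads, with a placeholder ID:

```json
{ "command": "position startpos", "session_id": "game-42" }
{ "command": "go depth 12", "session_id": "game-42" }
```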
Integration
Claude Desktop
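Something along these lines in `claude_desktop_config.json` should work, assuming the server binary and Stockfish live at the paths shown (adjust to your setup):

```json
{
  "mcpServers": {
    "mcp-stockfish": {
      "command": "/usr/local/bin/mcp-stockfish",
      "env": {
        "MCP_STOCKFISH_PATH": "/usr/local/bin/stockfish"
      }
    }
  }
}
```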
Development
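The usual Go workflow applies; a rough sketch (the project may also ship its own build tooling):

```bash
go build ./...
go test ./...
go vet ./...
```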
Credits 🐟
Powered by Stockfish, the chess engine that's stronger than both of us combined. Created by people who actually understand chess, unlike this wrapper.
Thanks to:
- The Stockfish team for making chess engines that don't suck
- MCP SDK for Go for handling the protocol so I don't have to
- Coffee
License
MIT - Do whatever you want, just don't blame me when it breaks.