The Deepseek Thinker MCP Server enables interaction with Deepseek models for reasoning tasks via the Model Context Protocol. With this server, you can:
Access Reasoning Processes: Leverage Deepseek's thought processes using the get-deepseek-thinker tool by providing a user prompt
Flexible Deployment: Run locally or as a service with configuration through environment variables or JSON
Dual Operation Modes: Support for both OpenAI API service mode and local Ollama integration
Client Integration: Seamlessly connect with AI clients like Claude Desktop using MCP configuration
Deepseek Thinker MCP Server
An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. It supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server.
Core Features
🤖 Dual Mode Support
OpenAI API mode support
Ollama local mode support
🎯 Focused Reasoning
Captures Deepseek's thinking process
Provides reasoning output
Related MCP server: Multi-Model Advisor
Available Tools
get-deepseek-thinker
Description: Perform reasoning using the Deepseek model
Input Parameters:
originPrompt (string): The user's original prompt
Returns: Structured text response containing the reasoning process
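As a sketch of how a client invokes this tool, the MCP `tools/call` request parameters would look like the following (the prompt text is illustrative):

```json
{
  "name": "get-deepseek-thinker",
  "arguments": {
    "originPrompt": "Why is the sky blue?"
  }
}
```

The server responds with structured text content containing the model's reasoning process.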
Environment Configuration
OpenAI API Mode
Set the following environment variables:
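A minimal sketch of the expected variables — the exact names (`API_KEY`, `BASE_URL`) are assumptions; confirm them against the project's source:

```shell
# Assumed variable names; verify against the project README/source.
export API_KEY="your-deepseek-api-key"
export BASE_URL="https://api.deepseek.com"
```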
Ollama Mode
Set the following environment variable:
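A minimal sketch, assuming the server selects Ollama mode via a `USE_OLLAMA` flag (the variable name is an assumption; confirm it against the project's source):

```shell
# Assumed variable name; verify against the project README/source.
export USE_OLLAMA="true"
```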
Usage
Integration with AI Client, like Claude Desktop
Add the following configuration to your claude_desktop_config.json:
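A hedged example of what that entry might look like, assuming the server is published as an npm package runnable via `npx` (the package name and environment variable names are assumptions; substitute your actual values):

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
```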
Using Ollama Mode
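For Ollama mode, the same entry would swap the API credentials for the Ollama flag — a sketch assuming a `USE_OLLAMA` environment variable (an assumption; confirm the name against the project's source):

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}
```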
Local Server Configuration
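To run a locally built copy instead of the npm package, point `command` at Node and `args` at the built entry point — a sketch with a hypothetical build path (substitute your actual clone location and entry file):

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": ["/path-to/deepseek-thinker-mcp/build/index.js"],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
```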
Development Setup
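A typical setup flow for a TypeScript MCP server of this kind — the repository URL is a placeholder and the script names (`install`, `build`) are assumptions; check the project's `package.json`:

```shell
# Placeholder URL; substitute the actual repository.
git clone https://github.com/<owner>/deepseek-thinker-mcp.git
cd deepseek-thinker-mcp

# Install dependencies and compile the TypeScript sources.
npm install
npm run build
```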
FAQ
What if I receive a response like "MCP error -32001: Request timed out"?
This error occurs when the Deepseek API responds too slowly or the reasoning output is too long, causing the MCP request to time out.
Tech Stack
TypeScript
@modelcontextprotocol/sdk
OpenAI API
Ollama
Zod (parameter validation)
License
This project is licensed under the MIT License. See the LICENSE file for details.