The Deepseek Thinker MCP Server enables interaction with Deepseek models for reasoning tasks via the Model Context Protocol. With this server, you can:
- Access Reasoning Processes: Leverage Deepseek's thought processes using the `get-deepseek-thinker` tool by providing a user prompt
- Flexible Deployment: Run locally or as a service, with configuration through environment variables or JSON
- Dual Operation Modes: Support for both OpenAI API service mode and local Ollama integration
- Client Integration: Seamlessly connect with AI clients like Claude Desktop using MCP configuration
Deepseek Thinker MCP Server
An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. It supports access to Deepseek's thought process via the Deepseek API service or a local Ollama server.
Core Features
- 🤖 Dual Mode Support
  - OpenAI API mode support
  - Ollama local mode support
- 🎯 Focused Reasoning
  - Captures Deepseek's thinking process
  - Provides reasoning output
Available Tools
`get-deepseek-thinker`
- Description: Perform reasoning using the Deepseek model
- Input Parameters:
  - `originPrompt` (string): User's original prompt
- Returns: Structured text response containing the reasoning process
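As a rough illustration, a TypeScript MCP client could invoke the tool as sketched below. The package name `deepseek-thinker-mcp`, the environment variable names, and the example prompt are assumptions for the sketch; only the tool name and `originPrompt` parameter come from the table above.

```typescript
// Sketch: calling get-deepseek-thinker from an MCP TypeScript client (assumed setup)
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio (package name and env vars are assumptions)
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "deepseek-thinker-mcp"],
  env: { API_KEY: "<your api key>", BASE_URL: "<api base url>" },
});

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Invoke the reasoning tool with the documented parameter
const result = await client.callTool({
  name: "get-deepseek-thinker",
  arguments: { originPrompt: "Why is the sky blue?" },
});

// The result contains structured text with Deepseek's reasoning process
console.log(result.content);
```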
Environment Configuration
OpenAI API Mode
Set the following environment variables:
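For example (the variable names `API_KEY` and `BASE_URL` are assumed here; confirm them against the repository):

```bash
API_KEY=<Your API Key>
BASE_URL=<API Base URL>
```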
Ollama Mode
Set the following environment variable:
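For example (the variable name `USE_OLLAMA` is assumed here):

```bash
USE_OLLAMA=true
```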
Usage
Integration with an AI client, such as Claude Desktop
Add the following configuration to your claude_desktop_config.json:
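A typical entry looks like the sketch below; the npm package name `deepseek-thinker-mcp` and the environment variable names are assumptions, so substitute the values for your installation.

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
```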
Using Ollama Mode
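To run against a local Ollama server instead, set the Ollama flag in the server's environment. A minimal sketch, assuming the same package name and the `USE_OLLAMA` variable:

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}
```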
Local Server Configuration
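If you run the server from a local checkout rather than via npx, point the client at the built entry file. The path below (`build/index.js`) is an assumption based on a typical TypeScript build layout:

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": ["/path/to/deepseek-thinker-mcp/build/index.js"],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
```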
Development Setup
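A typical local workflow, assuming standard npm scripts and a `build/index.js` output (adjust to the project's actual scripts):

```bash
# Install dependencies
npm install

# Build the project
npm run build

# Run the server
node build/index.js
```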
FAQ
What does a response like "MCP error -32001: Request timed out" mean?
This error occurs when the Deepseek API responds too slowly or the reasoning output is too long, causing the MCP server to time out.
Tech Stack
- TypeScript
- @modelcontextprotocol/sdk
- OpenAI API
- Ollama
- Zod (parameter validation)
License
This project is licensed under the MIT License. See the LICENSE file for details.