Deepseek-Thinking-Claude-3.5-Sonnet-CLINE-MCP
A Model Context Protocol (MCP) server that combines DeepSeek R1's reasoning capabilities with Claude 3.5 Sonnet's response generation through OpenRouter. The server uses a two-stage process in which DeepSeek produces structured reasoning that is then incorporated into Claude's response generation. API keys and model settings are configured through a .env file.
Features
- Two-Stage Processing:
  - Uses DeepSeek R1 for initial reasoning (50k character context)
  - Uses Claude 3.5 Sonnet for the final response (600k character context)
  - Both models accessed through OpenRouter's unified API
  - Injects DeepSeek's reasoning tokens into Claude's context
- Smart Conversation Management:
  - Detects active conversations using file modification times
  - Handles multiple concurrent conversations
  - Filters out ended conversations automatically
  - Supports context clearing when needed
- Optimized Parameters:
  - Model-specific context limits:
    - DeepSeek: 50,000 characters for focused reasoning
    - Claude: 600,000 characters for comprehensive responses
  - Recommended settings:
    - temperature: 0.7 for balanced creativity
    - top_p: 1.0 for full probability distribution
    - repetition_penalty: 1.0 to prevent repetition
Installation
Installing via Smithery
To install DeepSeek Thinking with Claude 3.5 Sonnet for Claude Desktop automatically via Smithery:
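A hedged sketch of the Smithery command; the package identifier below is an assumption, so substitute the one listed on Smithery:

```bash
# Package identifier is an assumption; check the Smithery registry for the actual name
npx -y @smithery/cli install deepseek-thinking-claude-3-5-sonnet-cline-mcp --client claude
```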
Manual Installation
- Clone the repository
- Install dependencies
- Create a .env file with your OpenRouter API key
- Build the server

A sketch of these steps is shown below.
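The following assumes an npm-based project with standard install/build scripts; the repository URL is a placeholder:

```bash
# 1. Clone the repository (URL is a placeholder)
git clone https://github.com/your-username/Deepseek-Thinking-Claude-3.5-Sonnet-CLINE-MCP.git
cd Deepseek-Thinking-Claude-3.5-Sonnet-CLINE-MCP

# 2. Install dependencies
npm install

# 3. Create a .env file with your OpenRouter API key
echo "OPENROUTER_API_KEY=your-api-key-here" > .env

# 4. Build the server
npm run build
```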
Usage with Cline
Add to your Cline MCP settings (usually in `~/.vscode/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`):
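A hedged example of the settings entry; the server name and build path are assumptions and should be adjusted to your setup:

```json
{
  "mcpServers": {
    "deepseek-thinking-claude": {
      "command": "node",
      "args": ["/path/to/Deepseek-Thinking-Claude-3.5-Sonnet-CLINE-MCP/build/index.js"],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```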
Tool Usage
The server provides two tools for generating and monitoring responses:
generate_response
Main tool for generating responses; an example call is sketched below.
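A hedged example of the tool arguments; the parameter names shown (prompt, showReasoning, clearContext) are illustrative assumptions rather than a documented schema:

```json
{
  "prompt": "Explain the tradeoffs between REST and GraphQL",
  "showReasoning": false,
  "clearContext": false
}
```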
check_response_status
Tool for checking the status of a response generation task using the task ID returned by generate_response; an example call is sketched below.
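A minimal example of its arguments, assuming a single taskId field:

```json
{
  "taskId": "uuid-here"
}
```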
Response Polling
The server uses a polling mechanism to handle long-running requests:
- Initial Request:
  - `generate_response` returns immediately with a task ID
  - Response format: `{"taskId": "uuid-here"}`
- Status Checking:
  - Use `check_response_status` to poll the task status
  - Note: responses can take up to 60 seconds to complete
  - Status progresses through: pending → reasoning → responding → complete
Example usage in Cline:
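A hedged illustration using Cline's MCP tool-call format; the server name (deepseek-thinking-claude) and argument names are assumptions:

```xml
<use_mcp_tool>
<server_name>deepseek-thinking-claude</server_name>
<tool_name>generate_response</tool_name>
<arguments>
{
  "prompt": "Summarize the key ideas behind transformers",
  "showReasoning": true
}
</arguments>
</use_mcp_tool>
```

Cline would then poll `check_response_status` with the returned taskId until the status reaches `complete`.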
Development
For development with auto-rebuild:
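Assuming a standard npm watch script (the script name is an assumption):

```bash
npm run watch
```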
How It Works
- Reasoning Stage (DeepSeek R1):
  - Uses OpenRouter's reasoning tokens feature
  - Prompt is modified to output 'done' while capturing reasoning
  - Reasoning is extracted from response metadata
- Response Stage (Claude 3.5 Sonnet):
  - Receives the original prompt and DeepSeek's reasoning
  - Generates the final response incorporating the reasoning
  - Maintains conversation context and history
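A minimal TypeScript sketch of this two-stage flow against OpenRouter's chat completions endpoint. The include_reasoning flag, the message.reasoning field, and the model slugs are assumptions based on OpenRouter's reasoning-tokens feature and may differ from this server's actual implementation:

```typescript
// Sketch of the two-stage pipeline (not the server's actual code).
const OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions";

async function callOpenRouter(body: object): Promise<any> {
  const res = await fetch(OPENROUTER_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  return res.json();
}

export async function twoStageResponse(prompt: string): Promise<string> {
  // Stage 1: ask DeepSeek R1 to reply with only "done"; the reasoning itself
  // comes back as reasoning tokens in the response metadata.
  const reasoningResponse = await callOpenRouter({
    model: "deepseek/deepseek-r1",          // assumed model slug
    include_reasoning: true,                // assumed OpenRouter flag
    messages: [
      { role: "user", content: `${prompt}\n\nReply with only the word "done".` },
    ],
  });
  const reasoning: string =
    reasoningResponse.choices?.[0]?.message?.reasoning ?? ""; // assumed field

  // Stage 2: give Claude the original prompt plus DeepSeek's reasoning.
  const finalResponse = await callOpenRouter({
    model: "anthropic/claude-3.5-sonnet",   // assumed model slug
    temperature: 0.7,
    top_p: 1,
    messages: [
      { role: "system", content: `Use this reasoning to inform your answer:\n${reasoning}` },
      { role: "user", content: prompt },
    ],
  });
  return finalResponse.choices?.[0]?.message?.content ?? "";
}
```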
License
MIT License - See LICENSE file for details.
Credits
Based on the RAT (Retrieval Augmented Thinking) concept by Skirano, which enhances AI responses through structured reasoning and knowledge retrieval.
This implementation specifically combines DeepSeek R1's reasoning capabilities with Claude 3.5 Sonnet's response generation through OpenRouter's unified API.