DeepSeek MCP Server
A Model Context Protocol (MCP) server for the DeepSeek API, allowing seamless integration of DeepSeek's powerful language models with MCP-compatible applications like Claude Desktop.
<a href="https://glama.ai/mcp/servers/asht4rqltn"><img width="380" height="200" src="https://glama.ai/mcp/servers/asht4rqltn/badge" alt="DeepSeek Server MCP server" /></a>
Installation
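To install from source, the usual Node.js workflow applies. The commands below are a sketch assuming Node.js and npm are already installed; clone the repository first, then run them from the project root:

```bash
npm install
npm run build   # produces build/index.js, referenced by the configuration below
```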
Configuration
- Get your DeepSeek API key from the [DeepSeek Platform](https://platform.deepseek.com)
- Set up your environment:

  ```bash
  export DEEPSEEK_API_KEY=your-api-key
  ```

  Or create a `.env` file:

  ```
  DEEPSEEK_API_KEY=your-api-key
  ```
Usage with Claude Desktop
Add the following to your `claude_desktop_config.json`:
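A minimal sketch of what this entry can look like, assuming the server was built from source as described above (the `deepseek` key and the path are placeholders; adjust them to your install location):

```json
{
  "mcpServers": {
    "deepseek": {
      "command": "node",
      "args": ["/absolute/path/to/deepseek-mcp-server/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}
```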
Features
- Chat completion tool with support for (see the example request after this list):
  - Custom model selection
  - Temperature control
  - Max tokens limit
  - Top P sampling
  - Presence penalty
  - Frequency penalty
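To illustrate how these parameters map onto a request, the arguments of an MCP `tools/call` to the chat completion tool might look like the sketch below. The tool name and argument keys shown are assumptions, not the server's confirmed schema; use MCP Inspector (see below) to view the authoritative tool definition.

```json
{
  "name": "chat_completion",
  "arguments": {
    "message": "Summarize the Model Context Protocol in two sentences.",
    "model": "deepseek-reasoner",
    "temperature": 0.7,
    "max_tokens": 1024,
    "top_p": 0.95,
    "presence_penalty": 0,
    "frequency_penalty": 0
  }
}
```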
Testing with MCP Inspector
You can test the server locally using the MCP Inspector tool:
- Build the server:

  ```bash
  npm run build
  ```

- Run the server with MCP Inspector:

  ```bash
  # Make sure to specify the full path to the built server
  npx @modelcontextprotocol/inspector node ./build/index.js
  ```
The inspector will open in your browser and connect to the server via stdio transport. You can:
- View available tools
- Test chat completions with different parameters
- Debug server responses
- Monitor server performance
Note: The server uses DeepSeek's R1 model (deepseek-reasoner) by default, which provides state-of-the-art performance for reasoning and general tasks.
License
MIT