
Deepseek R1 MCP Server

A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.

Why Node.js? This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.

Quick Start

Installing manually

```bash
# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd MCP-server-Deepseek_R1
npm install

# Set up environment
cp .env.example .env  # Then add your API key

# Build and run
npm run build
```

Prerequisites

  • Node.js (v18 or higher)

  • npm

  • Claude Desktop

  • Deepseek API key

Model Selection

By default, this server uses the DeepSeek-R1 model (`deepseek-reasoner`). To use DeepSeek-V3 instead, change the model name in src/index.ts:

```typescript
// For DeepSeek-R1 (default)
model: "deepseek-reasoner"

// For DeepSeek-V3
model: "deepseek-chat"
```
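If you switch models often, the hard-coded string can be replaced by a small helper that falls back to R1 for anything unrecognized. This is a sketch, not part of the published server; the helper name and the idea of reading the value from an environment variable are my own:

```typescript
// Hypothetical helper: resolve the model name from a configuration value
// (e.g. an env var) instead of editing src/index.ts by hand.
type DeepseekModel = "deepseek-reasoner" | "deepseek-chat";

function resolveModel(configured?: string): DeepseekModel {
  // "deepseek-chat" selects DeepSeek-V3; anything else falls back to R1,
  // matching this server's default.
  return configured === "deepseek-chat" ? "deepseek-chat" : "deepseek-reasoner";
}
```

The fallback-to-default shape means a typo in the configuration degrades to the documented default rather than sending an invalid model name to the API.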

Project Structure

```
deepseek-r1-mcp/
├── src/
│   └── index.ts          # Main server implementation
├── build/                # Compiled files
│   └── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
```

Configuration

  1. Create a `.env` file:

```
DEEPSEEK_API_KEY=your-api-key-here
```
  2. Update the Claude Desktop configuration (`claude_desktop_config.json`, located in `~/Library/Application Support/Claude/` on macOS or `%APPDATA%\Claude\` on Windows):

```json
{
  "mcpServers": {
    "deepseek_r1": {
      "command": "node",
      "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}
```

Development

```bash
npm run dev    # Watch mode
npm run build  # Build for production
```

Features

  • Advanced text generation with Deepseek R1 (8192 token context window)

  • Configurable parameters (max_tokens, temperature)

  • Robust error handling with detailed error messages

  • Full MCP protocol support

  • Claude Desktop integration

  • Support for both DeepSeek-R1 and DeepSeek-V3 models

API Usage

```
{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,   // Maximum tokens to generate
    "temperature": 0.2    // Controls randomness
  }
}
```
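Before forwarding these arguments to the API, a server typically validates and clamps them. The sketch below shows one plausible shape for that step; it is not the server's actual code, and the 0-2 temperature range is an assumption based on OpenAI-compatible APIs (the 8192 cap matches the context window stated above):

```typescript
// Illustrative argument validation for the deepseek_r1 tool.
interface ToolArgs {
  prompt: string;
  max_tokens?: number;
  temperature?: number;
}

function normalizeArgs(args: ToolArgs): Required<ToolArgs> {
  if (!args.prompt || args.prompt.trim() === "") {
    throw new Error("prompt must be a non-empty string");
  }
  // Clamp max_tokens to [1, 8192] and temperature to an assumed [0, 2].
  const max_tokens = Math.min(Math.max(args.max_tokens ?? 8192, 1), 8192);
  const temperature = Math.min(Math.max(args.temperature ?? 0.2, 0), 2);
  return { prompt: args.prompt, max_tokens, temperature };
}
```

Clamping rather than rejecting out-of-range numbers keeps the tool forgiving for callers while still guaranteeing the request sent upstream is well-formed.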

The Temperature Parameter

The default temperature is 0.2. Deepseek recommends choosing a value for your specific use case:

| Use case | Temperature | Example |
| --- | --- | --- |
| Coding / Math | 0.0 | Code generation, mathematical calculations |
| Data Cleaning / Data Analysis | 1.0 | Data processing tasks |
| General Conversation | 1.3 | Chat and dialogue |
| Translation | 1.3 | Language translation |
| Creative Writing / Poetry | 1.5 | Story writing, poetry generation |
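The table can be encoded as a simple lookup so callers don't have to remember the numbers. The use-case keys below are my own labels mirroring the table, not an official API:

```typescript
// Recommended temperatures per use case, mirroring Deepseek's guidance above.
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  coding: 0.0,            // Coding / Math
  data_analysis: 1.0,     // Data Cleaning / Data Analysis
  conversation: 1.3,      // General Conversation
  translation: 1.3,       // Translation
  creative_writing: 1.5,  // Creative Writing / Poetry
};

function temperatureFor(useCase: string, fallback = 0.2): number {
  // Unknown use cases fall back to this server's default of 0.2.
  return RECOMMENDED_TEMPERATURE[useCase] ?? fallback;
}
```

For example, `temperatureFor("coding")` yields 0.0, while an unrecognized use case returns the server default of 0.2.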

Error Handling

The server provides detailed error messages for common issues:

  • API authentication errors

  • Invalid parameters

  • Rate limiting

  • Network issues
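One way those four error classes can be turned into user-facing messages is a mapping from the upstream HTTP status. This is a sketch of the idea, not the server's actual code; the status codes follow Deepseek's OpenAI-compatible API conventions, and the message text is illustrative:

```typescript
// Map an upstream HTTP status (or null for a failed connection) to a
// human-readable error message for the MCP client.
function describeApiError(status: number | null): string {
  if (status === null) {
    return "Network issue: could not reach the Deepseek API";
  }
  switch (status) {
    case 401:
      return "API authentication error: check DEEPSEEK_API_KEY";
    case 400:
      return "Invalid parameters: check prompt, max_tokens, and temperature";
    case 429:
      return "Rate limited: retry after a short delay";
    default:
      return `Unexpected Deepseek API error (HTTP ${status})`;
  }
}
```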

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT

