
Deepseek Thinker MCP Server

by ruixingshi


An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients such as Claude Desktop. It supports accessing Deepseek's thinking process either through the Deepseek API service or a local Ollama server.

Core Features

  • 🤖 Dual Mode Support
    • OpenAI API mode support
    • Ollama local mode support
  • 🎯 Focused on Reasoning
    • Captures Deepseek's thinking process
    • Provides reasoning output

Available Tools

get-deepseek-thinker

  • Description: performs reasoning using the Deepseek model
  • Input parameters
    • originPrompt (string): the user's original prompt
  • Returns: a structured text response containing the reasoning process (see the example call below)
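
For illustration, the tool can also be called programmatically from any MCP client. The following is a minimal TypeScript sketch using @modelcontextprotocol/sdk; the client name, prompt, and environment values are placeholders, and the exact result shape depends on the server version.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, the same way Claude Desktop does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "deepseek-thinker-mcp"],
  env: {
    API_KEY: process.env.API_KEY ?? "",
    BASE_URL: process.env.BASE_URL ?? "",
  },
});

const client = new Client(
  { name: "example-client", version: "0.1.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Invoke the reasoning tool with the user's original prompt.
const result = await client.callTool({
  name: "get-deepseek-thinker",
  arguments: { originPrompt: "Why is the sky blue?" },
});

console.log(result.content);
await client.close();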

Environment Configuration

OpenAI API Mode

Set the following environment variables:

API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>
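
As a rough sketch of what these variables feed, OpenAI API mode can be pictured as an OpenAI-compatible client pointed at a Deepseek endpoint. The model name (deepseek-reasoner) and the reasoning_content field below reflect Deepseek's published API and are assumptions about a typical setup, not a description of the server's actual internals.

import OpenAI from "openai";

// API_KEY and BASE_URL are the two variables above; pointing BASE_URL at
// Deepseek's OpenAI-compatible endpoint is one typical setup (an assumption,
// not the only valid configuration).
const openai = new OpenAI({
  apiKey: process.env.API_KEY,
  baseURL: process.env.BASE_URL, // e.g. https://api.deepseek.com
});

// Hypothetical illustration: stream a deepseek-reasoner response and print
// only the reasoning portion that Deepseek exposes as `reasoning_content`.
const stream = await openai.chat.completions.create({
  model: "deepseek-reasoner",
  messages: [{ role: "user", content: "Why is the sky blue?" }],
  stream: true,
});

for await (const chunk of stream) {
  const delta: any = chunk.choices[0]?.delta;
  if (delta?.reasoning_content) process.stdout.write(delta.reasoning_content);
}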

Ollama Mode

Set the following environment variables:

USE_OLLAMA=true
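
For intuition, Ollama mode talks to a local Ollama server instead of the Deepseek API. The sketch below only approximates that flow: the model tag (deepseek-r1), the default port, and the <think> tag parsing are assumptions about a typical local setup, not guaranteed server behavior.

// Query a local Ollama server and pull the thinking out of the
// <think>...</think> block that DeepSeek R1 models emit (Node 18+ fetch).
const response = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  body: JSON.stringify({
    model: "deepseek-r1",
    messages: [{ role: "user", content: "Why is the sky blue?" }],
    stream: false,
  }),
});

const data: any = await response.json();
const text: string = data.message?.content ?? "";

// Extract the reasoning segment, if present.
const thinking = text.match(/<think>([\s\S]*?)<\/think>/)?.[1]?.trim();
console.log(thinking ?? "(no reasoning block found)");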

Usage

Integration with an AI client such as Claude Desktop

Add the following configuration to your claude_desktop_config.json:

{ "mcpServers": { "deepseek-thinker": { "command": "npx", "args": [ "-y", "deepseek-thinker-mcp" ], "env": { "API_KEY": "<Your API Key>", "BASE_URL": "<Your Base URL>" } } } }

Using Ollama Mode

{ "mcpServers": { "deepseek-thinker": { "command": "npx", "args": [ "-y", "deepseek-thinker-mcp" ], "env": { "USE_OLLAMA": "true" } } } }

Local Server Configuration

{ "mcpServers": { "deepseek-thinker": { "command": "node", "args": [ "/your-path/deepseek-thinker-mcp/build/index.js" ], "env": { "API_KEY": "<Your API Key>", "BASE_URL": "<Your Base URL>" } } } }

Development Setup

# Install dependencies
npm install

# Build project
npm run build

# Run service
node build/index.js

FAQ

The response shows "MCP error -32001: Request timed out"

This error occurs when the Deepseek API responds too slowly, or when the reasoning output is so long that the MCP server exceeds its request timeout.
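
One client-side mitigation, sketched below, is to raise the per-request timeout when calling the tool; the TypeScript SDK's default request timeout is 60 seconds. Whether this knob is available depends on the MCP client you use, so treat this as an illustration rather than a guaranteed fix.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { CallToolResultSchema } from "@modelcontextprotocol/sdk/types.js";

// `client` is a connected MCP Client, as in the integration example above.
declare const client: Client;

// Allow up to five minutes for long reasoning runs instead of the
// 60-second default.
const result = await client.callTool(
  { name: "get-deepseek-thinker", arguments: { originPrompt: "..." } },
  CallToolResultSchema,
  { timeout: 5 * 60 * 1000 }
);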

Tech Stack

  • TypeScript
  • @modelcontextprotocol/sdk
  • OpenAI API
  • Ollama
  • Zod (parameter validation)

License

This project is licensed under the MIT License. See the LICENSE file for details.



