# MCP Deepseek Agent

An MCP (Model Context Protocol) server implementation that exposes Ollama's Deepseek model to MCP-compatible clients.

## Features

- 🤖 MCP protocol compliance
- 🔄 Ollama integration with the Deepseek model
- ⚙️ Automatic configuration
- 🧹 Clean responses (removes Deepseek's thinking tags; see the sketch after this list)
- 📝 Standard MCP protocol endpoints

## Quick Start

1. Install Ollama and pull the Deepseek model:

   ```bash
   ollama run deepseek-r1
   ```

2. Install the package:

   ```bash
   pip install git+https://github.com/freebeiro/mcp-deepseek-agent.git
   ```

3. Start the server:

   ```bash
   mcp-deepseek-agent
   ```
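
Before starting the server, it can help to confirm that Ollama is running and that deepseek-r1 is actually installed. A small stdlib-only check, assuming Ollama's default address and its `/api/tags` model-listing endpoint:

```python
import json
import urllib.request

OLLAMA_API_URL = "http://localhost:11434"  # Ollama's default listen address

def deepseek_available(model_prefix: str = "deepseek-r1") -> bool:
    """Return True if an installed Ollama model name starts with `model_prefix`."""
    with urllib.request.urlopen(f"{OLLAMA_API_URL}/api/tags", timeout=5) as resp:
        tags = json.load(resp)
    # /api/tags returns {"models": [{"name": "deepseek-r1:latest", ...}, ...]}
    return any(m["name"].startswith(model_prefix) for m in tags.get("models", []))

if __name__ == "__main__":
    print("deepseek-r1 installed:", deepseek_available())
```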

## Configuration

The server is configured through environment variables:

```bash
export OLLAMA_API_URL="http://localhost:11434"
export OLLAMA_MODEL="deepseek-r1"
export TEMPERATURE="0.7"
export TOP_P="0.9"
export MCP_PORT="8080"
```
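
Internally, these variables presumably map onto a small settings object. A hedged sketch of how such a loader could look, treating the values shown above as defaults; the class and function names are illustrative, not the package's actual API:

```python
import os
from dataclasses import dataclass

@dataclass
class Settings:
    """Runtime settings read from the environment variables documented above."""
    ollama_api_url: str
    ollama_model: str
    temperature: float
    top_p: float
    mcp_port: int

def load_settings() -> Settings:
    """Read each documented variable, falling back to the example values."""
    return Settings(
        ollama_api_url=os.getenv("OLLAMA_API_URL", "http://localhost:11434"),
        ollama_model=os.getenv("OLLAMA_MODEL", "deepseek-r1"),
        temperature=float(os.getenv("TEMPERATURE", "0.7")),
        top_p=float(os.getenv("TOP_P", "0.9")),
        mcp_port=int(os.getenv("MCP_PORT", "8080")),
    )
```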

## Usage in MCP Configuration

Add to your MCP client configuration:

```json
{
  "mcpServers": {
    "deepseek": {
      "command": "mcp-deepseek-agent",
      "args": [],
      "env": {
        "OLLAMA_MODEL": "deepseek-r1"
      }
    }
  }
}
```
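
To verify programmatically that the server speaks MCP, a client can spawn it over stdio and list its capabilities. A sketch using the official MCP Python SDK (`pip install mcp`); the SDK dependency and the stdio transport are assumptions about how this server is typically launched:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way the JSON config above does.
# Environment variables such as OLLAMA_MODEL can be exported in the
# shell beforehand (see Configuration above).
server_params = StdioServerParameters(command="mcp-deepseek-agent")

async def main() -> None:
    async with stdio_client(server_params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # standard MCP endpoint
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```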

## Documentation

See DOCUMENTATION.md for detailed usage and API documentation.

## License

MIT License - see LICENSE file for details.

