# MCP Deepseek Agent

An MCP server implementation that uses Ollama's Deepseek model for seamless AI integration.

## Features

- 🤖 MCP protocol compliance
- 🔄 Ollama integration with the Deepseek model
- ⚙️ Automatic configuration
- 🧹 Clean responses (removes thinking tags)
- 📝 Standard MCP protocol endpoints

## Quick Start

1. Install Ollama and pull the Deepseek model:

   ```bash
   ollama run deepseek-r1
   ```

2. Install the package:

   ```bash
   pip install git+https://github.com/freebeiro/mcp-deepseek-agent.git
   ```

3. Start the server:

   ```bash
   mcp-deepseek-agent
   ```

## Configuration

Configuration is set through environment variables:

```bash
export OLLAMA_API_URL="http://localhost:11434"
export OLLAMA_MODEL="deepseek-r1"
export TEMPERATURE="0.7"
export TOP_P="0.9"
export MCP_PORT="8080"
```

## Usage in MCP Configuration

Add the server to your MCP configuration:

```json
{
  "mcpServers": {
    "deepseek": {
      "command": "mcp-deepseek-agent",
      "args": [],
      "env": {
        "OLLAMA_MODEL": "deepseek-r1"
      }
    }
  }
}
```

## Documentation

See [DOCUMENTATION.md](DOCUMENTATION.md) for detailed usage and API documentation.

## License

MIT License - see the LICENSE file for details.
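The environment variables listed under Configuration might be read with fallbacks along these lines. This is a minimal sketch: the variable names and defaults come from the README above, but the `load_config` helper itself is illustrative, not the package's actual API.

```python
import os


def load_config() -> dict:
    """Read server settings from environment variables, falling back to the
    defaults shown in the Configuration section (illustrative helper)."""
    return {
        "ollama_api_url": os.environ.get("OLLAMA_API_URL", "http://localhost:11434"),
        "ollama_model": os.environ.get("OLLAMA_MODEL", "deepseek-r1"),
        "temperature": float(os.environ.get("TEMPERATURE", "0.7")),
        "top_p": float(os.environ.get("TOP_P", "0.9")),
        "mcp_port": int(os.environ.get("MCP_PORT", "8080")),
    }


config = load_config()
print(config["ollama_model"])
```

Reading everything through `os.environ.get` with defaults is what makes the MCP-config `env` block above work: any variable set there overrides the fallback without code changes.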
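The "clean responses (removes thinking tags)" feature presumably strips the `<think>…</think>` blocks that deepseek-r1 emits before its final answer. A hedged sketch of such a cleaner, assuming that tag format; the function name and regex are illustrative, not the package's real implementation:

```python
import re

# deepseek-r1 wraps its chain-of-thought in <think>...</think> (may span lines).
THINK_RE = re.compile(r"<think>.*?</think>", re.DOTALL)


def strip_thinking(text: str) -> str:
    """Remove thinking blocks from a model response, returning only the
    final answer text (illustrative helper)."""
    return THINK_RE.sub("", text).strip()


raw = "<think>Let me work through this step by step...</think>The answer is 42."
print(strip_thinking(raw))  # → The answer is 42.
```

The non-greedy `.*?` with `re.DOTALL` keeps multiple or multiline thinking blocks from being merged into one match, so answer text between blocks survives.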