
DeepSeek MCP Server

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default
--- | --- | --- | ---
DEEPSEEK_API_KEY | Yes | Your DeepSeek API key | -
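
The API key is read from the environment of the server process. As a minimal sketch of how an MCP client could supply it when spawning the server over stdio (the npx command and package name deepseek-mcp-server are assumptions, not confirmed by this listing):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Hypothetical launch command; the actual package/binary name may differ.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "deepseek-mcp-server"],
  // Forward the required key to the server process.
  env: { ...process.env, DEEPSEEK_API_KEY: process.env.DEEPSEEK_API_KEY ?? "" } as Record<string, string>,
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);
```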


Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

Name | Description
--- | ---
DeepSeek Chat | General-purpose chat model optimized for dialogue
DeepSeek Reasoner | Model optimized for reasoning and problem-solving
Temperature | Controls randomness in the output (0.0 to 2.0)
Maximum Tokens | Maximum number of tokens to generate
Top P | Controls diversity via nucleus sampling (0.0 to 1.0)
Frequency Penalty | Reduces repetition by penalizing frequent tokens (-2.0 to 2.0)
Presence Penalty | Reduces repetition by penalizing used tokens (-2.0 to 2.0)
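
These resources are exposed through the standard MCP resource API. Their URIs are not shown in this listing, so a client can discover them at runtime; a sketch using the TypeScript SDK, continuing the `client` from the example above:

```typescript
// Discover the resources the server exposes, then read one of them.
// URIs are not listed on this page, so we take whatever the server reports.
const { resources } = await client.listResources();
for (const resource of resources) {
  console.log(resource.name, resource.uri);
}

if (resources.length > 0) {
  const contents = await client.readResource({ uri: resources[0].uri });
  console.log(contents);
}
```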

Tools

Functions exposed to the LLM to take actions

Name | Description
--- | ---
chat_completion | -
multi_turn_chat | -
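
The tool schemas are not published in this listing, so the argument names below (message, temperature, max_tokens) are assumptions based on the resource list above rather than a documented interface. A sketch of invoking chat_completion with the TypeScript SDK:

```typescript
// Hypothetical arguments; inspect the server's tool schema for the real names.
const result = await client.callTool({
  name: "chat_completion",
  arguments: {
    message: "Explain nucleus sampling in one paragraph.",
    temperature: 0.7, // 0.0 to 2.0
    max_tokens: 512,  // upper bound on generated tokens
  },
});
console.log(result.content);
```

In practice, call `client.listTools()` first and read each tool's inputSchema to confirm the arguments it actually accepts.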

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/liuchongchong1995/deepseek-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.