# Server Configuration

Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| NODE_ENV | No | Environment mode (e.g., development, production). | production |
| LOG_LEVEL | No | Logging level for the application. | info |
| O3_API_KEY | No | Optional O3 API key (defaults to OPENAI_API_KEY if not provided). | |
| MCP_PROTOCOL | No | Transport protocol to use for the server. | stdio |
| OPENAI_API_KEY | No | Optional OpenAI API key for GPT model integration. | |
| DEEPSEEK_API_KEY | Yes | DeepSeek API key required for the primary AI provider. | |
| ANTHROPIC_API_KEY | No | Optional Anthropic API key for Claude model support. | |
| MCP_DISABLE_CACHING | No | Whether to disable the memory and Redis-compatible caching system. | false |
| MCP_DISABLE_METRICS | No | Whether to disable performance monitoring and OpenTelemetry metrics. | false |
| MCP_DEFAULT_PROVIDER | No | Default AI provider to use for requests. | deepseek |
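The variables above can be exported in the shell before starting the server. A minimal sketch, assuming a POSIX-compatible shell; the `sk-...` values are placeholders, not real keys:

```shell
# DEEPSEEK_API_KEY is the only required variable (primary AI provider).
export DEEPSEEK_API_KEY="sk-..."          # placeholder value

# Optional overrides for the defaults listed in the table.
export NODE_ENV="development"             # default: production
export LOG_LEVEL="debug"                  # default: info
export MCP_PROTOCOL="stdio"               # default: stdio
export MCP_DEFAULT_PROVIDER="deepseek"    # default: deepseek

# Optional additional providers; O3 falls back to OPENAI_API_KEY when
# O3_API_KEY is not set.
export OPENAI_API_KEY="sk-..."            # placeholder value
export ANTHROPIC_API_KEY="sk-ant-..."     # placeholder value

# Feature toggles; caching and metrics are enabled by default.
export MCP_DISABLE_CACHING="false"
export MCP_DISABLE_METRICS="false"
```

Only `DEEPSEEK_API_KEY` must be set; everything else falls back to the defaults shown in the table.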
## Tools

Functions exposed to the LLM so it can take actions.
| Name | Description |
|---|---|
| _No tools_ | |
## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| _No prompts_ | |
## Resources

Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| _No resources_ | |