# Server Configuration
This section describes the environment variables used to configure the server. Each variable is optional on its own, but a provider is only available when its corresponding key is set.
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `GOOGLE_API_KEY` | No | Your Google AI API key | |
| `OPENAI_API_KEY` | No | Your OpenAI API key | |
| `OLLAMA_BASE_URL` | No | Ollama instance URL | `http://localhost:11434` |
| `DEEPSEEK_API_KEY` | No | Your DeepSeek API key | |
| `ANTHROPIC_API_KEY` | No | Your Anthropic API key | |
| `OPENROUTER_API_KEY` | No | Your OpenRouter API key | |
| `OPENAI_COMPATIBLE_API_KEY` | No | API key for OpenAI-compatible services | |
| `OPENAI_COMPATIBLE_API_MODELS` | No | Comma-separated list of available models for OpenAI-compatible services | |
| `OPENAI_COMPATIBLE_API_BASE_URL` | No | Base URL for OpenAI-compatible services | |
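The variables above can be exported in the shell before launching the server. A minimal sketch enabling two providers — the key value and the OpenAI-compatible endpoint are placeholders, not real values:

```shell
# Enable the OpenAI provider (placeholder key -- substitute your real key).
export OPENAI_API_KEY="sk-..."

# Point at a local Ollama instance; this matches the table's default,
# so it only needs to be set when Ollama runs elsewhere.
export OLLAMA_BASE_URL="http://localhost:11434"

# Hypothetical OpenAI-compatible endpoint with two advertised models.
export OPENAI_COMPATIBLE_API_BASE_URL="https://api.example.com/v1"
export OPENAI_COMPATIBLE_API_MODELS="model-a,model-b"
```

Unset variables simply leave the corresponding provider unconfigured.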
## Schema
### Prompts

Interactive templates invoked by user choice.

This server defines no prompts.
### Resources

Contextual data attached and managed by the client.

This server defines no resources.
### Tools

Functions exposed to the LLM so it can take actions.

| Name | Description |
| --- | --- |
| `getSecondOpinion` | Get responses from various LLM providers |
| `listProviders` | List all configured LLM providers and their available models |
| `listReasoningModels` | List all available models that support reasoning capabilities |
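An MCP client invokes these tools with a JSON-RPC 2.0 `tools/call` request. The sketch below builds such a request for `getSecondOpinion`; the argument names (`provider`, `model`, `prompt`) are assumptions for illustration, not the server's documented input schema — check the tool's actual schema via `tools/list` before use:

```python
import json

# Hypothetical "tools/call" request for the getSecondOpinion tool.
# The "arguments" keys below are assumed, not taken from the server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "getSecondOpinion",
        "arguments": {
            "provider": "openai",           # assumed parameter
            "model": "gpt-4o",              # assumed parameter
            "prompt": "Review this plan.",  # assumed parameter
        },
    },
}

# Serialize for transport (MCP servers typically speak JSON-RPC over stdio).
print(json.dumps(request, indent=2))
```

`listProviders` and `listReasoningModels` would be called the same way, with their own `name` and (likely empty) `arguments`.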