## Server Configuration

The environment variables used to configure the server. All are optional.
| Name | Required | Description | Default |
|---|---|---|---|
| LOG_LEVEL | No | Log level setting | info |
| DEFAULT_PROVIDER | No | Default provider to use | openai |
| DEFAULT_TEMPERATURE | No | Default temperature setting | 0.7 |
| OPENAI_API_KEY | No | Your OpenAI API key | |
| OPENAI_DEFAULT_MODEL | No | Default OpenAI model | gpt-4o-mini |
| OPENAI_NICKNAME | No | Display name for the OpenAI duck | GPT Duck |
| GEMINI_API_KEY | No | Your Google Gemini API key | |
| GEMINI_DEFAULT_MODEL | No | Default Gemini model | gemini-2.5-flash |
| GEMINI_NICKNAME | No | Display name for the Gemini duck | Gemini Duck |
| GROQ_API_KEY | No | Your Groq API key | |
| GROQ_DEFAULT_MODEL | No | Default Groq model | llama-3.3-70b-versatile |
| GROQ_NICKNAME | No | Display name for the Groq duck | Groq Duck |
| TOGETHER_API_KEY | No | Your Together AI API key | |
| OLLAMA_BASE_URL | No | Ollama base URL | http://localhost:11434/v1 |
| OLLAMA_DEFAULT_MODEL | No | Default Ollama model | llama3.2 |
| OLLAMA_NICKNAME | No | Display name for the Ollama duck | Local Duck |
| CUSTOM_API_KEY | No | Your custom provider API key | |
| CUSTOM_BASE_URL | No | Custom provider base URL | |
| CUSTOM_DEFAULT_MODEL | No | Default custom provider model | custom-model |
| CUSTOM_NICKNAME | No | Display name for the custom duck | Custom Duck |
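To illustrate how these variables interact with their defaults, here is a minimal sketch of reading a few of them with `os.environ`. The `resolve_settings` helper is hypothetical, written only for this example; it is not part of the server's code, and only a subset of the variables from the table is shown.

```python
import os

# Hypothetical helper: resolves a few settings from the environment,
# falling back to the documented defaults from the table above.
def resolve_settings(env=None):
    env = os.environ if env is None else env
    return {
        "log_level": env.get("LOG_LEVEL", "info"),
        "provider": env.get("DEFAULT_PROVIDER", "openai"),
        "temperature": float(env.get("DEFAULT_TEMPERATURE", "0.7")),
        "openai_model": env.get("OPENAI_DEFAULT_MODEL", "gpt-4o-mini"),
        "ollama_base_url": env.get("OLLAMA_BASE_URL", "http://localhost:11434/v1"),
    }

# With no variables set, every value falls back to its documented default.
print(resolve_settings(env={}))
```

Because every variable has a sensible default, the server can start with an empty environment; you only set the keys for the providers you actually use.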
## Schema
### Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client.

| Name | Description |
|---|---|
| No resources | |
### Tools

Functions exposed to the LLM so it can take actions.

| Name | Description |
|---|---|
| No tools | |