## Server Configuration

Describes the environment variables used to configure the server; all of them are optional.

| Name | Required | Description | Default |
|---|---|---|---|
| OPENAI_API_KEY | No | API key for OpenAI provider | |
| OPENAI_MODEL | No | OpenAI model to use | gpt-4o-mini |
| OPENAI_BASE_URL | No | Optional base URL for OpenAI API | |
| ANTHROPIC_API_KEY | No | API key for Anthropic provider | |
| ANTHROPIC_MODEL | No | Anthropic model to use | claude-3-haiku-20240307 |
| ANTHROPIC_BASE_URL | No | Optional base URL for Anthropic API | |
| GOOGLE_API_KEY | No | API key for Gemini provider | |
| GEMINI_MODEL | No | Gemini model to use | gemini-1.5-flash |
| GEMINI_BASE_URL | No | Optional base URL for Gemini API | |
| OLLAMA_MODEL | No | Ollama model to use | llama3.1:8b-instruct |
| OLLAMA_BASE_URL | No | Base URL for Ollama API | |
| OLLAMA_HOST | No | Host for Ollama API (alternative to OLLAMA_BASE_URL) | |
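
The table only lists the variables; it does not prescribe how the server reads them. The sketch below, in Python, shows one way a per-provider lookup could work, using the documented names and defaults. The `provider_config` helper and its structure are illustrative assumptions, not part of this server's code.

```python
import os

# Model defaults as documented in the table above.
DEFAULT_MODELS = {
    "OPENAI_MODEL": "gpt-4o-mini",
    "ANTHROPIC_MODEL": "claude-3-haiku-20240307",
    "GEMINI_MODEL": "gemini-1.5-flash",
    "OLLAMA_MODEL": "llama3.1:8b-instruct",
}

# Gemini reads its key from GOOGLE_API_KEY; Ollama needs no key.
API_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GOOGLE_API_KEY",
}

def provider_config(provider: str) -> dict:
    """Resolve API key, model, and base URL for one provider from the environment."""
    prefix = provider.upper()
    base_url = os.environ.get(f"{prefix}_BASE_URL")
    if provider == "ollama" and base_url is None:
        # OLLAMA_HOST is accepted as an alternative to OLLAMA_BASE_URL.
        base_url = os.environ.get("OLLAMA_HOST")
    return {
        "api_key": os.environ.get(API_KEY_VARS.get(provider, "")),
        "model": os.environ.get(f"{prefix}_MODEL", DEFAULT_MODELS[f"{prefix}_MODEL"]),
        "base_url": base_url,
    }

if __name__ == "__main__":
    # Falls back to gpt-4o-mini when OPENAI_MODEL is unset.
    print(provider_config("openai"))
```
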
## Schema
### Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client.

| Name | Description |
|---|---|
| No resources | |
### Tools

Functions exposed to the LLM to take actions.

| Name | Description |
|---|---|
| No tools | |
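
Since this server currently defines no prompts, resources, or tools, a connected client should see empty lists for all three primitives. The following is a minimal sketch using the official MCP Python SDK (the `mcp` package), assuming the server runs over stdio; the launch command, arguments, and environment values are placeholders for however this server is actually started, and a server that omits a capability entirely may reject the corresponding list call.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Placeholder launch command and env; substitute the real ones for this server.
    params = StdioServerParameters(
        command="python",
        args=["server.py"],
        env={"OPENAI_API_KEY": "sk-..."},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Each result object carries a list; all three are expected to be empty here.
            print("prompts:", (await session.list_prompts()).prompts)
            print("resources:", (await session.list_resources()).resources)
            print("tools:", (await session.list_tools()).tools)

if __name__ == "__main__":
    asyncio.run(main())
```
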