## Server Configuration

The environment variables below configure the server. None are individually required; set the ones for the model provider you intend to use.
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `CUSTOM_API_URL` | No | URL for local models (e.g., `http://localhost:11434/v1` for Ollama) | |
| `GEMINI_API_KEY` | No | Your Gemini API key from Google AI Studio | |
| `OPENAI_API_KEY` | No | Your OpenAI API key from the OpenAI Platform | |
| `CUSTOM_MODEL_NAME` | No | Model name for local models (e.g., `llama3.2`) | |
| `OPENROUTER_API_KEY` | No | Your OpenRouter API key, for access to 100+ models | |
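As a sketch of a typical setup, the variables can be exported in the shell before starting the server. The values below reuse the local-model example from the table (an Ollama instance on its default port); substitute a hosted-provider key instead if you prefer.

```sh
# Local model via Ollama (example values from the table above)
export CUSTOM_API_URL="http://localhost:11434/v1"
export CUSTOM_MODEL_NAME="llama3.2"

# Or configure a hosted provider instead by setting its API key:
# export GEMINI_API_KEY="<your Google AI Studio key>"
# export OPENAI_API_KEY="<your OpenAI Platform key>"
# export OPENROUTER_API_KEY="<your OpenRouter key>"
```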
## Schema
### Prompts

Interactive templates invoked by user choice.

| Name | Description |
| --- | --- |
| No prompts | |
### Resources

Contextual data attached to and managed by the client.

| Name | Description |
| --- | --- |
| No resources | |
### Tools

Functions exposed to the LLM so it can take actions.

| Name | Description |
| --- | --- |
| No tools | |