## Server Configuration
The following environment variables configure the server. Only `GEMINI_API_KEY` is required; every other variable falls back to a default.
| Name | Required | Description | Default |
|---|---|---|---|
| GEMINI_API_KEY | Yes | Your Gemini API key | |
| GEMINI_MODEL | No | The Gemini model to use | gemini-2.0-flash |
| GEMINI_TEMPERATURE | No | Temperature setting for the Gemini model | 0.7 |
| GEMINI_TOP_P | No | Top-P setting for the Gemini model | 0.9 |
| GEMINI_TOP_K | No | Top-K setting for the Gemini model | 40 |
| GEMINI_MAX_OUTPUT_TOKENS | No | Maximum number of output tokens | 2097152 |
| MAX_SESSIONS | No | Maximum number of sessions | 50 |
| SESSION_TIMEOUT_MINUTES | No | Session timeout in minutes | 120 |
| MAX_MESSAGE_LENGTH | No | Maximum message length | 1000000 |
| MAX_TOKENS_PER_SESSION | No | Maximum tokens per session | 2097152 |
| DEBUG | No | Enable debug mode | false |
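
For reference, here is a minimal sketch of how these variables might be read and defaulted at startup, assuming a Node.js/TypeScript server. The `loadConfig` and `numberFromEnv` helpers are illustrative only and are not the server's actual code; the names and defaults mirror the table above.

```typescript
// Illustrative config loader (assumption, not the server's real implementation).
// Variable names and defaults come from the table above.

interface ServerConfig {
  geminiApiKey: string;
  geminiModel: string;
  geminiTemperature: number;
  geminiTopP: number;
  geminiTopK: number;
  geminiMaxOutputTokens: number;
  maxSessions: number;
  sessionTimeoutMinutes: number;
  maxMessageLength: number;
  maxTokensPerSession: number;
  debug: boolean;
}

// Parse a numeric environment variable, falling back to the documented default.
function numberFromEnv(name: string, fallback: number): number {
  const raw = process.env[name];
  const parsed = raw === undefined ? NaN : Number(raw);
  return Number.isFinite(parsed) ? parsed : fallback;
}

export function loadConfig(): ServerConfig {
  const apiKey = process.env.GEMINI_API_KEY;
  if (!apiKey) {
    // GEMINI_API_KEY is the only required variable.
    throw new Error("GEMINI_API_KEY must be set");
  }
  return {
    geminiApiKey: apiKey,
    geminiModel: process.env.GEMINI_MODEL ?? "gemini-2.0-flash",
    geminiTemperature: numberFromEnv("GEMINI_TEMPERATURE", 0.7),
    geminiTopP: numberFromEnv("GEMINI_TOP_P", 0.9),
    geminiTopK: numberFromEnv("GEMINI_TOP_K", 40),
    geminiMaxOutputTokens: numberFromEnv("GEMINI_MAX_OUTPUT_TOKENS", 2097152),
    maxSessions: numberFromEnv("MAX_SESSIONS", 50),
    sessionTimeoutMinutes: numberFromEnv("SESSION_TIMEOUT_MINUTES", 120),
    maxMessageLength: numberFromEnv("MAX_MESSAGE_LENGTH", 1000000),
    maxTokensPerSession: numberFromEnv("MAX_TOKENS_PER_SESSION", 2097152),
    debug: (process.env.DEBUG ?? "false").toLowerCase() === "true",
  };
}
```
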
## Schema
### Prompts

Interactive templates invoked by user choice.

_No prompts are defined by this server._
### Resources

Contextual data attached and managed by the client.

_No resources are defined by this server._
### Tools

Functions exposed to the LLM to take actions.

_No tools are defined by this server._