## Server Configuration

Environment variables used to configure the server (all are optional):

Name | Required | Description | Default |
---|---|---|---|
OLLAMA_MODEL | No | Ollama model to use | llama3.2:latest |
OLLAMA_BASE_URL | No | Base URL of the local Ollama server | http://localhost:11434 |
GEMINI_API_KEY | No | Your Gemini API key | |
OPENAI_API_KEY | No | Your OpenAI API key | |
ANTHROPIC_API_KEY | No | Your Claude API key | |
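
Since every variable is optional and only the Ollama settings ship with defaults, the server can presumably fall back to a local Ollama instance when no API keys are provided. The sketch below is a hypothetical illustration, not the server's actual code, of how these variables might be read; the names and defaults are taken from the table above.

```python
import os

# Hypothetical sketch: resolve the configuration from the environment.
# Variable names and defaults come from the table above; the structure
# of this dict is illustrative only.
config = {
    # Local Ollama settings, with the documented defaults.
    "ollama_model": os.environ.get("OLLAMA_MODEL", "llama3.2:latest"),
    "ollama_base_url": os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434"),
    # Optional hosted-provider keys; each is simply None when unset.
    "gemini_api_key": os.environ.get("GEMINI_API_KEY"),
    "openai_api_key": os.environ.get("OPENAI_API_KEY"),
    "anthropic_api_key": os.environ.get("ANTHROPIC_API_KEY"),
}
```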
## Schema

### Prompts

Interactive templates invoked by user choice.

Name | Description |
---|---|
No prompts | |

### Resources

Contextual data attached and managed by the client.

Name | Description |
---|---|
No resources | |

### Tools

Functions exposed to the LLM to take actions.

Name | Description |
---|---|
No tools | |