## Server Configuration

The environment variables used to configure the server. Both are optional.

| Name | Required | Description | Default |
|---|---|---|---|
| host | No | The host URL of the LLM provider API (e.g., for Ollama) | http://localhost:11434/api/chat |
| openAISecretKey | No | Your OpenAI API secret key; needed only when using OpenAI as the LLM provider | |
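
As an illustration only, the sketch below shows how a server might read these variables at startup. The `loadConfig` helper, the TypeScript types, and the reliance on Node's `process.env` are assumptions; the variable names and the default URL come from the table above.

```typescript
// Minimal configuration sketch, not the server's actual implementation.
// Variable names and the default host URL come from the table above;
// Node-style process.env access is assumed.
interface ServerConfig {
  host: string;             // chat API endpoint of the LLM provider
  openAISecretKey?: string; // only relevant when OpenAI is the provider
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  return {
    host: env.host ?? "http://localhost:11434/api/chat",
    openAISecretKey: env.openAISecretKey,
  };
}

const config = loadConfig();
console.log(`Using LLM endpoint: ${config.host}`);
```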
## Schema

### Prompts

Interactive templates invoked by user choice

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client

| Name | Description |
|---|---|
| No resources | |
### Tools

Functions exposed to the LLM to take actions

| Name | Description |
|---|---|
| No tools | |