## Server Configuration

Describes the environment variables used to configure the server. Only the Supabase variables are required; everything else falls back to a default.

| Name | Required | Description | Default |
|---|---|---|---|
| SUPABASE_URL | Yes | Your Supabase project URL | |
| SUPABASE_ANON_KEY | Yes | Your Supabase anonymous key | |
| SUPABASE_SERVICE_ROLE_KEY | Yes | Your Supabase service role key | |
| MCP_SERVER_HOST | No | Host for the MCP server | localhost |
| MCP_SERVER_PORT | No | Port for the MCP server | 3000 |
| LLM_BASE_URL | No | Base URL for the self-hosted LLM service | http://localhost:11434 |
| LLM_MODEL | No | The LLM model to use | llama2 |
| LLM_TIMEOUT | No | Timeout for LLM requests in milliseconds | 30000 |
| LOG_LEVEL | No | Logging level (debug, info, warn, error) | info |
| LOG_FORMAT | No | Log format (text or json) | json |
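
As one sketch of how these variables fit together, the TypeScript loader below (a hypothetical helper, not the server's actual code; the `ServerConfig` shape and `loadConfig` name are assumptions) reads each variable from `process.env`, applies the documented defaults, and fails fast when a required Supabase variable is missing.

```typescript
// config.ts — hypothetical loader applying the defaults from the table above.

export interface ServerConfig {
  supabaseUrl: string;
  supabaseAnonKey: string;
  supabaseServiceRoleKey: string;
  mcpServerHost: string;
  mcpServerPort: number;
  llmBaseUrl: string;
  llmModel: string;
  llmTimeoutMs: number;
  logLevel: "debug" | "info" | "warn" | "error";
  logFormat: "text" | "json";
}

// Throw early if a required variable is unset or empty.
function required(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

export function loadConfig(): ServerConfig {
  return {
    // Required — the server cannot start without these.
    supabaseUrl: required("SUPABASE_URL"),
    supabaseAnonKey: required("SUPABASE_ANON_KEY"),
    supabaseServiceRoleKey: required("SUPABASE_SERVICE_ROLE_KEY"),
    // Optional — fall back to the documented defaults.
    mcpServerHost: process.env.MCP_SERVER_HOST ?? "localhost",
    mcpServerPort: Number(process.env.MCP_SERVER_PORT ?? 3000),
    llmBaseUrl: process.env.LLM_BASE_URL ?? "http://localhost:11434",
    llmModel: process.env.LLM_MODEL ?? "llama2",
    llmTimeoutMs: Number(process.env.LLM_TIMEOUT ?? 30000),
    logLevel: (process.env.LOG_LEVEL ?? "info") as ServerConfig["logLevel"],
    logFormat: (process.env.LOG_FORMAT ?? "json") as ServerConfig["logFormat"],
  };
}
```

In practice, the three `SUPABASE_*` variables are the only values that must be set before starting the server; the remaining settings take the defaults listed in the table.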

## Schema

### Prompts

Interactive templates invoked by user choice

| Name | Description |
|---|---|
| No prompts | |

### Resources

Contextual data attached and managed by the client

| Name | Description |
|---|---|
| No resources | |

### Tools

Functions exposed to the LLM to take actions

| Name | Description |
|---|---|
| No tools | |