## Server Configuration

The server is configured through the following environment variables. `OPENAI_API_KEY`, `PHABRICATOR_HOST`, and `PHABRICATOR_TOKEN` are required; the rest fall back to the defaults listed below.
| Name | Required | Description | Default |
|---|---|---|---|
| PHABRICATOR_HOST | Yes | Phabricator host URL | |
| PHABRICATOR_TOKEN | Yes | Phabricator API token | |
| OPENAI_API_KEY | Yes | OpenAI API key | |
| OPENAI_MODEL | No | OpenAI model to use | gpt-4 |
| OPENAI_BASE_URL | No | OpenAI API base URL | https://api.openai.com/v1 |
| EMBEDDING_MODEL | No | Embedding model to use | text-embedding-3-small |
| EMBEDDING_BASE_URL | No | Embedding API base URL (defaults to OPENAI_BASE_URL if not set) | |
| MODEL_TEMPERATURE | No | Model temperature parameter (range 0-2) | 0 |
| MODEL_TOP_P | No | Model top_p parameter (range 0-1) | 1 |
| CACHE_DIR | No | Cache directory path | .cache |
| STATE_DIR | No | State directory path | .state |
| ALLOW_PUBLISH_COMMENTS | No | Allow publishing comments to Phabricator (set to `true` to enable) | false |
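
To make the table concrete, here is a minimal TypeScript sketch of how a startup routine might read these variables, apply the documented defaults, and fail fast when a required value is missing. The helper functions (`requireEnv`, `numberInRange`) and the shape of the `config` object are illustrative assumptions, not the server's actual code.

```typescript
// Sketch only: reads the environment variables from the table above with their
// documented defaults. Helper names and the config shape are illustrative.

function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

function numberInRange(name: string, fallback: number, min: number, max: number): number {
  const raw = process.env[name];
  const value = raw === undefined ? fallback : Number(raw);
  if (Number.isNaN(value) || value < min || value > max) {
    throw new Error(`${name} must be a number between ${min} and ${max}`);
  }
  return value;
}

const config = {
  // Required settings: startup fails fast if any of these are unset.
  phabricatorHost: requireEnv("PHABRICATOR_HOST"),
  phabricatorToken: requireEnv("PHABRICATOR_TOKEN"),
  openaiApiKey: requireEnv("OPENAI_API_KEY"),

  // Optional settings with the defaults listed in the table.
  openaiModel: process.env.OPENAI_MODEL ?? "gpt-4",
  openaiBaseUrl: process.env.OPENAI_BASE_URL ?? "https://api.openai.com/v1",
  embeddingModel: process.env.EMBEDDING_MODEL ?? "text-embedding-3-small",
  // EMBEDDING_BASE_URL falls back to OPENAI_BASE_URL when not set.
  embeddingBaseUrl:
    process.env.EMBEDDING_BASE_URL ??
    process.env.OPENAI_BASE_URL ??
    "https://api.openai.com/v1",
  modelTemperature: numberInRange("MODEL_TEMPERATURE", 0, 0, 2),
  modelTopP: numberInRange("MODEL_TOP_P", 1, 0, 1),
  cacheDir: process.env.CACHE_DIR ?? ".cache",
  stateDir: process.env.STATE_DIR ?? ".state",
  allowPublishComments: process.env.ALLOW_PUBLISH_COMMENTS === "true",
};

export default config;
```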
## Schema

### Prompts

Interactive templates invoked by user choice.

This server does not define any prompts.
### Resources

Contextual data attached and managed by the client.

This server does not define any resources.
### Tools

Functions exposed to the LLM to take actions.

This server does not define any tools.