## Server Configuration

The server is configured through the following environment variables. All of them are optional; unset variables fall back to the defaults listed below.
| Name | Required | Description | Default |
|---|---|---|---|
| LLM_PROVIDER | No | LLM provider to use: `openai` or `lmstudio` | openai |
| OPENAI_API_KEY | No | OpenAI API key | (none) |
| OPENAI_MODEL | No | OpenAI model to use | gpt-4 |
| OPENAI_EMBEDDING_MODEL | No | OpenAI embedding model to use | text-embedding-3-small |
| LMSTUDIO_BASE_URL | No | LM Studio base URL | http://localhost:1234/v1 |
| LMSTUDIO_MODEL | No | LM Studio model name | (none) |
| ACE_CONTEXT_DIR | No | Directory where contexts are stored | ./contexts |
| ACE_MAX_PLAYBOOK_SIZE | No | Maximum number of bullets per context | 1000 |
| ACE_DEDUP_THRESHOLD | No | Similarity threshold for deduplication (0-1) | 0.85 |
| ACE_LOG_LEVEL | No | Logging level | info |
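As a rough illustration of how these variables map to runtime settings, the sketch below shows a hypothetical config loader for a Node.js/TypeScript server. The `ServerConfig` shape and `loadConfig` helper are assumptions for illustration and not part of the actual codebase, but the variable names and default values mirror the table above.

```typescript
// Illustrative sketch only: ServerConfig and loadConfig are hypothetical,
// but the environment variable names and defaults match the table above.
interface ServerConfig {
  llmProvider: "openai" | "lmstudio";
  openaiApiKey?: string;
  openaiModel: string;
  openaiEmbeddingModel: string;
  lmstudioBaseUrl: string;
  lmstudioModel?: string;
  contextDir: string;
  maxPlaybookSize: number;
  dedupThreshold: number;
  logLevel: string;
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  return {
    // Provider selection: anything other than "lmstudio" falls back to "openai".
    llmProvider: env.LLM_PROVIDER === "lmstudio" ? "lmstudio" : "openai",
    openaiApiKey: env.OPENAI_API_KEY,
    openaiModel: env.OPENAI_MODEL ?? "gpt-4",
    openaiEmbeddingModel: env.OPENAI_EMBEDDING_MODEL ?? "text-embedding-3-small",
    lmstudioBaseUrl: env.LMSTUDIO_BASE_URL ?? "http://localhost:1234/v1",
    lmstudioModel: env.LMSTUDIO_MODEL,
    contextDir: env.ACE_CONTEXT_DIR ?? "./contexts",
    maxPlaybookSize: Number(env.ACE_MAX_PLAYBOOK_SIZE ?? 1000),
    dedupThreshold: Number(env.ACE_DEDUP_THRESHOLD ?? 0.85),
    logLevel: env.ACE_LOG_LEVEL ?? "info",
  };
}
```

With no variables set, `loadConfig()` yields the defaults from the table: the OpenAI provider with `gpt-4`, embeddings via `text-embedding-3-small`, contexts stored under `./contexts`, and a deduplication threshold of 0.85.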
## Schema

### Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |
### Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| No tools | |