# Server Configuration

Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| API_KEY | Yes | API key for the upstream AI provider | |
| API_FORMAT | Yes | API format of the upstream provider (e.g. Anthropic or OpenAI) | |
| HTTP_PROXY | No | HTTP proxy URL | |
| DESCRIPTION | No | Custom model description | |
| HTTPS_PROXY | No | HTTPS proxy URL | |
| API_ENDPOINT | No | Custom endpoint | |
| DEFAULT_MODEL | No | Default model | |
| ANTHROPIC_VERSION | No | Anthropic API version | |
| DEFAULT_MAX_TOKENS | No | Default max tokens setting | |
| DEFAULT_TEMPERATURE | No | Default temperature setting | |
| OPENAI_ORGANIZATION | No | OpenAI organization ID | |
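A minimal sketch of validating this configuration at startup. The variable names come from the table above; the loader function itself (`load_config`) is illustrative, not part of the server:

```python
import os

# Variable names taken from the configuration table.
REQUIRED = ["API_KEY", "API_FORMAT"]
OPTIONAL = [
    "HTTP_PROXY", "HTTPS_PROXY", "DESCRIPTION", "API_ENDPOINT",
    "DEFAULT_MODEL", "ANTHROPIC_VERSION", "DEFAULT_MAX_TOKENS",
    "DEFAULT_TEMPERATURE", "OPENAI_ORGANIZATION",
]

def load_config(env=os.environ):
    """Collect settings from the environment, failing fast if a required variable is unset."""
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")
    # Keep only variables that are actually set, so server defaults apply otherwise.
    return {name: env[name] for name in REQUIRED + OPTIONAL if env.get(name)}
```

Failing fast on `API_KEY` and `API_FORMAT` surfaces misconfiguration at launch rather than on the first request.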
# Schema

## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
## Resources

Contextual data attached to and managed by the client.
| Name | Description |
|---|---|
| No resources | |
## Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| chat_completion | Sends a chat completion request to the configured AI API provider (ANTHROPIC). Supports parameters such as model, messages, temperature, max_tokens, and stream. Returns the raw response from the API without format conversion. |
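A sketch of assembling the arguments a client might pass to `chat_completion`. The parameter names come from the tool description above; the helper function, the placeholder model name, and the exact message shape are assumptions for illustration:

```python
def build_chat_completion_args(model, messages, temperature=None,
                               max_tokens=None, stream=False):
    """Assemble a chat_completion argument dict, omitting options left unset
    so the server's DEFAULT_TEMPERATURE / DEFAULT_MAX_TOKENS can apply."""
    args = {"model": model, "messages": messages, "stream": stream}
    if temperature is not None:
        args["temperature"] = temperature
    if max_tokens is not None:
        args["max_tokens"] = max_tokens
    return args

request = build_chat_completion_args(
    model="example-model",  # placeholder, not a documented model name
    messages=[{"role": "user", "content": "Hello"}],
    max_tokens=256,
)
```

Omitting unset optional parameters, rather than sending explicit nulls, lets the server fall back to its configured defaults.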