## Server Configuration

Describes the environment variables used to configure the server. All variables are optional.

| Name | Required | Description | Default |
|---|---|---|---|
| OAI_BASE | No | The base URL for the OpenAI-compatible API | http://localhost:11434/v1 |
| OAI_API_KEY | No | API key; some backends ignore it (Ollama accepts any value) | |
| OAI_TIMEOUT_MS | No | Request timeout in milliseconds | |
| OAI_DEFAULT_MODEL | No | Fallback model name (e.g. llama3:latest) | |
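
As a reference point, here is a minimal sketch of how these variables could be resolved at startup, using the defaults from the table above. The `ServerConfig` shape and field names are illustrative assumptions, not this server's actual implementation.

```typescript
// Illustrative sketch: resolving the four variables from the table above.
interface ServerConfig {
  baseUrl: string;
  apiKey?: string;
  timeoutMs?: number;
  defaultModel?: string;
}

const config: ServerConfig = {
  // Defaults to a local Ollama instance when OAI_BASE is unset.
  baseUrl: process.env.OAI_BASE ?? "http://localhost:11434/v1",
  // Some backends ignore the key entirely; Ollama accepts any value.
  apiKey: process.env.OAI_API_KEY,
  // Undefined means no explicit request timeout.
  timeoutMs: process.env.OAI_TIMEOUT_MS
    ? Number(process.env.OAI_TIMEOUT_MS)
    : undefined,
  // Used when a tool call does not specify a model, e.g. "llama3:latest".
  defaultModel: process.env.OAI_DEFAULT_MODEL,
};
```
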
## Schema
### Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client.

| Name | Description |
|---|---|
| No resources | |
### Tools

Functions exposed to the LLM to take actions.

| Name | Description |
|---|---|
| openai_models_list | List models from OpenAI-compatible backend (GET /v1/models). |
| openai_chat_completions | Create chat completion (POST /v1/chat/completions). |
| openai_embeddings_create | Create embeddings (POST /v1/embeddings). |
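
Each tool maps to a single endpoint of the OpenAI-compatible API. The sketch below shows the underlying HTTP requests, assuming the default OAI_BASE and example model names; the request bodies follow the OpenAI wire format and are not necessarily the exact input schemas of these tools.

```typescript
// Illustrative sketch of the requests behind the three tools.
const base = process.env.OAI_BASE ?? "http://localhost:11434/v1";
const headers = { "Content-Type": "application/json" };

async function main() {
  // openai_models_list -> GET {base}/models
  const models = await fetch(`${base}/models`).then((r) => r.json());
  console.log(models);

  // openai_chat_completions -> POST {base}/chat/completions
  const chat = await fetch(`${base}/chat/completions`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      model: process.env.OAI_DEFAULT_MODEL ?? "llama3:latest",
      messages: [{ role: "user", content: "Say hello." }],
    }),
  }).then((r) => r.json());
  console.log(chat.choices?.[0]?.message?.content);

  // openai_embeddings_create -> POST {base}/embeddings
  const embeddings = await fetch(`${base}/embeddings`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      model: "nomic-embed-text", // example embedding model; an assumption
      input: ["text to embed"],
    }),
  }).then((r) => r.json());
  console.log(embeddings.data?.[0]?.embedding?.length);
}

main().catch(console.error);
```
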