## Server Configuration

Environment variables used to configure the server. All are optional; any variable left unset falls back to the default below.
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| LLM_MODEL | No | Model identifier sent to the API | openai/gpt-oss-20b |
| LOG_LEVEL | No | Log verbosity: error, warn, info, or debug | info |
| LLM_API_KEY | No | Bearer token for the API | |
| LLM_API_BASE | No | OpenAI-compatible base URL | http://localhost:1234/v1 |
| LLM_BACKOFF_MS | No | Initial backoff delay in milliseconds | 250 |
| LLM_TIMEOUT_MS | No | Request timeout in milliseconds | 60000 |
| LLM_MAX_RETRIES | No | Retry count for retryable HTTP/network errors | 1 |
| ENFORCE_LOCAL_API | No | If true, only allow localhost API base URLs | false |
| LLM_BACKOFF_JITTER | No | Jitter factor applied to the backoff delay (0..1) | 0.2 |
| RETOUCH_CONTENT_MAX_RETRIES | No | Retries when the cleaner returns non-JSON content | 1 |
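To make the backoff knobs concrete, here is a minimal sketch of how LLM_BACKOFF_MS and LLM_BACKOFF_JITTER could combine. The exponential doubling across attempts is an assumption (the table only documents the *initial* delay and the jitter factor), so treat this as illustrative, not as the server's exact retry schedule:

```python
import random

def backoff_delay_ms(attempt: int,
                     base_ms: int = 250,     # LLM_BACKOFF_MS default
                     jitter: float = 0.2) -> float:
    """Delay before retry `attempt` (0-based).

    Assumes exponential doubling per attempt; only the initial
    delay and the jitter factor are documented above.
    """
    delay = base_ms * (2 ** attempt)
    # A jitter factor j in 0..1 widens the delay to [delay*(1-j), delay*(1+j)].
    return random.uniform(delay * (1 - jitter), delay * (1 + jitter))
```

With the defaults, the first retry would wait roughly 200-300 ms and the second roughly 400-600 ms.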
## Schema
### Prompts

Interactive templates invoked by user choice.
| Name | Description |
| --- | --- |
| No prompts | |
### Resources

Contextual data attached and managed by the client.
| Name | Description |
| --- | --- |
| No resources | |
### Tools

Functions exposed to the LLM to take actions.
| Name | Description |
| --- | --- |
| cleaner | Pre-reasoning prompt normalizer and PII redactor. Use when: you receive raw/free-form user text and need it cleaned before planning, tool selection, or code execution. Does: normalize tone, structure the ask, and redact secrets; preserves user intent. Safe: read-only, idempotent, no side effects (good default to run automatically). Input: { prompt, mode?, temperature? } — defaults mode='general', temperature=0.2; mode='code' only for code-related prompts. Output: JSON { retouched, notes?, openQuestions?, risks?, redactions? }. Keywords: clean, sanitize, normalize, redact, structure, preprocess, guardrails |
| sanitize-text | Alias of cleaner. Keywords: sanitize, scrub, redact, filter, pii, normalize, preprocess. Same input/output schema as 'cleaner'. |
| normalize-prompt | Alias of cleaner. Keywords: normalize, restructure, clarify, tighten, format, preflight. Same input/output schema as 'cleaner'. |
| health-ping | Liveness probe; returns { ok: true } |
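The cleaner's documented output is JSON with a required `retouched` field and optional `notes`, `openQuestions`, `risks`, and `redactions`. A minimal client-side sketch of checking that shape — which is also the failure mode RETOUCH_CONTENT_MAX_RETRIES guards against — might look like this (the sample payload is hypothetical):

```python
import json

REQUIRED = {"retouched"}
OPTIONAL = {"notes", "openQuestions", "risks", "redactions"}

def parse_cleaner_output(raw: str) -> dict:
    """Parse the cleaner's reply and verify the documented shape.

    Raises ValueError on non-JSON content, the case that
    RETOUCH_CONTENT_MAX_RETRIES exists to retry.
    """
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as e:
        raise ValueError(f"cleaner returned non-JSON content: {e}") from e
    if missing := REQUIRED - data.keys():
        raise ValueError(f"missing required keys: {missing}")
    if unknown := data.keys() - REQUIRED - OPTIONAL:
        raise ValueError(f"unexpected keys: {unknown}")
    return data

# Hypothetical reply for illustration:
reply = '{"retouched": "Summarize the attached report.", "redactions": ["API key"]}'
```

A caller would retry the tool (up to RETOUCH_CONTENT_MAX_RETRIES times) whenever `parse_cleaner_output` raises.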