# query_corpus
Answer natural-language questions by querying a saved corpus. Returns the AI response, or the assembled prompt when `mode` is `prompt-only`.
## Instructions
Answers a natural-language question against a saved corpus. Loads the corpus body, primes the configured AI provider with it as system context, and returns the response. When `mode="prompt-only"`, returns the assembled system+user prompt instead of calling the AI — useful for users without an AI provider configured, or for piping into another LLM. Returns JSON: `{ answer | prompt, corpus, tokens_used }`.
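The two modes above can be sketched as request payloads. The field names follow the schema below; the corpus name `docs`, the question text, and the response shapes shown in the comments are illustrative assumptions, not fixed values:

```python
# Hypothetical request payloads for the two modes of query_corpus.
# "answer" mode calls the AI provider; "prompt-only" skips the call
# and returns the assembled prompt for use elsewhere.
answer_call = {
    "name": "docs",                          # corpus slug (assumed example)
    "question": "How do I rotate API keys?", # natural-language question
    "mode": "answer",
    "max_tokens": 512,
}

# Same request, but asking for the assembled prompt instead of an answer.
prompt_only_call = {**answer_call, "mode": "prompt-only"}

# Expected response shapes (per the JSON contract above):
#   answer mode:      {"answer": "...", "corpus": "docs", "tokens_used": N}
#   prompt-only mode: {"prompt": "...", "corpus": "docs", "tokens_used": N}
```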
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | Corpus slug — alphanumeric + dash + underscore, ≤64 chars, must start with a letter or digit | |
| question | Yes | Natural-language question | |
| mode | No | `answer`: call the AI provider and return its reply; `prompt-only`: return the assembled prompt without calling the AI | `answer` |
| max_tokens | No | Cap on the AI response length | `1024` |
| temperature | No | Sampling temperature for the AI call | `0.2` |
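The `name` slug rule in the table can be expressed as a single regular expression. This is a sketch of the stated constraint only (alphanumeric plus dash and underscore, at most 64 characters, first character a letter or digit); the function name is hypothetical and not part of the tool's API:

```python
import re

# First character: letter or digit; up to 63 more characters drawn from
# letters, digits, underscore, and dash — 64 characters total at most.
SLUG_RE = re.compile(r"^[A-Za-z0-9][A-Za-z0-9_-]{0,63}$")

def is_valid_corpus_name(name: str) -> bool:
    """Check a corpus slug against the schema's stated rule (sketch)."""
    return bool(SLUG_RE.fullmatch(name))
```

For example, `docs-v2` passes, while `-docs` (leading dash) and any 65-character string fail.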