rlm_sub_query
Process large datasets by running targeted sub-queries against specific context chunks, using recursive LLM calls to analyze material that exceeds standard prompt limits.
Instructions
Make a sub-LLM call on a single chunk or on a filtered context. This is the core of the recursive pattern: split a large context into chunks, query each one, then combine the results (see the sketch after the argument list).
Args:

- `query`: Question or instruction for the sub-call
- `context_name`: Context identifier to query against
- `chunk_index`: Optional; a specific chunk index to query
- `provider`: LLM provider: `auto`, `ollama`, or `claude-sdk`. `auto` prefers Ollama when available (free local inference)
- `model`: Model to use (provider-specific defaults apply)
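The sketch below shows the fan-out/merge shape of the pattern in minimal Python. It assumes a hypothetical client-side helper `call_tool` standing in for however your MCP client invokes tools (it is not part of this server), and the queries and chunk count are illustrative.

```python
# Minimal sketch of the recursive pattern, assuming a client-side helper.
def call_tool(name: str, arguments: dict) -> str:
    """Hypothetical stand-in for your MCP client's tool-invocation call."""
    raise NotImplementedError("wire this to your MCP client")


def summarize_large_context(context_name: str, num_chunks: int) -> str:
    """Fan one sub-query out per chunk, then merge the partial answers."""
    partials = []
    for i in range(num_chunks):
        partials.append(call_tool("rlm_sub_query", {
            "query": "Summarize the key findings in this chunk.",
            "context_name": context_name,
            "chunk_index": i,      # target one chunk at a time
            "provider": "auto",    # prefer local Ollama when it is available
        }))
    # One final sub-call merges the partials, keeping every prompt small.
    return call_tool("rlm_sub_query", {
        "query": "Merge these chunk summaries into one answer:\n" + "\n".join(partials),
        "context_name": context_name,
    })
```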
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| query | Yes | Question or instruction for the sub-call | |
| context_name | Yes | Context identifier to query against | |
| chunk_index | No | Specific chunk index to query | |
| provider | No | LLM provider: `auto`, `ollama`, or `claude-sdk` | `auto` |
| model | No | Model to use (provider-specific defaults apply) | |
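For reference, here is one argument payload that satisfies the schema above; the context name and model are placeholder values, not defaults documented by the tool.

```python
# Illustrative arguments for a single rlm_sub_query call; "server_logs" and
# "llama3.1" are placeholders, not documented defaults.
arguments = {
    "query": "List every error code mentioned in this chunk.",
    "context_name": "server_logs",
    "chunk_index": 3,
    "provider": "ollama",
    "model": "llama3.1",
}
```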