by egoughnour

rlm_auto_analyze

Automatically analyzes large content by detecting its type and applying optimal chunking strategies for tasks like summarization, bug detection, structure extraction, security audits, or custom queries.

Instructions

Automatically detect content type and analyze with optimal chunking strategy.

One-step analysis for common tasks.

Args:

- name: Context identifier
- content: The content to analyze
- goal: Analysis goal: 'summarize', 'find_bugs', 'extract_structure', 'security_audit', or an 'answer:' custom query
- provider: LLM provider; 'auto' prefers Ollama if available
- concurrency: Max parallel requests (default 4, max 8)

Input Schema

| Name        | Required | Description                                                                                                   | Default |
|-------------|----------|---------------------------------------------------------------------------------------------------------------|---------|
| name        | Yes      | Context identifier                                                                                            |         |
| content     | Yes      | The content to analyze                                                                                        |         |
| goal        | Yes      | Analysis goal: 'summarize', 'find_bugs', 'extract_structure', 'security_audit', or an 'answer:' custom query  |         |
| provider    | No       | LLM provider; 'auto' prefers Ollama if available                                                              | auto    |
| concurrency | No       | Max parallel requests (default 4, max 8)                                                                      |         |
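For orientation, here is a minimal sketch of invoking rlm_auto_analyze from the official MCP Python SDK over stdio. The launch command (`python -m massive_context_mcp`), the input file, and the sample argument values are assumptions for illustration; adjust them to however you actually run the massive-context-mcp server.

```python
# Minimal sketch of calling rlm_auto_analyze via the MCP Python SDK (stdio transport).
# ASSUMPTION: the server is started with "python -m massive_context_mcp"; your launch
# command and the sample content/goal below may differ.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical launch command for the massive-context-mcp server.
    server = StdioServerParameters(command="python", args=["-m", "massive_context_mcp"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Required: name, content, goal. Optional: provider (default 'auto'),
            # concurrency (default 4, max 8).
            result = await session.call_tool(
                "rlm_auto_analyze",
                arguments={
                    "name": "example-log",               # context identifier
                    "content": open("app.log").read(),   # hypothetical input file
                    "goal": "summarize",                  # or 'find_bugs', 'security_audit', ...
                    "provider": "auto",                   # prefers Ollama if available
                    "concurrency": 4,
                },
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```

The returned content carries the analysis produced for the chosen goal, so the same call pattern covers summarization, bug finding, structure extraction, security audits, or an 'answer:' query.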


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/egoughnour/massive-context-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.