Glama

process_large_pdf

Break down large PDF files into smaller chunks to manage memory constraints. Specify a page range, chunk size, and summary mode for efficient processing of PDFs over 50 MB.

Instructions

Process a large PDF file in smaller chunks to handle memory constraints. Use this for PDFs over 50MB.
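The chunking strategy this tool describes can be sketched as follows. This is an illustrative sketch, not the server's actual implementation: the `page_chunks` helper is hypothetical, but its defaults mirror the schema below (1-based pages, 20 pages per chunk, all pages by default).

```python
def page_chunks(total_pages, start_page=1, end_page=None, chunk_size=20):
    """Yield (first, last) 1-based page ranges covering the requested span.

    Mirrors the tool's schema: start_page/end_page are 1-based and
    inclusive; end_page defaults to the last page of the document.
    """
    if end_page is None or end_page > total_pages:
        end_page = total_pages
    first = start_page
    while first <= end_page:
        # Each chunk covers at most chunk_size pages, clamped to end_page.
        last = min(first + chunk_size - 1, end_page)
        yield (first, last)
        first = last + 1


# A 45-page PDF with the default chunk size splits into three ranges:
# (1, 20), (21, 40), (41, 45).
```

Processing each `(first, last)` range separately keeps only one chunk's pages in memory at a time, which is what makes PDFs over 50 MB tractable.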

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `chunkSizePages` | No | Number of pages to process at a time | `20` |
| `endPage` | No | Ending page number (1-based) | all pages |
| `filePath` | Yes | Absolute path to the large PDF file | |
| `outputSummary` | No | Whether to provide a summary instead of full content | `true` |
| `startPage` | No | Starting page number (1-based) | `1` |
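A minimal example of the tool's input, assuming the schema above; the file path is a placeholder, and the optional fields shown take their default values:

```json
{
  "filePath": "/data/reports/annual-report.pdf",
  "startPage": 1,
  "chunkSizePages": 20,
  "outputSummary": true
}
```

Omitting `endPage` processes through the last page of the document.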


MCP directory API

We provide all the information about MCP servers via our MCP directory API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/multiluca2020/visum-thinker-mcp-server'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.