stream_large_file

Process large files efficiently by streaming them in manageable chunks, so oversized data can be handled step by step without loading the entire file into memory.

Instructions

Stream a large file in chunks. Returns multiple chunks for processing very large files efficiently.

Input Schema

Name         Required  Description                           Default
filePath     Yes       Absolute path to the file             —
chunkSize    No        Chunk size in bytes                   65536 (64 KB)
startOffset  No        Starting byte offset                  0
maxBytes     No        Maximum bytes to stream               — (no limit)
maxChunks    No        Maximum number of chunks to return    10
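
Only filePath is required. As an illustration, the arguments for a typical call might look like the sketch below; the file path and values are hypothetical, not part of the tool's schema.

    // Illustrative arguments for a stream_large_file call (all values hypothetical).
    const args = {
      filePath: '/var/log/app/server.log', // required: absolute path to the file
      chunkSize: 8 * 1024,                 // optional: 8 KB chunks instead of the 64 KB default
      startOffset: 0,                      // optional: begin at the start of the file
      maxChunks: 5,                        // optional: cap the response at 5 chunks
    };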

Implementation Reference

  • The primary handler function for the 'stream_large_file' tool. It extracts parameters from arguments, uses FileHandler.streamFile to generate chunks, limits to maxChunks, and formats the response as MCP tool content.
    private async handleStreamFile(
      args: Record<string, unknown>
    ): Promise<{ content: Array<{ type: string; text: string }> }> {
      const filePath = args.filePath as string;
      const chunkSize = (args.chunkSize as number) || 64 * 1024;
      const startOffset = args.startOffset as number | undefined;
      const maxBytes = args.maxBytes as number | undefined;
      const maxChunks = (args.maxChunks as number) || 10;

      const chunks: string[] = [];
      let chunkCount = 0;

      for await (const chunk of FileHandler.streamFile(filePath, {
        chunkSize,
        startOffset,
        maxBytes,
      })) {
        chunks.push(chunk);
        chunkCount++;
        if (chunkCount >= maxChunks) break;
      }

      return {
        content: [
          {
            type: 'text',
            text: JSON.stringify(
              {
                totalChunks: chunks.length,
                chunks,
                note:
                  chunks.length >= maxChunks
                    ? 'Reached maxChunks limit. Increase maxChunks or use startOffset to continue.'
                    : 'All chunks returned.',
              },
              null,
              2
            ),
          },
        ],
      };
    }
  • The input schema and metadata for the 'stream_large_file' tool, defining parameters, descriptions, and requirements. This is returned by getTools() for tool discovery.
    {
      name: 'stream_large_file',
      description:
        'Stream a large file in chunks. Returns multiple chunks for processing very large files efficiently.',
      inputSchema: {
        type: 'object',
        properties: {
          filePath: {
            type: 'string',
            description: 'Absolute path to the file',
          },
          chunkSize: {
            type: 'number',
            description: 'Chunk size in bytes (default: 65536 - 64KB)',
          },
          startOffset: {
            type: 'number',
            description: 'Starting byte offset (default: 0)',
          },
          maxBytes: {
            type: 'number',
            description: 'Maximum bytes to stream (optional)',
          },
          maxChunks: {
            type: 'number',
            description: 'Maximum number of chunks to return (default: 10)',
          },
        },
        required: ['filePath'],
      },
    },
  • src/server.ts:260-261 (registration)
    The switch case in handleToolCall that routes calls to the 'stream_large_file' tool to its handler function.
    case 'stream_large_file':
      return this.handleStreamFile(args);
  • The core streaming utility in FileHandler class. Creates a Node.js ReadStream with configurable chunk size, offset, and limits, yielding chunks as an async generator.
    static async *streamFile(
      filePath: string,
      options: StreamOptions = {}
    ): AsyncGenerator<string> {
      await this.verifyFile(filePath);

      const chunkSize = options.chunkSize || 64 * 1024; // 64KB default
      const encoding = options.encoding || 'utf-8';

      const stream = fs.createReadStream(filePath, {
        encoding,
        start: options.startOffset,
        end: options.maxBytes
          ? (options.startOffset || 0) + options.maxBytes
          : undefined,
        highWaterMark: chunkSize,
      });

      for await (const chunk of stream) {
        yield chunk;
      }
    }
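
Because a single response is capped by maxChunks, a caller that needs the entire file can advance startOffset by the bytes already received and call the tool again. A minimal client-side sketch, assuming a hypothetical callTool helper that sends the request and returns the JSON text produced by handleStreamFile, and assuming chunk boundaries do not split multi-byte characters:

    // Hypothetical helper: sends an MCP tool call and returns the tool's text payload.
    declare function callTool(
      name: string,
      args: Record<string, unknown>
    ): Promise<string>;

    // Sketch: read a whole file by paging with startOffset until fewer than
    // maxChunks chunks come back, which signals the end of the file.
    async function readWholeFile(filePath: string): Promise<string> {
      const chunkSize = 64 * 1024;
      const maxChunks = 10;
      let offset = 0;
      let text = '';
      for (;;) {
        const raw = await callTool('stream_large_file', {
          filePath,
          chunkSize,
          startOffset: offset,
          maxChunks,
        });
        const body = JSON.parse(raw) as {
          totalChunks: number;
          chunks: string[];
          note: string;
        };
        for (const chunk of body.chunks) {
          text += chunk;
          offset += Buffer.byteLength(chunk, 'utf-8'); // advance by bytes, not characters
        }
        if (body.totalChunks < maxChunks) break;
      }
      return text;
    }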

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/willianpinho/large-file-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.