stream_large_file

Process large files efficiently by streaming them in manageable chunks, avoiding memory exhaustion while enabling sequential processing of the data.

Instructions

Stream a large file in chunks. Returns multiple chunks for processing very large files efficiently.

Input Schema

Name          Required  Description                          Default
filePath      Yes       Absolute path to the file            (required)
chunkSize     No        Chunk size in bytes                  65536 (64 KB)
startOffset   No        Starting byte offset                 0
maxBytes      No        Maximum bytes to stream              (none)
maxChunks     No        Maximum number of chunks to return   10
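
For example, a client might call stream_large_file with arguments like the following; the file path and sizes here are illustrative, not defaults from the server:

    const args = {
      filePath: '/var/log/app.log', // illustrative path, must be absolute
      chunkSize: 128 * 1024,        // 131072 bytes per chunk
      startOffset: 0,               // begin at the start of the file
      maxBytes: 1024 * 1024,        // stop after roughly 1 MB
      maxChunks: 8,                 // cap the number of chunks in the response
    };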

Implementation Reference

  • The handleStreamFile method that executes the stream_large_file tool logic: it consumes chunks from the FileHandler.streamFile generator, collects up to maxChunks of them, and returns a JSON-formatted response. A client-side sketch that parses this response appears under Usage Sketches below.
    private async handleStreamFile(
      args: Record<string, unknown>
    ): Promise<{ content: Array<{ type: string; text: string }> }> {
      const filePath = args.filePath as string;
      const chunkSize = (args.chunkSize as number) || 64 * 1024;
      const startOffset = args.startOffset as number | undefined;
      const maxBytes = args.maxBytes as number | undefined;
      const maxChunks = (args.maxChunks as number) || 10;

      // Collect chunks from the streaming generator, stopping at maxChunks.
      const chunks: string[] = [];
      let chunkCount = 0;
      for await (const chunk of FileHandler.streamFile(filePath, {
        chunkSize,
        startOffset,
        maxBytes,
      })) {
        chunks.push(chunk);
        chunkCount++;
        if (chunkCount >= maxChunks) break;
      }

      // Return the chunks plus a note indicating whether the stream was cut
      // short by the maxChunks limit.
      return {
        content: [
          {
            type: 'text',
            text: JSON.stringify(
              {
                totalChunks: chunks.length,
                chunks,
                note:
                  chunks.length >= maxChunks
                    ? 'Reached maxChunks limit. Increase maxChunks or use startOffset to continue.'
                    : 'All chunks returned.',
              },
              null,
              2
            ),
          },
        ],
      };
    }
  • The core FileHandler.streamFile async generator, which creates a Node.js fs.ReadStream with the specified chunk size and offsets and yields string chunks from the large file. A standalone consumption sketch appears under Usage Sketches below.
    static async *streamFile(
      filePath: string,
      options: StreamOptions = {}
    ): AsyncGenerator<string> {
      // Validate the path before opening a read stream.
      await this.verifyFile(filePath);

      const chunkSize = options.chunkSize || 64 * 1024; // 64KB default
      const encoding = options.encoding || 'utf-8';

      // highWaterMark controls how many bytes each emitted chunk contains.
      const stream = fs.createReadStream(filePath, {
        encoding,
        start: options.startOffset,
        end: options.maxBytes
          ? (options.startOffset || 0) + options.maxBytes
          : undefined,
        highWaterMark: chunkSize,
      });

      for await (const chunk of stream) {
        yield chunk;
      }
    }
  • src/server.ts:213-241 (registration)
    Registration of the stream_large_file tool in the getTools() method, including name, description, and input schema.
    {
      name: 'stream_large_file',
      description:
        'Stream a large file in chunks. Returns multiple chunks for processing very large files efficiently.',
      inputSchema: {
        type: 'object',
        properties: {
          filePath: {
            type: 'string',
            description: 'Absolute path to the file',
          },
          chunkSize: {
            type: 'number',
            description: 'Chunk size in bytes (default: 65536 - 64KB)',
          },
          startOffset: {
            type: 'number',
            description: 'Starting byte offset (default: 0)',
          },
          maxBytes: {
            type: 'number',
            description: 'Maximum bytes to stream (optional)',
          },
          maxChunks: {
            type: 'number',
            description: 'Maximum number of chunks to return (default: 10)',
          },
        },
        required: ['filePath'],
      },
    },
  • src/server.ts:260-261 (registration)
    Dispatch case in the handleToolCall switch statement that routes calls to the stream_large_file handler.
    case 'stream_large_file':
      return this.handleStreamFile(args);
  • The TypeScript StreamOptions interface used by the streaming functionality; an example options object appears under Usage Sketches below.
    export interface StreamOptions {
      /** Chunk size for streaming */
      chunkSize?: number;
      /** Starting byte offset */
      startOffset?: number;
      /** Maximum bytes to stream */
      maxBytes?: number;
      /** Encoding */
      encoding?: BufferEncoding;
    }
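
Usage Sketches

The following sketches illustrate how the pieces above fit together. They are not part of the server source; names introduced here (StreamResult, parseStreamResponse, countLines) are hypothetical.

A minimal client-side sketch for the JSON payload produced by handleStreamFile: parse the first content item and check the note field to decide whether another call is needed.

    interface StreamResult {
      totalChunks: number;
      chunks: string[];
      note: string;
    }

    function parseStreamResponse(response: {
      content: Array<{ type: string; text: string }>;
    }): StreamResult {
      const result = JSON.parse(response.content[0].text) as StreamResult;
      if (result.note.startsWith('Reached maxChunks limit')) {
        // More data may remain: call the tool again with a larger maxChunks,
        // or with startOffset advanced past the bytes already received.
      }
      return result;
    }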
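
A minimal sketch of consuming FileHandler.streamFile directly, assuming FileHandler is imported from this package; it counts newline characters in a large file without ever holding the whole file in memory.

    async function countLines(filePath: string): Promise<number> {
      let lines = 0;
      for await (const chunk of FileHandler.streamFile(filePath, {
        chunkSize: 256 * 1024, // larger chunks mean fewer iterations
      })) {
        for (const ch of chunk) {
          if (ch === '\n') lines++;
        }
      }
      return lines;
    }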
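
An options object typed against StreamOptions that reads roughly the second megabyte of a file in 128 KB chunks; the specific values are arbitrary.

    const options: StreamOptions = {
      chunkSize: 128 * 1024,    // 128 KB per chunk
      startOffset: 1024 * 1024, // skip the first 1 MB
      maxBytes: 1024 * 1024,    // read roughly 1 MB
      encoding: 'utf-8',
    };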
