# stream_large_file
Process large files efficiently by streaming them in manageable chunks, so data can be handled sequentially without loading the entire file into memory.
## Instructions
Stream a large file in chunks. Returns multiple chunks for processing very large files efficiently.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| filePath | Yes | Absolute path to the file | |
| chunkSize | No | Chunk size in bytes | 65536 (64 KB) |
| startOffset | No | Starting byte offset | 0 |
| maxBytes | No | Maximum bytes to stream | |
| maxChunks | No | Maximum number of chunks to return | 10 |
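
For illustration, the arguments a client might pass look like this. A minimal sketch only: the path is hypothetical, the values are arbitrary, and only `filePath` is required.

```typescript
// Illustrative arguments for a stream_large_file call.
const args = {
  filePath: '/var/log/app/server.log', // hypothetical absolute path
  chunkSize: 131072,                   // 128 KB per chunk (default: 65536)
  startOffset: 0,                      // begin at the start of the file (default: 0)
  maxChunks: 8,                        // 8 chunks x 128 KB ~= the first 1 MB of the file
};
```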
## Implementation Reference
- **src/server.ts:385-421 (handler)**: The `handleStreamFile` method that executes the `stream_large_file` tool logic by consuming chunks from the `FileHandler.streamFile` generator, collecting up to `maxChunks`, and returning a JSON-formatted response (see the paging sketch after this list).

  ```typescript
  private async handleStreamFile(
    args: Record<string, unknown>
  ): Promise<{ content: Array<{ type: string; text: string }> }> {
    const filePath = args.filePath as string;
    const chunkSize = (args.chunkSize as number) || 64 * 1024;
    const startOffset = args.startOffset as number | undefined;
    const maxBytes = args.maxBytes as number | undefined;
    const maxChunks = (args.maxChunks as number) || 10;

    const chunks: string[] = [];
    let chunkCount = 0;

    for await (const chunk of FileHandler.streamFile(filePath, {
      chunkSize,
      startOffset,
      maxBytes,
    })) {
      chunks.push(chunk);
      chunkCount++;
      if (chunkCount >= maxChunks) break;
    }

    return {
      content: [
        {
          type: 'text',
          text: JSON.stringify(
            {
              totalChunks: chunks.length,
              chunks,
              note:
                chunks.length >= maxChunks
                  ? 'Reached maxChunks limit. Increase maxChunks or use startOffset to continue.'
                  : 'All chunks returned.',
            },
            null,
            2
          ),
        },
      ],
    };
  }
  ```
- **src/fileHandler.ts:515-537 (helper)**: The core `FileHandler.streamFile` async generator that creates a Node.js `fs.ReadStream` with the specified chunk size and offsets, and yields string chunks from the large file (see the usage sketch after this list).

  ```typescript
  static async *streamFile(
    filePath: string,
    options: StreamOptions = {}
  ): AsyncGenerator<string> {
    await this.verifyFile(filePath);

    const chunkSize = options.chunkSize || 64 * 1024; // 64KB default
    const encoding = options.encoding || 'utf-8';

    const stream = fs.createReadStream(filePath, {
      encoding,
      start: options.startOffset,
      end: options.maxBytes
        ? (options.startOffset || 0) + options.maxBytes
        : undefined,
      highWaterMark: chunkSize,
    });

    for await (const chunk of stream) {
      yield chunk;
    }
  }
  ```
- **src/server.ts:213-241 (registration)**: Registration of the `stream_large_file` tool in the `getTools()` method, including its name, description, and input schema.

  ```typescript
  name: 'stream_large_file',
  description:
    'Stream a large file in chunks. Returns multiple chunks for processing very large files efficiently.',
  inputSchema: {
    type: 'object',
    properties: {
      filePath: {
        type: 'string',
        description: 'Absolute path to the file',
      },
      chunkSize: {
        type: 'number',
        description: 'Chunk size in bytes (default: 65536 - 64KB)',
      },
      startOffset: {
        type: 'number',
        description: 'Starting byte offset (default: 0)',
      },
      maxBytes: {
        type: 'number',
        description: 'Maximum bytes to stream (optional)',
      },
      maxChunks: {
        type: 'number',
        description: 'Maximum number of chunks to return (default: 10)',
      },
    },
    required: ['filePath'],
  },
  ```
- **src/server.ts:260-261 (registration)**: Dispatch case in the `handleToolCall` switch statement that routes calls to the `stream_large_file` handler.

  ```typescript
  case 'stream_large_file':
    return this.handleStreamFile(args);
  ```
- **src/types.ts:109-118 (schema)**: TypeScript interface defining the `StreamOptions` used by the streaming functionality.

  ```typescript
  export interface StreamOptions {
    /** Chunk size for streaming */
    chunkSize?: number;
    /** Starting byte offset */
    startOffset?: number;
    /** Maximum bytes to stream */
    maxBytes?: number;
    /** Encoding */
    encoding?: BufferEncoding;
  }
  ```
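
The `note` field in the handler's response points at how to continue past the `maxChunks` cap. Below is a minimal client-side paging sketch; `callTool` is a hypothetical MCP client helper (not part of this codebase), and the offset arithmetic assumes a plain-ASCII file so that every full chunk corresponds to exactly `chunkSize` bytes.

```typescript
// Hypothetical MCP client helper, assumed to be provided by whatever client you use.
declare function callTool(
  name: string,
  args: Record<string, unknown>
): Promise<{ content: Array<{ type: string; text: string }> }>;

async function readWholeFile(filePath: string): Promise<string> {
  const chunkSize = 64 * 1024;
  const maxChunks = 10;
  let offset = 0;
  let text = '';

  for (;;) {
    const result = await callTool('stream_large_file', {
      filePath,
      chunkSize,
      startOffset: offset,
      maxChunks,
    });
    const payload = JSON.parse(result.content[0].text) as {
      totalChunks: number;
      chunks: string[];
      note: string;
    };

    text += payload.chunks.join('');
    // Fewer chunks than the cap means the end of the file was reached.
    if (payload.totalChunks < maxChunks) break;
    // Otherwise continue from the next byte window.
    offset += payload.totalChunks * chunkSize;
  }
  return text;
}
```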
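
For completeness, a minimal sketch of consuming `FileHandler.streamFile` directly with a typed `StreamOptions` value; the import paths are assumptions based on the files referenced above.

```typescript
// Sketch: consuming the async generator directly inside the same codebase.
// The import paths are assumptions based on the file layout shown above.
import { FileHandler } from './fileHandler.js';
import type { StreamOptions } from './types.js';

async function countNewlines(filePath: string): Promise<number> {
  // Read a ~1 MB window starting 10 MB into the file, 256 KB at a time.
  const options: StreamOptions = {
    chunkSize: 256 * 1024,
    startOffset: 10 * 1024 * 1024,
    maxBytes: 1024 * 1024,
    encoding: 'utf-8',
  };

  let newlines = 0;
  for await (const chunk of FileHandler.streamFile(filePath, options)) {
    // Chunks arrive as decoded strings, so they can be scanned without
    // buffering the whole file in memory.
    newlines += chunk.split('\n').length - 1;
  }
  return newlines;
}
```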