
Gemini CLI MCP Server

by orzcls

fetch-chunk

Retrieves cached chunks from a changeMode response, providing access to the remaining data after a partial response.

Instructions

Retrieves cached chunks from a changeMode response. Use this to get subsequent chunks after receiving a partial changeMode response.
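
For illustration, a minimal sketch of the corresponding MCP tools/call request (the cacheKey value here is hypothetical; in practice it is the key returned by the initial changeMode response):

    {
      "method": "tools/call",
      "params": {
        "name": "fetch-chunk",
        "arguments": { "cacheKey": "chunks-a1b2c3", "chunkIndex": 2 }
      }
    }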

Input Schema

Name        Required  Description                                                  Default
cacheKey    Yes       The cache key provided in the initial changeMode response
chunkIndex  Yes       Which chunk to retrieve (1-based index)
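
As a rough sketch of client-side usage (assuming an MCP SDK client; the callTool helper, the totalChunks value, and the response shape shown are illustrative assumptions, not part of this server):

    // Fetch every chunk of a chunked changeMode response and reassemble it.
    // `client` is assumed to be a connected MCP client; `totalChunks` is taken
    // from the "CHUNK x/y" header of the first partial response.
    async function fetchAllChunks(client, cacheKey, totalChunks) {
      const parts = [];
      for (let chunkIndex = 1; chunkIndex <= totalChunks; chunkIndex++) {
        const result = await client.callTool({
          name: "fetch-chunk",
          arguments: { cacheKey, chunkIndex },
        });
        parts.push(result.content[0].text); // each chunk is returned as a text block
      }
      return parts.join("\n");
    }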

Implementation Reference

  • MCP CallTool handler for 'fetch-chunk': destructures args, calls getChunkedEdits helper, formats chunk response with progress info and instructions for fetching more chunks if available.
    case "fetch-chunk": const { cacheKey, chunkIndex: fetchChunkIndex } = args; console.error('[GMCPT] fetch-chunk tool called with cacheKey: ' + cacheKey + ', chunkIndex: ' + fetchChunkIndex); try { const chunkResult = getChunkedEdits(cacheKey, parseInt(fetchChunkIndex)); // Format the chunk information const chunkInfo = `CHUNK ${chunkResult.chunk}/${chunkResult.totalChunks} (Cache Key: ${chunkResult.cacheKey})\n\n${chunkResult.content}`; if (chunkResult.hasMore) { const nextChunk = chunkResult.chunk + 1; const remainingChunks = chunkResult.totalChunks - chunkResult.chunk; return { content: [{ type: "text", text: chunkInfo + `\n\n[Use fetch-chunk tool with cacheKey "${chunkResult.cacheKey}" and chunkIndex ${nextChunk}-${chunkResult.totalChunks} to get remaining ${remainingChunks} chunks]` }] }; } else { return { content: [{ type: "text", text: chunkInfo + "\n\n[This is the final chunk]" }] }; } } catch (error) { return { content: [{ type: "text", text: `Error retrieving chunk: ${error.message}` }] }; }
  • Input schema definition for the fetch-chunk tool, specifying required cacheKey (string) and chunkIndex (number >=1).
    { name: "fetch-chunk", description: "Retrieves cached chunks from a changeMode response. Use this to get subsequent chunks after receiving a partial changeMode response.", inputSchema: { type: "object", properties: { cacheKey: { type: "string", description: "The cache key provided in the initial changeMode response" }, chunkIndex: { type: "number", minimum: 1, description: "Which chunk to retrieve (1-based index)" } }, required: ["cacheKey", "chunkIndex"] } },
  • Registration via ListToolsRequestSchema handler returning the tools array which includes the fetch-chunk tool definition.
    server.setRequestHandler(ListToolsRequestSchema, async () => { return { tools }; });
  • Core helper function implementing chunk retrieval logic from in-memory Map cache, validates existence and index, returns chunk metadata including hasMore flag.
    export function getChunkedEdits(cacheKey, chunkIndex) {
      try {
        const chunks = getChunks(cacheKey);
        if (!chunks || chunks.length === 0) {
          throw new Error('No cached chunks found for the provided cache key');
        }
        const chunk = chunks[chunkIndex - 1]; // Convert to 0-based index
        if (!chunk) {
          throw new Error(`Chunk ${chunkIndex} not found. Available chunks: 1-${chunks.length}`);
        }
        return {
          content: chunk,
          chunk: chunkIndex,
          totalChunks: chunks.length,
          cacheKey: cacheKey,
          hasMore: chunkIndex < chunks.length
        };
      } catch (error) {
        console.error(`Failed to retrieve chunk: ${error.message}`);
        throw error;
      }
    }
  • Simple cache getter utility used by getChunkedEdits.
    function getChunks(key) { return chunkCache.get(key) || []; }
  • Simple cache setter utility used during changeMode chunking in executeGeminiCLI.
    function cacheChunks(key, chunks) { chunkCache.set(key, chunks); }
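
Putting the cache pieces together, a minimal sketch of how chunks would end up in the cache during a changeMode call (the in-memory Map is described above; the cacheChangeModeResponse helper and the chunkSize value are hypothetical, for illustration only):

    const chunkCache = new Map();

    // Hypothetical wiring: split a long changeMode response into fixed-size
    // pieces and cache them under the key that is returned to the caller.
    function cacheChangeModeResponse(cacheKey, fullText, chunkSize = 20000) {
      const chunks = [];
      for (let i = 0; i < fullText.length; i += chunkSize) {
        chunks.push(fullText.slice(i, i + chunkSize));
      }
      cacheChunks(cacheKey, chunks); // stored here, later read back by getChunkedEdits
      return chunks.length;          // totalChunks reported in the first response
    }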

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orzcls/gemini-mcp-tool-windows-fixed'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.