# githubGetFileContent
Fetches content from up to 5 GitHub repository files in parallel. Supports partial file access to reduce token usage, and automatic fallback handling for resilient workflow integration.
## Instructions
Fetches the content of multiple files from GitHub repositories in parallel. Supports up to 5 queries with automatic fallback handling.
**TOKEN OPTIMIZATION:**

- Full file content is expensive in tokens; use `startLine`/`endLine` for partial access.
- Access large files in parts rather than as full content.
- Use `minified=true` (the default) to optimize content for token efficiency.
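For example, a single partial-access query might look like the sketch below. The field names (`startLine`, `endLine`, `minified`) come from the input schema; the owner, repo, and path values are hypothetical placeholders.

```typescript
// Hypothetical partial-access query: fetch only lines 100-160 of one file.
// startLine and endLine must always be supplied together.
const query = {
  owner: 'octocat',      // hypothetical repository owner
  repo: 'hello-world',   // hypothetical repository name
  path: 'src/index.ts',  // hypothetical file path
  startLine: 100,
  endLine: 160,
  minified: true,        // default; optimizes content for token efficiency
};

// Number of lines this query requests instead of the whole file.
console.log(query.endLine - query.startLine + 1); // 61
```

Fetching a 61-line window like this, instead of the full file, is what the token-optimization guidance above is recommending for large files.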
**BULK QUERY FEATURES:**

- `queries`: array of up to 5 file fetch queries executed in parallel.
- Each query can include `fallbackParams` for automatic retry with modified parameters.
- Executing multiple file fetches simultaneously streamlines the workflow.
- Each query should target a different file or section.
- Fallback logic automatically adjusts parameters when the original query fails.
- Each query falls back automatically between the `main` and `master` branches.
Use this tool for comprehensive file analysis: query different files, sections, or implementations in one call.
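The bulk shape described above can be sketched as follows. The repository names are hypothetical, and the exact merge semantics of `fallbackParams` are an assumption for illustration (overrides applied on top of the original query).

```typescript
// Hypothetical query shape for illustration; mirrors the schema fields.
interface FileQuery {
  owner: string;
  repo: string;
  path: string;
  branch?: string;
  startLine?: number;
  endLine?: number;
  fallbackParams?: Partial<FileQuery>;
}

// Up to 5 queries, each targeting a different file or section.
const queries: FileQuery[] = [
  {
    owner: 'octocat',
    repo: 'hello-world',
    path: 'README.md',
    branch: 'main',
    // If the original fetch fails, retry with these overrides applied.
    fallbackParams: { branch: 'master' },
  },
  {
    owner: 'octocat',
    repo: 'hello-world',
    path: 'src/app.ts',
    startLine: 1,
    endLine: 40,
  },
];

// Assumed retry semantics: fallback fields win over the original query.
function applyFallback<T extends object>(query: T, fallback: Partial<T>): T {
  return { ...query, ...fallback };
}

const retried = applyFallback(queries[0], queries[0].fallbackParams ?? {});
console.log(retried.branch); // 'master'
```

Keeping each query focused on a distinct file or section lets all five fetches run in parallel without redundant work.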
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| queries | Yes | Array of up to 5 file fetch queries executed in parallel | (none) |
## Implementation Reference
- Main execution handler: processes bulk queries, calls the GitHub API via `fileOperations`, and handles results and errors.

```typescript
async function fetchMultipleGitHubFileContents(
  queries: FileContentQuery[],
  authInfo?: AuthInfo,
  sessionId?: string
): Promise<CallToolResult> {
  return executeBulkOperation(
    queries,
    async (query: FileContentQuery, _index: number) => {
      try {
        const apiRequest = buildApiRequest(query);
        const apiResult = await fetchGitHubFileContentAPI(
          apiRequest,
          authInfo,
          sessionId
        );

        const apiError = handleApiError(apiResult, query);
        if (apiError) return apiError;

        const result = 'data' in apiResult ? apiResult.data : apiResult;
        const resultWithSampling = result as Record<string, unknown>;
        const hasContent = hasValidContent(result);

        return createSuccessResult(
          query,
          resultWithSampling,
          hasContent,
          'GITHUB_FETCH_CONTENT'
        );
      } catch (error) {
        return handleCatchError(error, query);
      }
    },
    {
      toolName: TOOL_NAMES.GITHUB_FETCH_CONTENT,
      keysPriority: [
        'path',
        'owner',
        'repo',
        'branch',
        'contentLength',
        'content',
        'isPartial',
        'startLine',
        'endLine',
        'minified',
        'minificationFailed',
        'minificationType',
        'securityWarnings',
        'sampling',
        'error',
      ] satisfies Array<keyof ContentResult>,
    }
  );
}
```
- Core helper: fetches the file from the GitHub API, decodes base64, sanitizes, minifies, extracts line ranges or match context, and handles errors and caching.

```typescript
export async function fetchGitHubFileContentAPI(
  params: FileContentQuery,
  authInfo?: AuthInfo,
  sessionId?: string
): Promise<GitHubAPIResponse<ContentResult>> {
  const cacheKey = generateCacheKey(
    'gh-api-file-content',
    {
      owner: params.owner,
      repo: params.repo,
      path: params.path,
      branch: params.branch,
      ...(params.fullContent && { fullContent: params.fullContent }),
      startLine: params.startLine,
      endLine: params.endLine,
      matchString: params.matchString,
      minified: params.minified,
      matchStringContextLines: params.matchStringContextLines,
    },
    sessionId
  );

  const result = await withDataCache<GitHubAPIResponse<ContentResult>>(
    cacheKey,
    async () => {
      return await fetchGitHubFileContentAPIInternal(params, authInfo);
    },
    {
      shouldCache: (value: GitHubAPIResponse<ContentResult>) =>
        'data' in value && !(value as { error?: unknown }).error,
    }
  );

  return result;
}

async function fetchGitHubFileContentAPIInternal(
  params: FileContentQuery,
  authInfo?: AuthInfo
): Promise<GitHubAPIResponse<ContentResult>> {
  try {
    const octokit = await getOctokit(authInfo);
    const { owner, repo, path: filePath, branch } = params;

    const contentParams: GetContentParameters = {
      owner,
      repo,
      path: filePath,
      ...(branch && { ref: branch }),
    };

    let result;
    try {
      result = await octokit.rest.repos.getContent(contentParams);
    } catch (error: unknown) {
      if (error instanceof RequestError && error.status === 404 && branch) {
        if (branch === 'main' || branch === 'master') {
          const fallbackBranch = branch === 'main' ? 'master' : 'main';
          try {
            result = await octokit.rest.repos.getContent({
              ...contentParams,
              ref: fallbackBranch,
            });
          } catch {
            throw error;
          }
        } else {
          const apiError = handleGitHubAPIError(error);
          return {
            ...apiError,
            scopesSuggestion: `Branch '${branch}' not found.
Ask user: Do you want to get the file from the default branch instead?`,
          };
        }
      } else {
        throw error;
      }
    }

    const data = result.data;

    if (Array.isArray(data)) {
      await logSessionError(
        TOOL_NAMES.GITHUB_FETCH_CONTENT,
        FILE_OPERATION_ERRORS.PATH_IS_DIRECTORY.code
      );
      return {
        error: FILE_OPERATION_ERRORS.PATH_IS_DIRECTORY.message(
          TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE
        ),
        type: 'unknown' as const,
        status: 400,
      };
    }

    if ('content' in data && data.type === 'file') {
      const fileSize = data.size || 0;
      const MAX_FILE_SIZE = 300 * 1024;
      if (fileSize > MAX_FILE_SIZE) {
        const fileSizeKB = Math.round(fileSize / 1024);
        const maxSizeKB = Math.round(MAX_FILE_SIZE / 1024);
        await logSessionError(
          TOOL_NAMES.GITHUB_FETCH_CONTENT,
          FILE_OPERATION_ERRORS.FILE_TOO_LARGE.code
        );
        return {
          error: FILE_OPERATION_ERRORS.FILE_TOO_LARGE.message(
            fileSizeKB,
            maxSizeKB,
            TOOL_NAMES.GITHUB_SEARCH_CODE
          ),
          type: 'unknown' as const,
          status: 413,
        };
      }

      if (!data.content) {
        await logSessionError(
          TOOL_NAMES.GITHUB_FETCH_CONTENT,
          FILE_OPERATION_ERRORS.FILE_EMPTY.code
        );
        return {
          error: FILE_OPERATION_ERRORS.FILE_EMPTY.message,
          type: 'unknown' as const,
          status: 404,
        };
      }

      const base64Content = data.content.replace(/\s/g, '');
      if (!base64Content) {
        await logSessionError(
          TOOL_NAMES.GITHUB_FETCH_CONTENT,
          FILE_OPERATION_ERRORS.FILE_EMPTY.code
        );
        return {
          error: FILE_OPERATION_ERRORS.FILE_EMPTY.message,
          type: 'unknown' as const,
          status: 404,
        };
      }

      let decodedContent: string;
      try {
        const buffer = Buffer.from(base64Content, 'base64');
        if (buffer.indexOf(0) !== -1) {
          await logSessionError(
            TOOL_NAMES.GITHUB_FETCH_CONTENT,
            FILE_OPERATION_ERRORS.BINARY_FILE.code
          );
          return {
            error: FILE_OPERATION_ERRORS.BINARY_FILE.message,
            type: 'unknown' as const,
            status: 415,
          };
        }
        decodedContent = buffer.toString('utf-8');
      } catch (decodeError) {
        await logSessionError(
          TOOL_NAMES.GITHUB_FETCH_CONTENT,
          FILE_OPERATION_ERRORS.DECODE_FAILED.code
        );
        return {
          error: FILE_OPERATION_ERRORS.DECODE_FAILED.message,
          type: 'unknown' as const,
          status: 422,
        };
      }

      const result = await processFileContentAPI(
        decodedContent,
        owner,
        repo,
        branch || data.sha,
        filePath,
        params.minified !== false,
        params.fullContent || false,
        params.startLine,
        params.endLine,
        params.matchStringContextLines ?? 5,
        params.matchString
      );

      if ('error' in result) {
        return {
          error: result.error || 'Unknown error',
          status: 500,
          type: 'unknown' as const,
        };
      } else {
        return {
          data: result,
          status: 200,
        };
      }
    }

    await logSessionError(
      TOOL_NAMES.GITHUB_FETCH_CONTENT,
      FILE_OPERATION_ERRORS.UNSUPPORTED_TYPE.code
    );
    return {
      error: FILE_OPERATION_ERRORS.UNSUPPORTED_TYPE.message(data.type),
      type: 'unknown' as const,
      status: 415,
    };
  } catch (error: unknown) {
    const apiError = handleGitHubAPIError(error);
    return apiError;
  }
}
```
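The decode-and-binary-check step in the helper above can be exercised in isolation. This is a minimal sketch, assuming Node's `Buffer`; `decodeGitHubContent` is a hypothetical name, not a function from the codebase.

```typescript
// Minimal sketch of the decode step: GitHub returns base64 with embedded
// whitespace, and a NUL byte in the decoded bytes marks the file as binary.
function decodeGitHubContent(base64WithWhitespace: string): string | null {
  const base64 = base64WithWhitespace.replace(/\s/g, '');
  const buffer = Buffer.from(base64, 'base64');
  // Any NUL byte is treated as binary, as in the handler above.
  if (buffer.indexOf(0) !== -1) return null;
  return buffer.toString('utf-8');
}

// GitHub's content API wraps base64 in newlines; the decoder tolerates them.
const encoded = Buffer.from('hello world').toString('base64') + '\n';
console.log(decodeGitHubContent(encoded)); // 'hello world'

// A NUL byte anywhere in the payload is rejected as binary.
console.log(decodeGitHubContent(Buffer.from([0x00, 0x41]).toString('base64'))); // null
```

The NUL-byte heuristic is cheap and catches most binary formats (images, archives, executables) before any further processing is attempted.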
- Input schema definition using Zod for single and bulk file content queries, with validation.

```typescript
export const FileContentQuerySchema = BaseQuerySchema.extend({
  owner: z.string().min(1).max(200).describe(GITHUB_FETCH_CONTENT.scope.owner),
  repo: z.string().min(1).max(150).describe(GITHUB_FETCH_CONTENT.scope.repo),
  minified: z
    .boolean()
    .optional()
    .default(true)
    .describe(GITHUB_FETCH_CONTENT.processing.minified),
  sanitize: z
    .boolean()
    .optional()
    .default(true)
    .describe(GITHUB_FETCH_CONTENT.processing.sanitize),
  path: z.string().describe(GITHUB_FETCH_CONTENT.scope.path),
  branch: z
    .string()
    .min(1)
    .max(255)
    .optional()
    .describe(GITHUB_FETCH_CONTENT.scope.branch),
  fullContent: z
    .boolean()
    .default(false)
    .describe(GITHUB_FETCH_CONTENT.range.fullContent),
  startLine: z
    .number()
    .int()
    .min(1)
    .optional()
    .describe(GITHUB_FETCH_CONTENT.range.startLine),
  endLine: z
    .number()
    .int()
    .min(1)
    .optional()
    .describe(GITHUB_FETCH_CONTENT.range.endLine),
  matchString: z
    .string()
    .optional()
    .describe(GITHUB_FETCH_CONTENT.range.matchString),
  matchStringContextLines: z
    .number()
    .int()
    .min(1)
    .max(50)
    .default(5)
    .describe(GITHUB_FETCH_CONTENT.range.matchStringContextLines),
}).refine(
  data => {
    if (
      data.fullContent &&
      (data.startLine || data.endLine || data.matchString)
    ) {
      return false;
    }
    if (
      (data.startLine && !data.endLine) ||
      (!data.startLine && data.endLine)
    ) {
      return false;
    }
    return true;
  },
  {
    message: GITHUB_FETCH_CONTENT.validation.parameterConflict,
  }
);

export const FileContentBulkQuerySchema = createBulkQuerySchema(
  TOOL_NAMES.GITHUB_FETCH_CONTENT,
  FileContentQuerySchema
);
```
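The schema's `.refine()` predicate encodes two constraints: `fullContent` excludes all partial-access parameters, and `startLine`/`endLine` must be given together. A plain-TypeScript sketch of the same check, independent of Zod (the function name is hypothetical):

```typescript
// The subset of query fields the refinement inspects.
interface RangeParams {
  fullContent?: boolean;
  startLine?: number;
  endLine?: number;
  matchString?: string;
}

// Mirrors the schema's refine predicate: returns true when valid.
function isRangeSelectionValid(p: RangeParams): boolean {
  // fullContent conflicts with any partial-access parameter.
  if (p.fullContent && (p.startLine || p.endLine || p.matchString)) {
    return false;
  }
  // startLine and endLine must be provided together.
  if ((p.startLine && !p.endLine) || (!p.startLine && p.endLine)) {
    return false;
  }
  return true;
}

console.log(isRangeSelectionValid({ fullContent: true }));               // true
console.log(isRangeSelectionValid({ fullContent: true, startLine: 1 })); // false
console.log(isRangeSelectionValid({ startLine: 10, endLine: 20 }));      // true
console.log(isRangeSelectionValid({ startLine: 10 }));                   // false
```

When the predicate returns false, Zod attaches the `parameterConflict` message, so callers see a single consistent error for either violation.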
- Registers the tool named `githubGetFileContent` on the MCP server with its schema, annotations, and a security-wrapped handler.

```typescript
export function registerFetchGitHubFileContentTool(
  server: McpServer,
  callback?: ToolInvocationCallback
) {
  return server.registerTool(
    TOOL_NAMES.GITHUB_FETCH_CONTENT,
    {
      description: DESCRIPTIONS[TOOL_NAMES.GITHUB_FETCH_CONTENT],
      inputSchema: FileContentBulkQuerySchema,
      annotations: {
        title: 'GitHub File Content Fetch',
        readOnlyHint: true,
        destructiveHint: false,
        idempotentHint: true,
        openWorldHint: true,
      },
    },
    withSecurityValidation(
      TOOL_NAMES.GITHUB_FETCH_CONTENT,
      async (
        args: {
          queries: FileContentQuery[];
        },
        authInfo,
        sessionId
      ): Promise<CallToolResult> => {
        const queries = args.queries || [];
        if (callback) {
          try {
            await callback(TOOL_NAMES.GITHUB_FETCH_CONTENT, queries);
          } catch {
            // ignore
          }
        }
        return fetchMultipleGitHubFileContents(queries, authInfo, sessionId);
      }
    )
  );
}
```
- `packages/octocode-mcp/src/tools/toolConfig.ts:33-39` (registration): tool configuration that maps to the registration function and includes the tool in the default tools list.

```typescript
export const GITHUB_FETCH_CONTENT: ToolConfig = {
  name: TOOL_NAMES.GITHUB_FETCH_CONTENT,
  description: getDescription(TOOL_NAMES.GITHUB_FETCH_CONTENT),
  isDefault: true,
  type: 'content',
  fn: registerFetchGitHubFileContentTool,
};
```