githubViewRepoStructure

Analyze and explore GitHub repository structures to identify key directories, validate paths, and fetch essential files for project understanding. Supports adjustable depth levels for efficient navigation and insight.

Instructions

Explore GitHub repository structure and validate repository access.

PROJECT UNDERSTANDING:

  • Learn about the project from its directory structure and the files it contains

  • Identify key directories and file patterns

  • Fetch important files for better understanding

DEPTH CONTROL:

  • Default depth is 2 levels for balanced performance and insight

  • Maximum depth is 4 levels to prevent excessive API calls

  • Depth 1: Shows only immediate files/folders in the specified path

  • Depth 2+: Recursively explores subdirectories up to the specified depth

  • Higher depths provide more comprehensive project understanding but use more API calls

IMPORTANT:

  • Verify the default branch (fall back to main or master if the default branch cannot be determined); see the branch-resolution sketch after this list

  • Verify the path before calling the tool to avoid errors

  • Start with the root path to understand the actual repository structure, then navigate to specific directories based on research needs

  • Check the repository's default branch, as it varies between repositories

  • Verify that the path exists; don't assume repository structure

  • Verify repository existence and accessibility

  • Validate paths before accessing specific files; use GitHub code search to find correct paths if unsure
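
As a quick illustration of the branch check above, the following minimal sketch (assuming the @octokit/rest client and a token in the GITHUB_TOKEN environment variable, neither of which is part of this tool) resolves a repository's default branch before the tool is called:

    import { Octokit } from '@octokit/rest';

    // Illustrative helper: resolve the default branch before exploring structure.
    async function resolveDefaultBranch(owner: string, repo: string): Promise<string> {
      const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });
      try {
        const { data } = await octokit.rest.repos.get({ owner, repo });
        return data.default_branch || 'main';
      } catch {
        // Assumption: fall back to 'main' when repository metadata cannot be read.
        return 'main';
      }
    }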

Input Schema

  • branch (required): Branch name. The default branch varies between repositories; the tool automatically falls back to the default branch if the specified branch is not found, but it is more efficient to verify the correct branch first.
  • depth (optional): Depth of the directory structure to explore. Default is 2; maximum is 4 to prevent excessive API calls and maintain performance.
  • includeIgnored (optional): If true, shows all files and folders, including configuration files, lock files, hidden directories, etc. Default is false, which shows only relevant code files and directories for optimization.
  • owner (required): Repository owner/organization name (e.g., "facebook", "microsoft"). Do NOT include the repository name.
  • path (optional): Directory path within the repository. Start with an empty path to see the actual repository structure first; do not assume nested paths exist. Do not start with a slash, and verify that the path exists before using specific directories.
  • repo (required): Repository name under an organization.
  • showMedia (optional): If true, shows media files (images, videos, audio, documents). Default is false to hide media files and focus on code structure.
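
For illustration, a single bulk request to this tool might look like the following sketch (the repository, branch, and path values are placeholders; the queries wrapper mirrors the bulk schema shown under Implementation Reference below):

    const exampleInput = {
      queries: [
        {
          owner: 'facebook',
          repo: 'react',
          branch: 'main',
          path: '', // start at the repository root
          depth: 2, // documented default; higher values cost more API calls
        },
      ],
    };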

Implementation Reference

  • Registers all configured tools on the MCP server, including githubViewRepoStructure via its config.fn (registerViewGitHubRepoStructureTool). Called during server initialization.
    export function registerTools(
      server: McpServer,
      callback?: ToolInvocationCallback
    ): {
      successCount: number;
      failedTools: string[];
    } {
      const config = getServerConfig();
      const toolsToRun = config.toolsToRun || [];
      const enableTools = config.enableTools || [];
      const disableTools = config.disableTools || [];
      let successCount = 0;
      const failedTools: string[] = [];

      if (
        toolsToRun.length > 0 &&
        (enableTools.length > 0 || disableTools.length > 0)
      ) {
        process.stderr.write(
          'Warning: TOOLS_TO_RUN cannot be used together with ENABLE_TOOLS/DISABLE_TOOLS. Using TOOLS_TO_RUN exclusively.\n'
        );
      }

      for (const tool of DEFAULT_TOOLS) {
        try {
          let shouldRegisterTool = false;
          let reason = '';
          let isAvailableInMetadata = false;

          // Check metadata availability first (with error handling)
          try {
            isAvailableInMetadata = isToolAvailableSync(tool.name);
          } catch {
            // If metadata check fails, treat as unavailable
            isAvailableInMetadata = false;
          }

          // Skip silently if tool is missing from remote metadata
          if (!isAvailableInMetadata) {
            continue;
          }

          if (toolsToRun.length > 0) {
            shouldRegisterTool = toolsToRun.includes(tool.name);
            if (!shouldRegisterTool) {
              reason = 'not specified in TOOLS_TO_RUN configuration';
            }
          } else {
            shouldRegisterTool = tool.isDefault;
            if (enableTools.includes(tool.name)) {
              shouldRegisterTool = true;
            }
            if (!shouldRegisterTool && reason === '') {
              reason = 'not a default tool';
            }
            if (disableTools.includes(tool.name)) {
              shouldRegisterTool = false;
              reason = 'disabled by DISABLE_TOOLS configuration';
            }
          }

          if (shouldRegisterTool) {
            tool.fn(server, callback);
            successCount++;
          } else if (reason) {
            process.stderr.write(`Tool ${tool.name} ${reason}\n`);
          }
        } catch (error) {
          failedTools.push(tool.name);
        }
      }

      return { successCount, failedTools };
    }
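    For illustration, this function might be wired up during server startup roughly as follows (a sketch only; the server options and logging callback are placeholders, not taken from the project):
      const server = new McpServer({ name: 'octocode-mcp', version: '0.0.0' });
      const { successCount, failedTools } = registerTools(server, async (toolName, queries) => {
        // Optional invocation callback, e.g. for request logging.
        process.stderr.write(`tool invoked: ${toolName}\n`);
      });
      process.stderr.write(`Registered ${successCount} tools, ${failedTools.length} failed\n`);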
  • Tool configuration entry that links the tool name to its registration function.
    export const GITHUB_VIEW_REPO_STRUCTURE: ToolConfig = {
      name: TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
      description: getDescription(TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE),
      isDefault: true,
      type: 'content',
      fn: registerViewGitHubRepoStructureTool,
    };
  • Core handler function that processes multiple repository structure exploration queries in bulk, invoking GitHub API, applying filters, handling errors, and formatting standardized results.
    async function exploreMultipleRepositoryStructures(
      queries: GitHubViewRepoStructureQuery[],
      authInfo?: AuthInfo,
      sessionId?: string
    ): Promise<CallToolResult> {
      return executeBulkOperation(
        queries,
        async (query: GitHubViewRepoStructureQuery, _index: number) => {
          try {
            const apiRequest = buildStructureApiRequest(query);
            const apiResult = await viewGitHubRepositoryStructureAPI(
              apiRequest,
              authInfo,
              sessionId
            );

            const apiError = handleApiError(apiResult, query);
            if (apiError) {
              return createEmptyStructureResult(query, apiError);
            }

            if (!('files' in apiResult) || !Array.isArray(apiResult.files)) {
              return createEmptyStructureResult(
                query,
                handleCatchError(
                  new Error('Invalid API response structure'),
                  query
                )!
              );
            }

            const { filteredFiles, filteredFolders } =
              filterStructureItems(apiResult);

            const pathPrefix = apiRequest.path || '/';
            const normalizedPrefix = pathPrefix === '/' ? '' : pathPrefix;

            const filePaths = filteredFiles.map(file =>
              removePathPrefix(file.path, normalizedPrefix)
            );
            const folderPaths = filteredFolders.map(folder =>
              removePathPrefix(folder.path, normalizedPrefix)
            );

            const hasContent = filePaths.length > 0 || folderPaths.length > 0;

            return createSuccessResult(
              query,
              {
                owner: apiRequest.owner,
                repo: apiRequest.repo,
                path: apiRequest.path || '/',
                files: filePaths,
                folders: folderPaths,
              },
              hasContent,
              'GITHUB_VIEW_REPO_STRUCTURE'
            );
          } catch (error) {
            const catchError = handleCatchError(
              error,
              query,
              'Failed to explore repository structure'
            );
            return createEmptyStructureResult(query, catchError);
          }
        },
        {
          toolName: TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
          keysPriority: ['path', 'files', 'folders', 'error'] satisfies Array<
            keyof RepoStructureResult
          >,
        }
      );
    }
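    The registered MCP handler (next entry) forwards args.queries directly to this function. A hedged, in-module sketch of a bulk call, with placeholder query values and any BaseQuerySchema-specific fields omitted:
      const bulkResult = await exploreMultipleRepositoryStructures([
        { owner: 'facebook', repo: 'react', branch: 'main', path: '', depth: 1 },
        { owner: 'facebook', repo: 'react', branch: 'main', path: 'packages', depth: 2 },
      ]);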
  • Registration function for the tool, defining its input schema, annotations, security wrapper, and handler invocation.
    export function registerViewGitHubRepoStructureTool(
      server: McpServer,
      callback?: ToolInvocationCallback
    ) {
      return server.registerTool(
        TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
        {
          description: DESCRIPTIONS[TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE],
          inputSchema: GitHubViewRepoStructureBulkQuerySchema,
          annotations: {
            title: 'GitHub Repository Structure Explorer',
            readOnlyHint: true,
            destructiveHint: false,
            idempotentHint: true,
            openWorldHint: true,
          },
        },
        withSecurityValidation(
          TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
          async (
            args: {
              queries: GitHubViewRepoStructureQuery[];
            },
            authInfo,
            sessionId
          ): Promise<CallToolResult> => {
            const queries = args.queries || [];
            if (callback) {
              try {
                await callback(TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE, queries);
              } catch {
                // ignore
              }
            }
            return exploreMultipleRepositoryStructures(
              queries,
              authInfo,
              sessionId
            );
          }
        )
      );
    }
  • Zod schemas for single and bulk input queries, with validation and descriptions for repo structure exploration.
    export const GitHubViewRepoStructureQuerySchema = BaseQuerySchema.extend({
      owner: z
        .string()
        .min(1)
        .max(200)
        .describe(GITHUB_VIEW_REPO_STRUCTURE.scope.owner),
      repo: z
        .string()
        .min(1)
        .max(150)
        .describe(GITHUB_VIEW_REPO_STRUCTURE.scope.repo),
      branch: z
        .string()
        .min(1)
        .max(255)
        .describe(GITHUB_VIEW_REPO_STRUCTURE.scope.branch),
      path: z
        .string()
        .default('')
        .optional()
        .describe(GITHUB_VIEW_REPO_STRUCTURE.scope.path),
      depth: z
        .number()
        .min(1)
        .max(2)
        .default(1)
        .optional()
        .describe(GITHUB_VIEW_REPO_STRUCTURE.range.depth),
    });

    export const GitHubViewRepoStructureBulkQuerySchema = createBulkQuerySchema(
      TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
      GitHubViewRepoStructureQuerySchema
    );
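    For illustration, the single-query schema can validate input before it reaches the handler. A sketch with placeholder values, assuming BaseQuerySchema adds no additional required fields:
      // parse() throws a ZodError for out-of-range values, e.g. a depth above the schema's maximum of 2.
      const parsedQuery = GitHubViewRepoStructureQuerySchema.parse({
        owner: 'microsoft',
        repo: 'vscode',
        branch: 'main',
        path: 'src',
        depth: 2,
      });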
  • Helper function that performs the actual GitHub API calls to fetch repository contents/structure, handles branch fallbacks, recursive directory exploration, filtering ignored files/dirs, truncation, and sorting.
    export async function viewGitHubRepositoryStructureAPI(
      params: GitHubViewRepoStructureQuery,
      authInfo?: AuthInfo,
      sessionId?: string
    ): Promise<GitHubRepositoryStructureResult | GitHubRepositoryStructureError> {
      const cacheKey = generateCacheKey('gh-repo-structure-api', params, sessionId);

      const result = await withDataCache<
        GitHubRepositoryStructureResult | GitHubRepositoryStructureError
      >(
        cacheKey,
        async () => {
          return await viewGitHubRepositoryStructureAPIInternal(params, authInfo);
        },
        {
          shouldCache: value => !('error' in value),
        }
      );

      return result;
    }

    async function viewGitHubRepositoryStructureAPIInternal(
      params: GitHubViewRepoStructureQuery,
      authInfo?: AuthInfo
    ): Promise<GitHubRepositoryStructureResult | GitHubRepositoryStructureError> {
      try {
        const octokit = await getOctokit(authInfo);
        const { owner, repo, branch, path = '', depth = 1 } = params;
        const cleanPath = path.startsWith('/') ? path.substring(1) : path;

        let result;
        let workingBranch = branch;

        try {
          result = await octokit.rest.repos.getContent({
            owner,
            repo,
            path: cleanPath || undefined,
            ref: branch,
          });
        } catch (error: unknown) {
          if (error instanceof RequestError && error.status === 404) {
            let defaultBranch = 'main';
            try {
              const repoInfo = await octokit.rest.repos.get({ owner, repo });
              defaultBranch = repoInfo.data.default_branch || 'main';
            } catch (repoError) {
              const apiError = handleGitHubAPIError(repoError);
              await logSessionError(
                TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
                REPOSITORY_ERRORS.NOT_FOUND.code
              );
              return {
                error: REPOSITORY_ERRORS.NOT_FOUND.message(
                  owner,
                  repo,
                  apiError.error
                ),
                status: apiError.status,
              };
            }

            if (defaultBranch !== branch) {
              try {
                result = await octokit.rest.repos.getContent({
                  owner,
                  repo,
                  path: cleanPath || undefined,
                  ref: defaultBranch,
                });
                workingBranch = defaultBranch;
              } catch (fallbackError) {
                const commonBranches = ['main', 'master', 'develop'];
                let foundBranch = null;
                for (const tryBranch of commonBranches) {
                  if (tryBranch === branch || tryBranch === defaultBranch) continue;
                  try {
                    result = await octokit.rest.repos.getContent({
                      owner,
                      repo,
                      path: cleanPath || undefined,
                      ref: tryBranch,
                    });
                    foundBranch = tryBranch;
                    workingBranch = tryBranch;
                    break;
                  } catch {
                    // ignore
                  }
                }
                if (!foundBranch) {
                  const apiError = handleGitHubAPIError(error);
                  await logSessionError(
                    TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
                    REPOSITORY_ERRORS.PATH_NOT_FOUND_ANY_BRANCH.code
                  );
                  return {
                    error: REPOSITORY_ERRORS.PATH_NOT_FOUND_ANY_BRANCH.message(
                      cleanPath,
                      owner,
                      repo
                    ),
                    status: apiError.status,
                    triedBranches: [branch, defaultBranch, ...commonBranches],
                    defaultBranch,
                  };
                }
              }
            } else {
              const apiError = handleGitHubAPIError(error);
              await logSessionError(
                TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
                REPOSITORY_ERRORS.PATH_NOT_FOUND.code
              );
              return {
                error: REPOSITORY_ERRORS.PATH_NOT_FOUND.message(
                  cleanPath,
                  owner,
                  repo,
                  branch
                ),
                status: apiError.status,
              };
            }
          } else {
            const apiError = handleGitHubAPIError(error);
            await logSessionError(
              TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
              REPOSITORY_ERRORS.ACCESS_FAILED.code
            );
            return {
              error: REPOSITORY_ERRORS.ACCESS_FAILED.message(
                owner,
                repo,
                apiError.error
              ),
              status: apiError.status,
              rateLimitRemaining: apiError.rateLimitRemaining,
              rateLimitReset: apiError.rateLimitReset,
            };
          }
        }

        const items = Array.isArray(result.data) ? result.data : [result.data];
        const apiItems: GitHubApiFileItem[] = items.map(
          (item: GitHubApiFileItem) => ({
            name: item.name,
            path: item.path,
            type: item.type as 'file' | 'dir',
            size: 'size' in item ? item.size : undefined,
            download_url: 'download_url' in item ? item.download_url : undefined,
            url: item.url,
            html_url: item.html_url,
            git_url: item.git_url,
            sha: item.sha,
          })
        );

        let allItems = apiItems;
        if (depth > 1) {
          const recursiveItems = await fetchDirectoryContentsRecursivelyAPI(
            octokit,
            owner,
            repo,
            workingBranch,
            cleanPath,
            1,
            depth
          );
          const combinedItems = [...apiItems, ...recursiveItems];
          allItems = combinedItems.filter(
            (item, index, array) =>
              array.findIndex(i => i.path === item.path) === index
          );
        }

        const filteredItems = allItems.filter(item => {
          if (item.type === 'dir') {
            return !shouldIgnoreDir(item.name);
          }
          return !shouldIgnoreFile(item.path);
        });

        const itemLimit = Math.min(200, 50 * depth);
        const limitedItems = filteredItems.slice(0, itemLimit);

        limitedItems.sort((a, b) => {
          if (a.type !== b.type) {
            return a.type === 'dir' ? -1 : 1;
          }
          const aDepth = a.path.split('/').length;
          const bDepth = b.path.split('/').length;
          if (aDepth !== bDepth) {
            return aDepth - bDepth;
          }
          return a.path.localeCompare(b.path);
        });

        const files = limitedItems
          .filter(item => item.type === 'file')
          .map(item => ({
            path: item.path.startsWith('/') ? item.path : `/${item.path}`,
            size: item.size,
            url: item.path,
          }));

        const folders = limitedItems
          .filter(item => item.type === 'dir')
          .map(item => ({
            path: item.path.startsWith('/') ? item.path : `/${item.path}`,
            url: item.path,
          }));

        return {
          owner,
          repo,
          branch: workingBranch,
          path: cleanPath || '/',
          apiSource: true,
          summary: {
            totalFiles: files.length,
            totalFolders: folders.length,
            truncated: allItems.length > limitedItems.length,
            filtered: true,
            originalCount: allItems.length,
          },
          files: files,
          folders: {
            count: folders.length,
            folders: folders,
          },
        };
      } catch (error: unknown) {
        const apiError = handleGitHubAPIError(error);
        await logSessionError(
          TOOL_NAMES.GITHUB_VIEW_REPO_STRUCTURE,
          REPOSITORY_ERRORS.STRUCTURE_EXPLORATION_FAILED.code
        );
        return {
          error: REPOSITORY_ERRORS.STRUCTURE_EXPLORATION_FAILED.message,
          status: apiError.status,
          rateLimitRemaining: apiError.rateLimitRemaining,
          rateLimitReset: apiError.rateLimitReset,
        };
      }
    }
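    For illustration, a caller of this helper can distinguish success from failure by narrowing on the 'error' key of the returned union. A sketch with placeholder query values:
      const structure = await viewGitHubRepositoryStructureAPI({
        owner: 'facebook',
        repo: 'react',
        branch: 'main',
        path: '',
        depth: 1,
      });
      if ('error' in structure) {
        console.error(structure.error, structure.status);
      } else {
        console.log(structure.summary.totalFiles, structure.folders.count);
      }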
