
PocketFlow MCP Server

by tmtcomeup

analyze_github_repository

Analyze a GitHub repository to generate a beginner-friendly tutorial using the PocketFlow methodology, focusing on key abstractions and file patterns for clarity and simplicity.

Instructions

Analyze a GitHub repository and generate a comprehensive tutorial following the PocketFlow methodology

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| api_key | Yes | API key for the LLM provider | |
| exclude_patterns | No | File patterns to exclude (e.g., ["*test*", "*docs/*"]) | ["*test*", "*docs/*", "*node_modules/*", "*.log"] |
| github_token | No | Optional GitHub token for private repos or rate limit avoidance | |
| include_patterns | No | File patterns to include (e.g., ["*.py", "*.js"]) | ["*.py", "*.js", "*.jsx", "*.ts", "*.tsx", "*.go", "*.java", "*.md"] |
| language | No | Language for tutorial generation | english |
| llm_provider | Yes | LLM provider to use for analysis (openrouter, google, anthropic, or openai) | google |
| max_abstractions | No | Maximum number of abstractions to identify (3 to 20) | 10 |
| max_file_size | No | Maximum file size in bytes | 100000 |
| model | No | Specific model to use (e.g., "anthropic/claude-3.5-sonnet" for OpenRouter or "gemini-2.5-pro" for Google) | gemini-2.5-pro |
| project_name | No | Optional project name (derived from repo if omitted) | |
| repo_url | Yes | GitHub repository URL (e.g., https://github.com/user/repo) | |
| use_cache | No | Enable LLM response caching | true |
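
Only repo_url, llm_provider, and api_key are required; everything else falls back to the defaults above. As a sketch, the arguments for a minimal call might look like the following (the repository URL and API key are placeholders):

```python
# Sketch of a call-arguments dictionary for analyze_github_repository.
# Only the first three keys are required; the rest override schema defaults.
arguments = {
    "repo_url": "https://github.com/user/repo",  # placeholder repository
    "llm_provider": "google",                    # openrouter | google | anthropic | openai
    "api_key": "YOUR_LLM_API_KEY",               # placeholder key for the chosen provider
    # Optional overrides:
    "include_patterns": ["*.py", "*.md"],        # restrict which files are analyzed
    "max_abstractions": 8,                       # 3-20, default 10
}
```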

Input Schema (JSON Schema)

{ "properties": { "api_key": { "description": "API key for the LLM provider", "type": "string" }, "exclude_patterns": { "default": [ "*test*", "*docs/*", "*node_modules/*", "*.log" ], "description": "File patterns to exclude (e.g., [\"*test*\", \"*docs/*\"])", "items": { "type": "string" }, "type": "array" }, "github_token": { "description": "Optional GitHub token for private repos or rate limit avoidance", "type": "string" }, "include_patterns": { "default": [ "*.py", "*.js", "*.jsx", "*.ts", "*.tsx", "*.go", "*.java", "*.md" ], "description": "File patterns to include (e.g., [\"*.py\", \"*.js\"])", "items": { "type": "string" }, "type": "array" }, "language": { "default": "english", "description": "Language for tutorial generation", "type": "string" }, "llm_provider": { "default": "google", "description": "LLM provider to use for analysis", "enum": [ "openrouter", "google", "anthropic", "openai" ], "type": "string" }, "max_abstractions": { "default": 10, "description": "Maximum number of abstractions to identify", "maximum": 20, "minimum": 3, "type": "number" }, "max_file_size": { "default": 100000, "description": "Maximum file size in bytes", "type": "number" }, "model": { "default": "gemini-2.5-pro", "description": "Specific model to use (e.g., \"anthropic/claude-3.5-sonnet\" for OpenRouter or \"gemini-2.5-pro\" for Google)", "type": "string" }, "project_name": { "description": "Optional project name (derived from repo if omitted)", "type": "string" }, "repo_url": { "description": "GitHub repository URL (e.g., https://github.com/user/repo)", "type": "string" }, "use_cache": { "default": true, "description": "Enable LLM response caching", "type": "boolean" } }, "required": [ "repo_url", "llm_provider", "api_key" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/tmtcomeup/pocketflow-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.