
PocketFlow MCP Server

by tmtcomeup

A Model Context Protocol (MCP) server that brings the powerful PocketFlow tutorial generation methodology to all AI assistants. Generate comprehensive, beginner-friendly tutorials from any GitHub repository using advanced AI analysis.

What is PocketFlow?

PocketFlow is an innovative methodology for automatically generating high-quality tutorials from codebases. It:

  1. Identifies Core Abstractions - Finds the key concepts and components in a codebase

  2. Maps Relationships - Understands how different parts interact with each other

  3. Orders Explanations - Determines the best sequence to explain concepts

  4. Generates Tutorials - Creates beginner-friendly, step-by-step learning content

  5. Creates Visual Diagrams - Includes Mermaid diagrams for better understanding

Features

  • Universal AI Assistant Support - Works with Cline, Cursor, Claude Desktop, and any MCP-compatible client

  • 🔍 Deep Repository Analysis - Analyzes GitHub repositories to identify key abstractions

  • 🧠 Intelligent Concept Mapping - Understands relationships between code components

  • 📚 Comprehensive Tutorial Generation - Creates structured, beginner-friendly tutorials

  • 📊 Visual Architecture Diagrams - Generates Mermaid flowcharts and sequence diagrams

  • 🌐 Multi-LLM Provider Support - OpenRouter, Google Gemini, Anthropic Claude, OpenAI

  • 🌍 Multi-Language Support - Generate tutorials in different languages

  • 🔒 Secure & Local - All processing happens locally, API keys stored securely

  • Smart Caching - Caches LLM responses for faster subsequent runs

Quick Start

Prerequisites

  • Node.js 18+

  • npm or yarn

  • An API key for your preferred LLM provider

Installation

  1. Clone and Build

```shell
git clone https://github.com/tmtcomeup/pocketflow-mcp.git
cd pocketflow-mcp
npm install
npm run build
```

  2. Configure Your AI Assistant

For Cline (VSCode)

Add to your Cline settings:

```json
{
  "mcpServers": {
    "pocketflow": {
      "command": "node",
      "args": ["path/to/pocketflow-mcp/build/index.js"]
    }
  }
}
```

For Claude Desktop

Add to claude_desktop_config.json:

```json
{
  "mcpServers": {
    "pocketflow": {
      "command": "node",
      "args": ["path/to/pocketflow-mcp/build/index.js"]
    }
  }
}
```

Usage

Once connected, you'll have access to these tools:

analyze_github_repository

Generate a complete tutorial from any GitHub repository:

```javascript
// Basic usage
analyze_github_repository({
  repo_url: "https://github.com/microsoft/vscode",
  llm_provider: "openrouter",
  api_key: "sk-or-v1-your-key-here",
  model: "anthropic/claude-3.5-sonnet"
})

// Advanced options
analyze_github_repository({
  repo_url: "https://github.com/pytorch/pytorch",
  llm_provider: "google",
  api_key: "your-gemini-key",
  model: "gemini-2.5-pro",
  max_abstractions: 8,
  language: "spanish",
  include_patterns: ["*.py", "*.md"],
  exclude_patterns: ["*test*", "*docs/*"]
})
```

get_repository_structure

Explore repository structure before analysis:

```javascript
get_repository_structure({
  repo_url: "https://github.com/facebook/react",
  include_patterns: ["*.js", "*.jsx", "*.ts"],
  max_depth: 3
})
```
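
The include and exclude patterns behave like shell globs. As a simplified sketch (this is an illustration of the idea, not the server's actual matcher), a `*` wildcard can be translated into a regular expression and applied to each path:

```javascript
// Illustrative glob-style filtering: a path is kept if it matches at
// least one include pattern and no exclude pattern.
function globToRegExp(pattern) {
  // Escape regex metacharacters, then turn "*" into ".*"
  const escaped = pattern
    .replace(/[.+^${}()|[\]\\]/g, "\\$&")
    .replace(/\*/g, ".*");
  return new RegExp(`^${escaped}$`);
}

function filterPaths(paths, includePatterns, excludePatterns) {
  const inc = includePatterns.map(globToRegExp);
  const exc = excludePatterns.map(globToRegExp);
  return paths.filter(
    (p) => inc.some((r) => r.test(p)) && !exc.some((r) => r.test(p))
  );
}

// Keep JS/TSX sources, drop anything with "test" in the path.
const files = ["src/index.js", "src/app.tsx", "test/app.test.js", "README.md"];
console.log(filterPaths(files, ["*.js", "*.tsx"], ["*test*"]));
// → ["src/index.js", "src/app.tsx"]
```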

LLM Provider Setup

OpenRouter (Recommended)

  • Sign up at openrouter.ai

  • Get your API key from the dashboard

  • Access 100+ models, including Claude, GPT-4, and Gemini

Google Gemini

  • Get an API key from Google AI Studio

  • Use models like gemini-2.5-pro or gemini-2.5-flash

Anthropic Claude

  • Get an API key from the Anthropic Console

  • Use models such as claude-3.5-sonnet

OpenAI

  • Get an API key from the OpenAI platform dashboard

  • Use models such as gpt-4o or gpt-4o-mini

How It Works

The PocketFlow methodology follows a 6-step process:

  1. Repository Fetching - Downloads and filters code files based on patterns

  2. Abstraction Identification - Uses AI to identify 5-10 core concepts in the codebase

  3. Relationship Analysis - Maps how abstractions interact with each other

  4. Chapter Ordering - Determines the optimal learning sequence

  5. Chapter Writing - Generates detailed, beginner-friendly explanations for each concept

  6. Tutorial Compilation - Combines everything into a cohesive tutorial with diagrams
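
The six steps above can be sketched as a small pipeline. The function names below are hypothetical (they are not the server's internal API), and the LLM call is stubbed with canned answers so the sketch runs without an API key:

```javascript
// Stub LLM: returns canned answers instead of calling a real provider.
const llm = (prompt) =>
  prompt.startsWith("identify") ? "Parser, Renderer" : `Explanation of: ${prompt}`;

const fetchFiles = () => ["parser.js", "renderer.js"];                       // 1. fetch & filter
const identify = (files) => llm(`identify ${files.join(",")}`).split(", ");  // 2. abstractions
const relate = (abs) => abs.slice(1).map((a, i) => [abs[i], a]);             // 3. relationships
const order = (abs) => [...abs];                                             // 4. ordering (trivial here)
const write = (a) => `## ${a}\n${llm(`explain ${a}`)}`;                      // 5. chapter per concept
const compile = (rels, chapters) =>                                          // 6. compile tutorial
  `Relationships: ${rels.map(([x, y]) => `${x}->${y}`).join(", ")}\n\n` +
  chapters.join("\n\n");

const abstractions = identify(fetchFiles());
const tutorial = compile(relate(abstractions), order(abstractions).map(write));
console.log(tutorial);
```

In the real pipeline each step is a separate LLM call, and caching (see `use_cache`) avoids repeating those calls on subsequent runs.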

Example Output

The generated tutorial includes:

  • Index Page with project overview and visual architecture diagram

  • Individual Chapters for each core abstraction

  • Mermaid Diagrams showing relationships and workflows

  • Code Examples with detailed explanations

  • Cross-References between related concepts

  • Beginner-Friendly Language with analogies and examples

Configuration Options

| Parameter | Description | Default |
|---|---|---|
| `repo_url` | GitHub repository URL | Required |
| `llm_provider` | AI provider (`openrouter`, `google`, `anthropic`, `openai`) | Required |
| `api_key` | API key for the LLM provider | Required |
| `model` | Specific model to use | Provider default |
| `max_abstractions` | Number of key concepts to identify | 10 |
| `language` | Tutorial language | `"english"` |
| `include_patterns` | File patterns to analyze | Common code files |
| `exclude_patterns` | File patterns to skip | Tests, docs, builds |
| `max_file_size` | Maximum file size in bytes | 100000 |
| `use_cache` | Enable LLM response caching | true |
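
Putting the table together, a fully specified options object might look like the following (the repository URL and API key are placeholders; the optional fields are shown with their documented defaults):

```javascript
// Example options for analyze_github_repository: three required fields
// plus the optional fields at their documented defaults.
const options = {
  repo_url: "https://github.com/expressjs/express", // required
  llm_provider: "openrouter",                       // required
  api_key: "sk-or-v1-your-key-here",                // required (placeholder)
  model: "anthropic/claude-3.5-sonnet",             // provider default if omitted
  max_abstractions: 10,  // default
  language: "english",   // default
  max_file_size: 100000, // default, in bytes
  use_cache: true,       // default
};
console.log(Object.keys(options).length); // → 8
```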

Contributing

We welcome contributions! Please see our contributing guidelines.

License

MIT License - see LICENSE file for details.

Original PocketFlow

This MCP server is based on the original PocketFlow project by The-Pocket. We've adapted their brilliant methodology to work seamlessly with all MCP-compatible AI assistants.

Support


Ready to transform any codebase into a comprehensive learning resource? Start analyzing repositories with PocketFlow MCP today!

