# Local DeepWiki MCP Server
A local, privacy-focused MCP server that generates DeepWiki-style documentation for private repositories with RAG-based Q&A capabilities.
## Features

- **Multi-language code parsing** using tree-sitter (Python, TypeScript/JavaScript, Go, Rust, Java, C/C++)
- **AST-based chunking** that respects code structure (functions, classes, methods)
- **Semantic search** using the LanceDB vector database
- **LLM-powered wiki generation** with support for Ollama (local), Anthropic, and OpenAI
- **Configurable embeddings** - local (sentence-transformers) or OpenAI
- **Incremental indexing** - only re-processes changed files
- **RAG-based Q&A** - ask questions about your codebase
## Installation
### Using uv (recommended)
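The exact commands aren't reproduced here; a plausible setup, assuming the package is distributed as `local-deepwiki`:

```bash
# Install as a standalone tool (package name assumed to be local-deepwiki)
uv tool install local-deepwiki

# Or, from a clone of the repository, sync the project environment
uv sync
```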
### Using pip
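Likewise a sketch, under the same package-name assumption:

```bash
# From PyPI, if published under this name
pip install local-deepwiki

# Or from a clone, in editable mode
pip install -e .
```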
## Configuration
Create a config file at `~/.config/local-deepwiki/config.yaml`:
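The schema isn't shown here; an illustrative config covering the documented options (all key names are assumptions):

```yaml
# Hypothetical schema - adjust key names to the actual config
llm:
  provider: ollama            # ollama | anthropic | openai
  model: llama3.2
embeddings:
  provider: local             # local (sentence-transformers) | openai
  model: all-MiniLM-L6-v2
indexing:
  incremental: true           # only re-process changed files
```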
## Claude Code Integration
Add to your Claude Code MCP config (`~/.claude/claude_code_config.json`):
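A sketch using the standard `mcpServers` layout; the command assumes an entry point named `local-deepwiki` runnable via `uvx`:

```json
{
  "mcpServers": {
    "local-deepwiki": {
      "command": "uvx",
      "args": ["local-deepwiki"]
    }
  }
}
```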
## MCP Tools

### `index_repository`
Index a repository and generate wiki documentation.
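An illustrative `tools/call` payload (the `path` argument name is an assumption):

```json
{
  "name": "index_repository",
  "arguments": {
    "path": "/path/to/your/repo"
  }
}
```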
### `ask_question`
Ask a question about the codebase using RAG.
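For example (the `question` argument name is assumed):

```json
{
  "name": "ask_question",
  "arguments": {
    "question": "Where is the LanceDB index created?"
  }
}
```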
### `read_wiki_structure`
Get the wiki table of contents.
### `read_wiki_page`
Read a specific wiki page.
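For example (the `page` argument name is an assumption):

```json
{
  "name": "read_wiki_page",
  "arguments": {
    "page": "getting-started"
  }
}
```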
### `search_code`
Semantic search across the codebase.
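For example (`query` and `limit` are assumed argument names):

```json
{
  "name": "search_code",
  "arguments": {
    "query": "vector store initialization",
    "limit": 5
  }
}
```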
## Environment Variables

- `ANTHROPIC_API_KEY` - Required for the Anthropic LLM provider
- `OPENAI_API_KEY` - Required for OpenAI LLM/embedding providers
## Prerequisites

For local LLM support:

- Ollama installed and running
- A model pulled (e.g., `ollama pull llama3.2`)
## Development
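A typical workflow, assuming a uv-managed project with a pytest suite (a sketch, not confirmed by this README):

```bash
# From a clone of the repository
uv sync --all-extras   # install runtime and dev dependencies
uv run pytest          # run the test suite
```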
## Architecture
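A rough sketch of the pipeline implied by the features above (not a definitive component list):

```
repository files
  -> tree-sitter parsers (Python, TS/JS, Go, Rust, Java, C/C++)
  -> AST-aware chunks (functions, classes, methods)
  -> embeddings (sentence-transformers or OpenAI)
  -> LanceDB vector index
  -> retrieval + LLM (Ollama / Anthropic / OpenAI)
  -> wiki pages and RAG answers
```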
## License
MIT