🦘 Clangaroo: Fast C++ code intelligence for LLMs via MCP
✨ About
NOTE (January 2026): Claude Code now has built-in support for LSPs, making this unnecessary. Since it may still be useful in other agentic harnesses, I'll leave the project here for now.
Clangaroo enables Claude Code, Gemini CLI, and other coding agents to jump around your C++ codebase with ease. It provides fast, direct lookup of C/C++ symbols, functions, definitions, call hierarchies, type hierarchies, and more for your bestest LLM pals.
Clangaroo combines the speed of Tree-sitter parsing with the accuracy of the clangd LSP, optionally enhanced by Google Gemini Flash AI for deeper insights. Let your AI buddies spend more time coding and less time stumbling around.
But WHY did you make this? I ❤️ using Claude Code, but every time it auto-compacts and then starts grepping around for the function we've been working on forever, I die a little bit inside. But aren't there already a few MCPs that do this - why do we need another? I spent some time searching and found both MCP-language-server and Serena, which both look perfectly nice! Unfortunately, neither worked for me.
Clangaroo is meant to be super simple and is intended to 'just work'.
📖 Table of Contents
🚀 Quick Start
1. Install Clangaroo
2. Special compilation step for your C++ project
The clangd LSP needs you to do this once for your project:
This will create a special compile_commands.json file in your project root.
3. Configure Claude Desktop or other MCP client
Did you know you can now add MCP servers to LM Studio?
Note: Replace 'command' and 'project' with the correct paths for your system, and replace your-google-ai-api-key with your API key (if using one). If you don't want the AI-enhanced features, simply leave out all the --ai options and the API key.
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
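The original JSON snippet isn't preserved above; as a sketch, a Claude Desktop entry for Clangaroo using the standard mcpServers format might look like the following (the binary path, project path, and API key are placeholders, and the flags are taken from the Configuration Reference below):

```json
{
  "mcpServers": {
    "clangaroo": {
      "command": "/usr/local/bin/clangaroo",
      "args": ["--project", "/path/to/your/cpp/project", "--warmup", "--log-level", "info"],
      "env": {
        "CLANGAROO_AI_API_KEY": "your-google-ai-api-key"
      }
    }
  }
}
```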
Default depth of AI analysis (--ai-analysis-level, default: summary):
summary: Quick overview with key points
detailed: Comprehensive analysis with examples and context
Default depth of context (--ai-context-level, default: minimal):
minimal: Just the symbol and immediate documentation
local: Include surrounding code in the same file
full: Include dependencies and related files
4. Restart Claude Desktop
Quit and restart Claude. You're ready to explore your C++ code!
5. Add MCP server to Claude Code
claude mcp add-from-claude-desktop (and make sure clangaroo is checked)
OR
claude mcp add /usr/local/bin/clangaroo --project /path/to/your/cpp/project --warmup --warmup-limit 10 --log-level info --ai-enabled --ai-provider gemini-2.5-flash --ai-cache-days 14 --ai-cost-limit 15.0 --call-hierarchy-depth 10 --ai-analysis-level summary --ai-context-level minimal --name clangaroo --env CLANGAROO_AI_API_KEY=your-google-ai-api-key
🎯 Features
⚡ Ultra-Fast Navigation: Fast response times for code structure queries
🔍 Smart Symbol Search: Hybrid Tree-sitter + clangd search with automatic fallback
🔗 Deep Code Analysis: Call hierarchies, type hierarchies, and reference tracking
🤖 AI-Powered Insights: Documentation summarization, pattern detection, and architectural analysis
💪 Robust: Works even with compilation errors thanks to Tree-sitter fallback
📦 Zero Configuration: Just point to a project with compile_commands.json
💬 Usage Examples
This is really meant for coding agents like Claude Code more than for you, but if you want to use it yourself, just talk to your LLM naturally about your code once the MCP server is hooked up:
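The original example prompts aren't preserved above; queries in this spirit (the function and class names here are invented) would be:

```text
"Where is HttpServer::start defined?"
"Show me the full call hierarchy for process_event"
"Which classes derive from BaseHandler?"
"Find all references to parse_config"
```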
🛠️ Available Tools

| Tool Category | Tools | Description |
| --- | --- | --- |
| 🔍 Discovery | | Find files and symbols in your codebase |
| 📍 Navigation | | Jump to definitions, find references, get type info |
| 📞 Call Analysis | | Trace function relationships |
| 🏛️ Type Hierarchy | | Analyze inheritance |
| ⚡ Structure | | Fast structural analysis |
🤖 AI Features (Optional)
Setup
Get your API key from Google AI Studio
Add to your environment (bash):
export CLANGAROO_AI_API_KEY="your-api-key"
What You Get
📚 Smart Documentation: Complex C++ docs explained clearly
🔍 Pattern Analysis: Understand why and how functions are called
🏛️ Architecture Insights: Identify design patterns automatically
💡 Refactoring Tips: Get improvement recommendations
💰 Cost Effective: $3-7/month typical usage with smart caching
⚙️ Configuration Reference
Basic Options
--project PATH - Path to C++ project root (required)
--log-level LEVEL - Logging verbosity: debug, info, warning, error
--timeout SECONDS - LSP request timeout (default: 5.0)
Performance Options
--warmup - Pre-warm the index by opening key files
--warmup-limit N - Number of files to warm up (default: 10)
--wait-for-index - Wait for clangd indexing to complete
--index-timeout SECONDS - Timeout for index wait (default: 300)
--index-path PATH - Custom clangd index location
AI Options
--ai-enabled - Enable AI features
--ai-provider PROVIDER - AI provider: gemini-2.5-flash or gemini-2.5-flash-lite
--ai-api-key KEY - Google AI API key
--ai-cache-days DAYS - Cache AI summaries for N days (default: 7)
--ai-cost-limit AMOUNT - Monthly cost limit in USD (default: 10.0)
--ai-analysis-level LEVEL - Default analysis depth: summary or detailed
--ai-context-level LEVEL - Code context depth: minimal, local, or full
Call Hierarchy Options
--call-hierarchy-depth DEPTH - Maximum depth (1-10, default: 3)
--call-hierarchy-max-calls NUM - Total call limit (default: 100)
--call-hierarchy-per-level NUM - Calls per depth level (default: 25)
📋 Requirements
Python 3.10+
clangd 16+ (brew install llvm or apt install clangd)
C++ project with compile_commands.json
(Optional) Google AI API key for AI features
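For reference, compile_commands.json follows the standard Clang compilation-database format: a JSON array with one entry per translation unit. A minimal illustrative entry (the paths here are invented) looks like:

```json
[
  {
    "directory": "/path/to/your/cpp/project/build",
    "command": "/usr/bin/clang++ -Iinclude -std=c++17 -c ../src/main.cpp",
    "file": "../src/main.cpp"
  }
]
```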
🔧 Troubleshooting
Check the config file location and JSON syntax
Use absolute paths in the configuration
Restart Claude Desktop completely
Check logs with --log-level debug
Verify compile_commands.json includes the files
Wait for indexing: add the --wait-for-index flag
Test clangd directly: clangd --check=file.cpp
Enable warmup: --warmup --warmup-limit 30
Use a shared index: --index-path /shared/clangd-index
Reduce call hierarchy depth for large codebases
📄 License
MIT License - see the file for details.
🙏 Acknowledgments
clangd for C++ language server
Tree-sitter for syntax parsing
MCP for the protocol specification
Google Gemini for AI capabilities