@autodev/codebase
A vector-embedding-based code semantic search tool with an MCP server and multi-model integration. It can also be used as a pure CLI tool. Supports Ollama for fully local embedding and reranking, enabling complete offline operation and privacy protection for your code repository.
🚀 Features
🔍 Semantic Code Search: Vector-based search using advanced embedding models
🔗 Call Graph Analysis: Trace function call relationships and execution paths
🌐 MCP Server: HTTP-based MCP server with SSE and stdio adapters
💻 Pure CLI Tool: Standalone command-line interface without GUI dependencies
⚙️ Layered Configuration: CLI, project, and global config management
🎯 Advanced Path Filtering: Glob patterns with brace expansion and exclusions
🌲 Tree-sitter Parsing: Support for 40+ programming languages
💾 Qdrant Integration: High-performance vector database
🔄 Multiple Providers: OpenAI, Ollama, Jina, Gemini, Mistral, OpenRouter, Vercel
📊 Real-time Watching: Automatic index updates
⚡ Batch Processing: Efficient parallel processing
📝 Code Outline Extraction: Generate structured code outlines with AI summaries
💨 Dependency Analysis Cache: Intelligent caching for 10-50x faster re-analysis
Related MCP server: code-index-mcp
📦 Installation
1. Dependencies
2. Qdrant
3. Install
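The installation commands did not survive in this copy. A minimal setup sketch follows: the Qdrant image and ports are Qdrant's standard Docker defaults, while the global npm install is an assumption based on the package name `@autodev/codebase` above.

```shell
# Start a local Qdrant instance (standard image and default ports)
docker run -d -p 6333:6333 -p 6334:6334 qdrant/qdrant

# Install the CLI globally (assumes the package is published as @autodev/codebase)
npm install -g @autodev/codebase
```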
🛠️ Quick Start
📋 Commands
📝 Code Outlines
🔗 Call Graph Analysis
Query Patterns:
Exact match: `--query="functionName"` or `--query="*ClassName.methodName"`
Wildcards: `*` (any characters), `?` (single character)
Examples: `--query="get*"`, `--query="*User*"`, `--query="*.*.get*"`
Single function: `--query="main"` - Shows the call tree (upward + downward). Default depth: 3 (avoids excessive output)
Multiple functions: `--query="main,helper"` - Analyzes connection paths between the functions. Default depth: 10 (deeper search needed for path finding)
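The query patterns above can be sketched as concrete invocations. These are illustrative only: the `call` command, `--query`, and `--depth` appear in this README's argument list, but the binary name `codebase` is an assumption, not confirmed here.

```shell
# Single-function analysis: callers and callees, default depth 3
codebase call --query="main"

# Wildcard match on getter-like functions, with an explicit depth
codebase call --query="get*" --depth 5

# Path finding between two functions (default depth 10)
codebase call --query="main,helper"
```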
Supported Languages:
TypeScript/JavaScript (.ts, .tsx, .js, .jsx)
Python (.py)
Java (.java)
C/C++ (.c, .h, .cpp, .cc, .cxx, .hpp, .hxx, .c++)
C# (.cs)
Rust (.rs)
Go (.go)
🔍 Indexing & Search
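The examples for this section are missing in this copy. A plausible sketch, assuming the CLI is invoked as `codebase` (an assumption) and using the `index`, `search`, `--path`, `--limit`, and `--min-score` options documented under Key CLI Arguments below:

```shell
# Build the vector index for a project
codebase index --path=./my-project

# Semantic search with a result limit and a minimum similarity score
codebase search "user authentication" --limit 10 --min-score 0.4
```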
🌐 MCP Server
⚙️ Configuration
🚀 Advanced Features
🔍 LLM-Powered Search Reranking
Enable LLM reranking to dramatically improve search relevance:
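The configuration snippet for enabling reranking is missing here. The fragment below is a sketch only: `rerankerMinScore` is named in this README, but the other key names are hypothetical placeholders — see CONFIG.md for the authoritative schema.

```json
{
  "rerankerEnabled": true,
  "rerankerMinScore": 7
}
```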
Benefits:
🎯 Higher precision: LLM understands semantic relevance beyond vector similarity
📊 Smart scoring: Results are reranked on a 0-10 scale based on query relevance
⚡ Batch processing: Efficiently handles large result sets with configurable batch sizes
🎛️ Threshold control: Filter results with `rerankerMinScore` to keep only high-quality matches
Path Filtering & Export
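The examples for this section did not survive in this copy. A hedged sketch of glob filtering with brace expansion, using the documented `outline <pattern>` command (which supports glob patterns) and assuming the binary name `codebase`:

```shell
# Extract outlines for all TypeScript sources, using brace expansion
codebase outline "src/**/*.{ts,tsx}"

# Same pattern, with AI summaries enabled
codebase outline "src/**/*.{ts,tsx}" --summarize
```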
⚙️ Configuration
Config Layers (Priority Order)
CLI Arguments - Runtime parameters (`--path`, `--config`, `--log-level`, `--force`, etc.)
Project Config - `./autodev-config.json` (or a custom path via `--config`)
Global Config - `~/.autodev-cache/autodev-config.json`
Built-in Defaults - Fallback values
Note: CLI arguments provide runtime overrides for paths, logging, and operational behavior. For persistent configuration (`embedderProvider`, API keys, search parameters), use `config --set` to save values to the config files.
Common Config Examples
Ollama:
OpenAI:
OpenAI-Compatible:
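The concrete config blocks for these providers are missing in this copy. Below is an illustrative `autodev-config.json` for the Ollama case: `embedderProvider` is named in this README, but the other keys and the `nomic-embed-text` model choice are assumptions — consult CONFIG.md for the real key names.

```json
{
  "embedderProvider": "ollama",
  "ollamaBaseUrl": "http://localhost:11434",
  "ollamaModelId": "nomic-embed-text",
  "qdrantUrl": "http://localhost:6333"
}
```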
Key Configuration Options
| Category | Options | Description |
| --- | --- | --- |
| Embedding | | Provider and model settings |
| API Keys | | Authentication |
| Vector Store | | Qdrant connection |
| Search | | Search behavior |
| Reranker | | Result reranking |
| Summarizer | | AI summary generation |
Key CLI Arguments:
`index` - Index the codebase
`search <query>` - Search the codebase (required positional argument)
`outline <pattern>` - Extract code outlines (supports glob patterns)
`call` - Analyze function call relationships and dependency graphs
`stdio` - Start stdio adapter for MCP
`config` - Manage configuration (use with `--get` or `--set`)
`--serve` - Start MCP HTTP server (use with the `index` command)
`--summarize` - Generate AI summaries for code outlines
`--dry-run` - Preview operations before execution
`--title` - Show only file-level summaries
`--clear-summarize-cache` - Clear all summary caches
`--path`, `--demo`, `--force` - Common options
`--limit`/`-l <number>` - Maximum number of search results (default: from config, max 50)
`--min-score`/`-S <number>` - Minimum similarity score for search results (0-1, default: from config)
`--query <patterns>` - Query patterns for call graph analysis (comma-separated)
`--viz <file>` - Export full dependency data for visualization (cannot be used with `--query`)
`--open` - Open interactive graph viewer
`--depth <number>` - Set analysis depth for call graphs
`--help` - Show all available options
Configuration Commands:
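The command examples under this heading are missing in this copy. A plausible sketch, using the documented `config --get`/`config --set` forms; the binary name `codebase` and the `key=value` syntax for `--set` are assumptions:

```shell
# Show the current (merged) configuration
codebase config --get

# Persist a provider choice to the config file
codebase config --set embedderProvider=ollama

# Target an explicit config file instead of ./autodev-config.json
codebase config --get --config ./autodev-config.json
```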
For complete configuration reference, see CONFIG.md.
🔌 MCP Integration
HTTP Streamable Mode (Recommended)
IDE Config:
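The IDE config snippet is missing here. The fragment below follows the common MCP client convention of an `mcpServers` map pointing at the server's SSE endpoint; the server name, port, and endpoint path are assumptions:

```json
{
  "mcpServers": {
    "codebase": {
      "url": "http://localhost:3001/sse"
    }
  }
}
```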
Stdio Adapter
IDE Config:
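The IDE config snippet is missing here. A sketch following the common MCP client convention for stdio servers: the `stdio` command is documented above, but the binary name `codebase` and the server label are assumptions:

```json
{
  "mcpServers": {
    "codebase": {
      "command": "codebase",
      "args": ["stdio"]
    }
  }
}
```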
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request or open an Issue on GitHub.
📄 License
This project is licensed under the MIT License.
🙏 Acknowledgments
This project is a fork and derivative work based on Roo Code. We've built upon their excellent foundation to create this specialized codebase analysis tool with enhanced features and MCP server capabilities.
🌟 If you find this tool helpful, please give us a star on GitHub!
Made with ❤️ for the developer community