AutoDev Codebase MCP Server

by anrgct

@autodev/codebase

A platform-agnostic code analysis library with semantic search capabilities and MCP (Model Context Protocol) server support. This library provides intelligent code indexing, vector-based semantic search, and can be integrated into various development tools and IDEs.

🚀 Features

  • Semantic Code Search: Vector-based code search using embeddings
  • MCP Server Support: HTTP-based MCP server for IDE integration
  • Terminal UI: Interactive CLI with rich terminal interface
  • Tree-sitter Parsing: Advanced code parsing and analysis
  • Vector Storage: Qdrant vector database integration
  • Flexible Embedding: Support for various embedding models via Ollama

📦 Installation

1. Install and Start Ollama

# Install Ollama (macOS)
brew install ollama

# Start Ollama service
ollama serve

# In a new terminal, pull the embedding model
ollama pull nomic-embed-text

2. Install and Start Qdrant

Start Qdrant using Docker:

# Start Qdrant container
docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant

Or download and run Qdrant directly:

# Download and run Qdrant
wget https://github.com/qdrant/qdrant/releases/latest/download/qdrant-x86_64-unknown-linux-gnu.tar.gz
tar -xzf qdrant-x86_64-unknown-linux-gnu.tar.gz
./qdrant

3. Verify Services Are Running

# Check Ollama
curl http://localhost:11434/api/tags

# Check Qdrant
curl http://localhost:6333/collections

4. Install project locally

git clone https://github.com/anrgct/autodev-codebase
cd autodev-codebase
npm install
npm run build
npm link

🛠️ Usage

Command Line Interface

The CLI provides two main modes:

1. Interactive TUI Mode (Default)
# Basic usage: index your current folder as the codebase.
# Be cautious when running this command if you have a large number of files.
codebase

# With custom options
codebase --demo  # Create a local demo directory and test the indexing service (recommended for first-time setup)
codebase --path=/my/project
codebase --path=/my/project --log-level=info

2. MCP Server Mode
# Start a long-running MCP server
cd /my/project
codebase mcp-server

# With custom configuration
codebase mcp-server --port=3001 --host=localhost
codebase mcp-server --path=/workspace --port=3002

IDE Integration (Cursor/Claude)

Configure your IDE to connect to the MCP server:

{
  "mcpServers": {
    "codebase": {
      "url": "http://localhost:3001/sse"
    }
  }
}

Library Usage

Node.js Usage
import { createNodeDependencies } from '@autodev/codebase/adapters/nodejs'
import { CodeIndexManager } from '@autodev/codebase'

const deps = createNodeDependencies({
  workspacePath: '/path/to/project',
  storageOptions: { /* ... */ },
  loggerOptions: { /* ... */ },
  configOptions: { /* ... */ }
})

const manager = CodeIndexManager.getInstance(deps)
await manager.initialize()
await manager.startIndexing()
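
The snippet above covers the documented entry points; the sketch below simply wraps them in an async entry point with basic error handling. The commented-out query at the end is hypothetical and only marks where a search would go once indexing completes; check the package's typings for the actual search API.

// Sketch only: wires the documented Node.js adapter into a runnable script.
import { createNodeDependencies } from '@autodev/codebase/adapters/nodejs'
import { CodeIndexManager } from '@autodev/codebase'

async function main() {
  const deps = createNodeDependencies({
    workspacePath: process.cwd(),   // index the project you run this from
    storageOptions: { /* ... */ },  // left elided, as in the example above
    loggerOptions: { /* ... */ },
    configOptions: { /* ... */ }
  })

  const manager = CodeIndexManager.getInstance(deps)
  await manager.initialize()
  await manager.startIndexing()

  // Hypothetical: the actual search method name and result shape are not documented here.
  // const results = await manager.searchIndex('http router setup', 10)
}

main().catch((err) => {
  console.error('Indexing failed:', err)
  process.exit(1)
})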

🔧 CLI Options

Global Options

  • --path=<path> - Workspace path (default: current directory)
  • --demo - Create demo files in workspace
  • --ollama-url=<url> - Ollama API URL (default: http://localhost:11434)
  • --qdrant-url=<url> - Qdrant vector DB URL (default: http://localhost:6333)
  • --model=<model> - Embedding model (default: nomic-embed-text)
  • --config=<path> - Config file path
  • --storage=<path> - Storage directory path
  • --cache=<path> - Cache directory path
  • --log-level=<level> - Log level: error|warn|info|debug (default: error)
  • --help, -h - Show help

MCP Server Options

  • --port=<port> - HTTP server port (default: 3001)
  • --host=<host> - HTTP server host (default: localhost)

🌐 MCP Server Features

Web Interface

  • Home Page: http://localhost:3001 - Server status and configuration
  • Health Check: http://localhost:3001/health - JSON status endpoint
  • MCP Endpoint: http://localhost:3001/sse - SSE/HTTP MCP protocol endpoint
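
If you want to poll these endpoints programmatically (for example from a deployment script), a minimal TypeScript probe might look like the sketch below. It only assumes the documented URLs and Node 18+ for the global fetch; the exact JSON shape returned by /health is not specified here, so the response is logged as-is.

// Minimal health probe against the documented endpoints (default host/port assumed).
const base = 'http://localhost:3001'

const health = await fetch(`${base}/health`)
if (!health.ok) {
  throw new Error(`MCP server health check failed: HTTP ${health.status}`)
}

// The response shape isn't documented here, so just print whatever comes back.
console.log('MCP server health:', await health.json())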

Available MCP Tools

  • search_codebase - Semantic search through your codebase
    • Parameters: query (string), limit (number), filters (object)
    • Returns: Formatted search results with file paths, scores, and code blocks
  • get_search_stats - Get indexing status and statistics
  • configure_search - Configure search parameters at runtime
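
For clients other than Cursor/Claude, these tools can be called directly over the SSE endpoint. The sketch below uses the @modelcontextprotocol/sdk TypeScript client to call search_codebase; that SDK is an assumption here (it is not part of this project), and the argument names simply mirror the parameters listed above.

// Hedged sketch: calling search_codebase via the MCP TypeScript SDK (assumed dependency).
import { Client } from '@modelcontextprotocol/sdk/client/index.js'
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js'

const transport = new SSEClientTransport(new URL('http://localhost:3001/sse'))
const client = new Client({ name: 'codebase-example-client', version: '0.0.1' })

await client.connect(transport)

// Parameters follow the tool description above: query (string), limit (number), filters (object).
const result = await client.callTool({
  name: 'search_codebase',
  arguments: { query: 'where are embeddings generated?', limit: 5 }
})

console.log(JSON.stringify(result, null, 2))
await client.close()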

Scripts

# Development mode with demo files
npm run dev

# Build for production
npm run build

# Type checking
npm run type-check

# Run TUI demo
npm run demo-tui

# Start MCP server demo
npm run mcp-server

💡 Why Use MCP Server Mode?

Problems Solved

  • ❌ Repeated Indexing: Every IDE connection re-indexes, wasting time and resources
  • ❌ Complex Configuration: Each project needs different path parameters in the IDE
  • ❌ Resource Waste: Multiple IDE windows start multiple server instances

Benefits

  • ✅ One-time Indexing: Server runs long-term, index persists
  • ✅ Simplified Configuration: Universal IDE configuration, no project-specific paths
  • ✅ Resource Efficiency: One server instance per project
  • ✅ Better Developer Experience: Start the server from the project directory
  • ✅ Backward Compatible: Still supports traditional per-connection mode
  • ✅ Web Interface: Status monitoring and configuration help
  • ✅ Dual Mode: Can run both TUI and MCP server simultaneously

This is a platform-agnostic library extracted from the roo-code VSCode plugin.

📚 Examples

See the examples/ directory for complete usage examples:

  • nodejs-usage.ts - Node.js integration examples
  • run-demo-tui.tsx - TUI demo application

Related MCP Servers

  • A smart code retrieval tool based on Model Context Protocol that provides efficient and accurate code repository search capabilities for large language models. (Python)
  • A Model Context Protocol (MCP) server that helps large language models index, search, and analyze code repositories with minimal setup. (Python, MIT License)
  • A Model Context Protocol server that enables semantic search capabilities by providing tools to manage Qdrant vector database collections, process and embed documents using various embedding services, and perform semantic searches across vector embeddings. (TypeScript, MIT License)
  • A flexible Model Context Protocol server that makes documentation or codebases searchable by AI assistants, allowing users to chat with code or docs by simply pointing to a git repository or folder. (JavaScript, MIT License)

View all related MCP servers

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/anrgct/autodev-codebase'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.