CodeAlive MCP: Deepest Context Engine for your projects (especially for large codebases)
Connect your AI assistant to CodeAlive's powerful code understanding platform in seconds!
This MCP (Model Context Protocol) server enables AI clients like Claude Code, Cursor, Claude Desktop, Continue, VS Code (GitHub Copilot), Cline, Codex, OpenCode, Qwen Code, Gemini CLI, Roo Code, Goose, Kilo Code, Windsurf, Kiro, Qoder, n8n, and Amazon Q Developer to access CodeAlive's advanced semantic code search and codebase interaction features.
What is CodeAlive?
CodeAlive is the most accurate and comprehensive Context Engine as a service, optimized for large codebases, powered by advanced GraphRAG, and accessible via MCP. It enriches the context for AI agents such as Cursor, Claude Code, and Codex, making them 35% more efficient and up to 84% faster.
It's like Context7, but for your (large) codebases.
It allows AI-Coding Agents to:
Find relevant code faster with semantic search
Understand the bigger picture beyond isolated files
Provide better answers with full project context
Reduce costs and time by removing guesswork
Related MCP server: Codebase MCP
🛠 Available Tools
Once connected, you'll have access to these powerful tools:
- `get_data_sources` - List your indexed repositories and workspaces
- `codebase_search` - Semantic code search across your indexed codebase (main/master branch)
- `codebase_consultant` - AI consultant with full project expertise
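These tools are exposed over the standard MCP protocol, so any MCP client can call them programmatically, not just the AI assistants listed above. As an illustration (not part of this repo), here is a minimal sketch using the official MCP Python SDK (`pip install mcp`); the helper function and overall flow are assumptions based on the SDK's Streamable HTTP client, not CodeAlive code:

```python
import asyncio

API_KEY = "YOUR_API_KEY_HERE"  # from https://app.codealive.ai/

def auth_headers(api_key: str) -> dict:
    # CodeAlive's remote endpoint authenticates with a Bearer token.
    return {"Authorization": f"Bearer {api_key}"}

async def list_codealive_tools() -> list[str]:
    # Imported lazily so auth_headers() is usable without the SDK installed.
    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client

    async with streamablehttp_client(
        "https://mcp.codealive.ai/api", headers=auth_headers(API_KEY)
    ) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Expect the three tools listed above.
            return [tool.name for tool in tools.tools]

# Network call is skipped while the placeholder key is in place.
if API_KEY != "YOUR_API_KEY_HERE":
    print(asyncio.run(list_codealive_tools()))
```

Replace `API_KEY` with a real key to run the listing against the hosted endpoint.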
🎯 Usage Examples
After setup, try these commands with your AI assistant:
- "Show me all available repositories" → Uses `get_data_sources`
- "Find authentication code in the user service" → Uses `codebase_search`
- "Explain how the payment flow works in this codebase" → Uses `codebase_consultant`
📚 Agent Skill
For an even better experience, install the CodeAlive Agent Skill alongside the MCP server. The MCP server gives your agent access to CodeAlive's tools; the skill teaches it the best workflows and query patterns to use them effectively.
For most agents (Cursor, Copilot, Gemini CLI, Codex, and 30+ others) — install the skill:
npx skills add CodeAlive-AI/codealive-skills@codealive-context-engine

For Claude Code — install the plugin (recommended), which includes the skill plus Claude-specific enhancements:

/plugin marketplace add CodeAlive-AI/codealive-skills
/plugin install codealive@codealive-marketplace
🚀 Quick Start (Remote)
The fastest way to get started - no installation required! Our remote MCP server at https://mcp.codealive.ai/api provides instant access to CodeAlive's capabilities.
Step 1: Get Your API Key
Sign up at https://app.codealive.ai/
Navigate to MCP & API
Click "+ Create API Key"
Copy your API key immediately - you won't see it again!
Step 2: Choose Your AI Client
Select your preferred AI client below for instant setup:
🚀 Quick Start (Agentic Installation)
You may ask your AI agent to install the CodeAlive MCP server for you.
Copy-Paste the following prompt into your AI agent (remember to insert your API key):
Here is CodeAlive API key: PASTE_YOUR_API_KEY_HERE
Add the CodeAlive MCP server by following the installation guide from the README at https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/README.md
Find the section "AI Client Integrations" and locate your client (Claude Code, Cursor, Gemini CLI, etc.). Each client has specific setup instructions:
- For Gemini CLI: Use the one-command setup with `gemini mcp add`
- For Claude Code: Use `claude mcp add` with the --transport http flag
- For other clients: Follow the configuration snippets provided
Prefer the Remote HTTP option when available. If the API key is not provided above, help me issue a CodeAlive API key first.

Then allow execution.
Restart your AI agent.
🤖 AI Client Integrations
Option 1: Remote HTTP (Recommended)
claude mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

Option 2: Docker (STDIO)

claude mcp add codealive-docker /usr/bin/docker run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.3.0

Replace YOUR_API_KEY_HERE with your actual API key.
Option 1: Remote HTTP (Recommended)
Open Cursor → Settings (`Cmd+,` or `Ctrl+,`)
Navigate to "MCP" in the left panel
Click "Add new MCP server"
Paste this configuration:
{
"mcpServers": {
"codealive": {
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

Save and restart Cursor
Option 2: Docker (STDIO)
{
"mcpServers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

OpenAI Codex CLI supports MCP via ~/.codex/config.toml.
~/.codex/config.toml
[mcp_servers.codealive]
command = "docker"
args = ["run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"]

Experimental: Streamable HTTP (requires experimental_use_rmcp_client)
Note: Streamable HTTP support requires enabling the experimental Rust MCP client in your Codex configuration.
[mcp_servers.codealive]
url = "https://mcp.codealive.ai/api"
headers = { Authorization = "Bearer YOUR_API_KEY_HERE" }

One command setup (complete):
gemini mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

Replace YOUR_API_KEY_HERE with your actual API key. That's it - no config files needed! 🎉
Option 1: Remote HTTP (Recommended)
Create/edit `.continue/config.yaml` in your project or `~/.continue/config.yaml`
Add this configuration:
mcpServers:
  - name: CodeAlive
    type: streamable-http
    url: https://mcp.codealive.ai/api
    requestOptions:
      headers:
        Authorization: "Bearer YOUR_API_KEY_HERE"

Restart VS Code
Option 2: Docker (STDIO)
mcpServers:
  - name: CodeAlive
    type: stdio
    command: docker
    args:
      - run
      - --rm
      - -i
      - -e
      - CODEALIVE_API_KEY=YOUR_API_KEY_HERE
      - ghcr.io/codealive-ai/codealive-mcp:v0.3.0

Option 1: Remote HTTP (Recommended)
Note: VS Code supports both Streamable HTTP and SSE transports, with automatic fallback to SSE if Streamable HTTP fails.
Open Command Palette (`Ctrl+Shift+P` or `Cmd+Shift+P`)
Run "MCP: Add Server"
Choose "HTTP" server type
Enter this configuration:
{
"servers": {
"codealive": {
"type": "http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

Restart VS Code
Option 2: Docker (STDIO)
Create .vscode/mcp.json in your workspace:
{
"servers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

Note: Claude Desktop remote MCP requires OAuth authentication. Use the Docker option for Bearer token support.
Docker (STDIO)
Edit your config file:
macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
Windows: `%APPDATA%\Claude\claude_desktop_config.json`
Add this configuration:
{
"mcpServers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

Restart Claude Desktop
Option 1: Remote HTTP (Recommended)
Open Cline extension in VS Code
Click the MCP Servers icon to configure
Add this configuration to your MCP settings:
{
"mcpServers": {
"codealive": {
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

Save and restart VS Code
Option 2: Docker (STDIO)
{
"mcpServers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

Add CodeAlive as a remote MCP server in your opencode.json.
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"codealive": {
"type": "remote",
"url": "https://mcp.codealive.ai/api",
"enabled": true,
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

Qwen Code supports MCP via mcpServers in its settings.json and multiple transports (stdio/SSE/streamable-http). Use streamable-http when available; otherwise use Docker (stdio).
~/.qwen/settings.json
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"url": "https://mcp.codealive.ai/api",
"requestOptions": {
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
}

Fallback: Docker (stdio)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": ["run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"]
}
}
}

Roo Code reads a JSON settings file similar to Cline.
Global config: mcp_settings.json (Roo) or cline_mcp_settings.json (Cline-style)
Option A — Remote HTTP
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

Option B — Docker (STDIO)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

Tip: If your Roo build doesn't honor HTTP headers, use the Docker/STDIO option.
UI path: Settings → MCP Servers → Add → choose Streamable HTTP
Streamable HTTP configuration:
Name: `codealive`
Endpoint URL: `https://mcp.codealive.ai/api`
Headers: `Authorization: Bearer YOUR_API_KEY_HERE`
Docker (STDIO) alternative:
Add a STDIO extension with:
Command: `docker`
Args: `run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.3.0`
UI path: Manage → Integrations → Model Context Protocol (MCP) → Add Server
HTTP
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

STDIO (Docker)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

File: ~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"serverUrl": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

Note: Kiro does not yet support remote MCP servers natively. Use the `mcp-remote` workaround to connect to remote HTTP servers.
Prerequisites:
npm install -g mcp-remote

UI path: Settings → MCP → Add Server
Global file: ~/.kiro/settings/mcp.json
Workspace file: .kiro/settings/mcp.json
Remote HTTP (via mcp-remote workaround)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "npx",
"args": [
"mcp-remote",
"https://mcp.codealive.ai/api",
"--header",
"Authorization: Bearer ${CODEALIVE_API_KEY}"
],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}
}
}

Docker (STDIO)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

UI path: User icon → Qoder Settings → MCP → My Servers → + Add (Agent mode)
SSE (remote HTTP)
{
"mcpServers": {
"codealive": {
"type": "sse",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

STDIO (Docker)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

Q Developer CLI
Config file: ~/.aws/amazonq/mcp.json or workspace .amazonq/mcp.json
HTTP server
{
"mcpServers": {
"codealive": {
"type": "http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

STDIO (Docker)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}

Q Developer IDE (VS Code / JetBrains)
Global: ~/.aws/amazonq/agents/default.json
Local (workspace): .aws/amazonq/agents/default.json
Minimal entry (HTTP):
{
"mcpServers": {
"codealive": {
"type": "http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
},
"timeout": 310000
}
}
}

Use the IDE UI: Q panel → Chat → tools icon → Add MCP Server → choose http or stdio.
Note: JetBrains AI Assistant requires the `mcp-remote` workaround for connecting to remote HTTP MCP servers.
Prerequisites:
npm install -g mcp-remote

Config file: Settings/Preferences → AI Assistant → Model Context Protocol → Configure
Add this configuration:
{
"mcpServers": {
"codealive": {
"command": "npx",
"args": [
"mcp-remote",
"https://mcp.codealive.ai/api",
"--header",
"Authorization: Bearer ${CODEALIVE_API_KEY}"
],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}
}
}For self-hosted deployments, replace the URL:
{
"mcpServers": {
"codealive": {
"command": "npx",
"args": [
"mcp-remote",
"http://your-server:8000/api",
"--header",
"Authorization: Bearer ${CODEALIVE_API_KEY}"
],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}
}
}

See JetBrains MCP Documentation for more details.
Using AI Agent Node with MCP Tools
Add an AI Agent node to your workflow
Configure the agent with MCP tools:
Server URL: `https://mcp.codealive.ai/api`
Authorization Header: `Bearer YOUR_API_KEY_HERE`

The server automatically handles n8n's extra parameters (sessionId, action, chatInput, toolCallId)
Use the three available tools:
- `get_data_sources` - List available repositories
- `codebase_search` - Search code semantically
- `codebase_consultant` - Ask questions about code
Example Workflow:
Trigger → AI Agent (with CodeAlive MCP tools) → Process Response

Note: n8n middleware is built-in, so no special configuration is needed. The server will automatically strip n8n's extra parameters before processing tool calls.
🔧 Advanced: Local Development
For developers who want to customize or contribute to the MCP server.
Prerequisites
Python 3.11+
uv (recommended) or pip
Installation
# Clone the repository
git clone https://github.com/CodeAlive-AI/codealive-mcp.git
cd codealive-mcp
# Setup with uv (recommended)
uv venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
uv pip install -e .
# Or setup with pip
python -m venv .venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
pip install -e .

Local Server Configuration
Once installed locally, configure your AI client to use the local server:
Claude Code (Local)
claude mcp add codealive-local --env CODEALIVE_API_KEY=YOUR_API_KEY_HERE -- /path/to/codealive-mcp/.venv/bin/python /path/to/codealive-mcp/src/codealive_mcp_server.py

Other Clients (Local)
Replace the Docker command and args with:
{
"command": "/path/to/codealive-mcp/.venv/bin/python",
"args": ["/path/to/codealive-mcp/src/codealive_mcp_server.py"],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}

Running HTTP Server Locally
# Start local HTTP server
export CODEALIVE_API_KEY="your_api_key_here"
python src/codealive_mcp_server.py --transport http --host localhost --port 8000
# Test health endpoint
curl http://localhost:8000/health

Testing Your Local Installation
After making changes, quickly verify everything works:
# Quick smoke test (recommended)
make smoke-test
# Or run directly
python smoke_test.py
# With your API key for full testing
CODEALIVE_API_KEY=your_key python smoke_test.py
# Run unit tests
make unit-test
# Run all tests
make test

The smoke test verifies:
Server starts and connects correctly
All tools are registered
Each tool responds appropriately
Parameter validation works
Runs in ~5 seconds
Smithery Installation
Auto-install for Claude Desktop via Smithery:
npx -y @smithery/cli install @CodeAlive-AI/codealive-mcp --client claude

🌐 Community Plugins
Gemini CLI — CodeAlive Extension
Repo: https://github.com/akolotov/gemini-cli-codealive-extension
Gemini CLI extension that wires CodeAlive into your terminal with prebuilt slash commands and MCP config. It includes:
- `GEMINI.md` guidance so Gemini knows how to use CodeAlive tools effectively
- Slash commands: `/codealive:chat`, `/codealive:find`, `/codealive:search`
- Easy setup via Gemini CLI's extension system
Install
gemini extensions install https://github.com/akolotov/gemini-cli-codealive-extension

Configure
# Option 1: .env next to where you run `gemini`
CODEALIVE_API_KEY="your_codealive_api_key_here"
# Option 2: environment variable
export CODEALIVE_API_KEY="your_codealive_api_key_here"
gemini

🚢 HTTP Deployment (Self-Hosted & Cloud)
Deploy the MCP server as an HTTP service for team-wide access or integration with self-hosted CodeAlive instances.
Deployment Options
The CodeAlive MCP server can be deployed as an HTTP service using Docker. This allows multiple AI clients to connect to a single shared instance, and enables integration with self-hosted CodeAlive deployments.
Docker Compose (Recommended)
Create a docker-compose.yml file based on our example:
# Download the example
curl -O https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/docker-compose.example.yml
mv docker-compose.example.yml docker-compose.yml
# Edit configuration (see below)
nano docker-compose.yml
# Start the service
docker compose up -d
# Check health
curl http://localhost:8000/health

Configuration Options:
For CodeAlive Cloud (default):
Remove the `CODEALIVE_BASE_URL` environment variable (uses default `https://app.codealive.ai`)
Clients must provide their API key via the `Authorization: Bearer YOUR_KEY` header
For Self-Hosted CodeAlive:
Set `CODEALIVE_BASE_URL` to your CodeAlive instance URL (e.g., `https://codealive.yourcompany.com`)
Clients must provide their API key via the `Authorization: Bearer YOUR_KEY` header
See docker-compose.example.yml for the complete configuration template.
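Putting the options above together, a compose file might look like the following sketch. This is illustrative only: the image tag, port, and `CODEALIVE_BASE_URL` come from this README, but the `command` flags assume the image accepts the same `--transport`/`--host`/`--port` options as the local server, which may not match the image's actual entrypoint.

```yaml
# Hypothetical docker-compose.yml for a self-hosted CodeAlive instance.
services:
  codealive-mcp:
    image: ghcr.io/codealive-ai/codealive-mcp:v0.3.0
    # Assumes the entrypoint accepts the local server's HTTP flags.
    command: ["--transport", "http", "--host", "0.0.0.0", "--port", "8000"]
    environment:
      # Omit this variable for CodeAlive Cloud (defaults to https://app.codealive.ai).
      CODEALIVE_BASE_URL: "https://codealive.yourcompany.com"
    ports:
      - "8000:8000"
    restart: unless-stopped
```

Clients still authenticate per request with their own `Authorization: Bearer YOUR_KEY` header, so no API key needs to live in the compose file.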
Connecting AI Clients to Your Deployed Instance
Once deployed, configure your AI clients to use your HTTP endpoint:
Claude Code:
claude mcp add --transport http codealive http://your-server:8000/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

VS Code:

code --add-mcp "{\"name\":\"codealive\",\"type\":\"http\",\"url\":\"http://your-server:8000/api\",\"headers\":{\"Authorization\":\"Bearer YOUR_API_KEY_HERE\"}}"

Cursor / Other Clients:
{
"mcpServers": {
"codealive": {
"url": "http://your-server:8000/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}

Replace your-server:8000 with your actual deployment URL and port.
🐞 Troubleshooting
Quick Diagnostics
Test the hosted service: `curl https://mcp.codealive.ai/health`
Check your API key: `curl -H "Authorization: Bearer YOUR_API_KEY" https://app.codealive.ai/api/v1/data_sources`
Enable debug logging: add `--debug` to local server args
Common Issues
"Connection refused" → Check internet connection
"401 Unauthorized" → Verify your API key
"No repositories found" → Check API key permissions in CodeAlive dashboard
Client-specific logs → See your AI client's documentation for MCP logs
Getting Help
📧 Email: support@codealive.ai
🐛 Issues: GitHub Issues
📦 Publishing to MCP Registry
For maintainers: see DEPLOYMENT.md for instructions on publishing new versions to the MCP Registry.
📄 License
MIT License - see LICENSE file for details.
Ready to supercharge your AI assistant with deep code understanding?
Get started now →