Sourcegraph MCP Server

by divar-ir

A Model Context Protocol (MCP) server that provides AI-enhanced code search capabilities using Sourcegraph.

Table of Contents

  • Overview
  • Features
  • Prerequisites
  • Installation
    • Using UV (recommended)
    • Using pip
    • Using Docker
  • Configuration
    • Required Environment Variables
    • Optional Environment Variables
  • Usage with AI Tools
    • Cursor
  • MCP Tools
    • 🔍 search
    • 📖 search_prompt_guide
    • 📂 fetch_content
  • Development
    • Linting and Formatting

Overview

This MCP server integrates with Sourcegraph, a universal code search platform that enables searching across multiple repositories and codebases. It provides powerful search capabilities with advanced query syntax, making it ideal for AI assistants that need to find and understand code patterns across large codebases.

Features

  • Code Search: Search across codebases using Sourcegraph's powerful query language
  • Advanced Query Language: Support for regex patterns, file filters, language filters, and boolean operators (see the example queries after this list)
  • Repository Discovery: Find repositories by name and explore their structure
  • Content Fetching: Browse repository files and directories
  • AI Integration: Designed for LLM integration with guided search prompts
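
To make the query-language bullet concrete, a few example query strings are shown below, wrapped in a Python list purely for readability. These are illustrative assumptions about standard Sourcegraph query syntax; the exact filters and operators available depend on your Sourcegraph version.

# Illustrative Sourcegraph query strings (assumption: standard Sourcegraph
# query syntax; verify against your instance's documentation).
EXAMPLE_QUERIES = [
    r"repo:^github\.com/divar-ir/.* file:\.py$ def main(",     # repo + file filter + literal pattern
    r"lang:python /def\s+\w+_handler\(/",                      # language filter + regex pattern
    "lang:go (context.Canceled OR context.DeadlineExceeded)",  # boolean operator
]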

Prerequisites

  • Sourcegraph Instance: Access to a Sourcegraph instance (either sourcegraph.com or self-hosted)
  • Python 3.10+: Required for running the MCP server
  • UV (optional): Modern Python package manager for easier dependency management

Installation

Using UV (recommended)

# Install dependencies
uv sync

# Run the server
uv run sourcegraph-mcp

Using pip

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install package
pip install -e .

# Run the server
python -m src.main

Using Docker

# Build the image
docker build -t sourcegraph-mcp .

# Run the container with default ports
docker run -p 8000:8000 -p 8080:8080 \
  -e SRC_ENDPOINT=https://sourcegraph.com \
  -e SRC_ACCESS_TOKEN=your-token \
  sourcegraph-mcp

# Or run with custom ports
docker run -p 9000:9000 -p 9080:9080 \
  -e SRC_ENDPOINT=https://sourcegraph.com \
  -e SRC_ACCESS_TOKEN=your-token \
  -e MCP_SSE_PORT=9000 \
  -e MCP_STREAMABLE_HTTP_PORT=9080 \
  sourcegraph-mcp

Configuration

Required Environment Variables

  • SRC_ENDPOINT: Sourcegraph instance URL (e.g., https://sourcegraph.com)

Optional Environment Variables

  • SRC_ACCESS_TOKEN: Authentication token for private Sourcegraph instances
  • MCP_SSE_PORT: SSE server port (default: 8000)
  • MCP_STREAMABLE_HTTP_PORT: HTTP server port (default: 8080)
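
As a rough sketch of how these variables fit together (this is not the server's actual code, just an assumption based on the defaults listed above):

import os

# Required: base URL of the Sourcegraph instance.
SRC_ENDPOINT = os.environ["SRC_ENDPOINT"]

# Optional: token for private instances.
SRC_ACCESS_TOKEN = os.environ.get("SRC_ACCESS_TOKEN")

# Optional: ports for the two transports, with the documented defaults.
MCP_SSE_PORT = int(os.environ.get("MCP_SSE_PORT", "8000"))
MCP_STREAMABLE_HTTP_PORT = int(os.environ.get("MCP_STREAMABLE_HTTP_PORT", "8080"))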

Usage with AI Tools

Cursor

After running the MCP server, add the following to your .cursor/mcp.json file:

{ "mcpServers": { "sourcegraph": { "url": "http://localhost:8080/sourcegraph/mcp/" } } }

MCP Tools

This server provides three powerful tools for AI assistants:

🔍 search

Search across codebases using Sourcegraph's advanced query syntax, with support for regex, language filters, and boolean operators.

📖 search_prompt_guide

Generate a context-aware guide for constructing effective search queries based on your specific objective.

📂 fetch_content

Retrieve file contents or explore directory structures from repositories.
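
For orientation, the sketch below shows one way an MCP client could call these tools programmatically over the streamable HTTP endpoint used in the Cursor config above. It uses the official MCP Python SDK; the argument name passed to call_tool ("query") is an assumption, so inspect the schemas returned by list_tools() for the authoritative parameter names.

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Endpoint taken from the Cursor configuration above.
    async with streamablehttp_client("http://localhost:8080/sourcegraph/mcp/") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the tools and their input schemas exposed by the server.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call the search tool with a Sourcegraph-style query
            # ("query" is an assumed argument name -- check the schema above).
            result = await session.call_tool(
                "search",
                arguments={"query": "lang:python repo:github\\.com/divar-ir/.* def main("},
            )
            print(result.content)


asyncio.run(main())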

Development

Linting and Formatting

# Check code style
uv run ruff check src/

# Format code
uv run ruff format src/


Related MCP Servers

  • Provides code generation and completion capabilities using the DeepSeek API, with support for tool chaining and cost optimization. (JavaScript)
  • A server that allows AI assistants to search for research papers, read their content, and access related code repositories through the PapersWithCode API. (Python, MIT License)
  • A local server that provides powerful code analysis and search capabilities for software projects, helping AI assistants and development tools understand codebases for tasks like code generation and refactoring. (Python; Apple, Linux)
  • Enables semantic code search across codebases using Qdrant vector database and OpenAI embeddings, allowing users to find code by meaning rather than just keywords through natural language queries. (Python, MIT License)

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/divar-ir/sourcegraph-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.