Codex LSP Bridge (MCP)

Give Codex CLI IDE-grade semantic navigation by exposing Language Server Protocol (LSP) features as MCP tools.

What you get

  • Go to definition / find references / hover type info / symbols

  • Works across Python, Rust, C/C++, TypeScript/JavaScript/Node, React (JSX/TSX), HTML, CSS/SCSS/Less

  • One endpoint: http://127.0.0.1:8000/mcp (Streamable HTTP) or stdio

Quickstart

1) Install this bridge

# from the repo root
pip install -e .

2) Install language servers

On Debian/Ubuntu you can run ./install.sh to install all supported language servers.

Python (pick one)

pip install basedpyright    # provides basedpyright-langserver
# OR
npm i -g pyright            # provides pyright-langserver

Rust

rustup component add rust-analyzer

C/C++

# install clangd via your platform (llvm package)
clangd --version
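
On Debian/Ubuntu, for example, clangd ships as its own package:

sudo apt-get install -y clangd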

TypeScript/JavaScript/Node/React

npm i -g typescript typescript-language-server

HTML/CSS

npm i -g vscode-langservers-extracted
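
After installing, a quick check that every server binary is on PATH (the names match what install.sh provides; swap in pyright-langserver if you chose pyright):

for bin in basedpyright-langserver rust-analyzer clangd \
  typescript-language-server vscode-html-language-server vscode-css-language-server; do
  command -v "$bin" >/dev/null || echo "missing: $bin"
done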

3) Run the server

codex-lsp-bridge serve --config ./config/default.toml --host 127.0.0.1 --port 8000
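
As a quick smoke test, you can post a JSON-RPC initialize request to the endpoint. This assumes the bridge implements the standard MCP Streamable HTTP handshake; depending on the implementation, the reply may come back as a single SSE event rather than plain JSON.

curl -s http://127.0.0.1:8000/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'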

4) Connect Codex CLI

Edit ~/.codex/config.toml (HTTP):

experimental_use_rmcp_client = true

[mcp_servers.lsp_bridge]
url = "http://127.0.0.1:8000/mcp"

Or use the Codex CLI helper:

codex mcp add lsp_bridge --url http://127.0.0.1:8000/mcp
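
If your Codex CLI build also provides the list subcommand, you can verify the entry was registered:

codex mcp list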

Using it effectively

Prompt pattern:

When navigating or refactoring code, always use LSP tools (go_to_definition, find_references, hover, document_symbols) instead of text search.

Example:

codex "Find all references to UserRepository and rename it to AccountRepository. Use LSP rename_symbol first."

Note: line/column positions are 0-indexed.
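
Concretely, asking go_to_definition about the symbol that starts at the 42nd line, 8th character of a file means passing line 41 and column 7. The argument names line and column below are illustrative assumptions and the path is hypothetical; file_path itself is the parameter the bridge documents.

{"file_path": "/home/me/projects/api/src/user.py", "line": 41, "column": 7}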

Install language servers (Debian/Ubuntu)

If you are on Debian/Ubuntu, you can use the provided script:

./install.sh

This installs:

  • basedpyright (Python)

  • rust-analyzer (Rust)

  • clangd (C/C++)

  • typescript-language-server + typescript (TS/JS/React)

  • vscode-html-language-server + vscode-css-language-server (HTML/CSS)

Note: some commands may land in ~/.local/bin or ~/.cargo/bin. Ensure those are on PATH for the user that runs the bridge.
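
For example, in the shell profile of that user:

export PATH="$HOME/.local/bin:$HOME/.cargo/bin:$PATH"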

Config

See config/default.toml. It maps file extensions to language server commands.

You can point the bridge to a config file via:

export CODEX_LSP_BRIDGE_CONFIG=/path/to/config.toml
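
As a rough sketch of the shape such a mapping can take (the table and key names here are assumptions for illustration; the shipped config/default.toml is the authoritative schema):

[languages.python]
extensions = [".py"]
command = ["basedpyright-langserver", "--stdio"]

[languages.rust]
extensions = [".rs"]
command = ["rust-analyzer"]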

MCP setup (HTTP vs stdio)

HTTP (bridge runs separately):

experimental_use_rmcp_client = true

[mcp_servers.lsp_bridge]
url = "http://127.0.0.1:8000/mcp"

Stdio (Codex spawns the bridge, no HTTP):

[mcp_servers.lsp_bridge]
command = "/absolute/path/to/codex-lsp-bridge"
args = ["serve", "--transport", "stdio", "--config", "/absolute/path/to/config/default.toml"]

Multi-project behavior

  • Each tool call includes a file_path.

  • The bridge detects the workspace root by walking upward from that file and looking for markers like .git, package.json, Cargo.toml, etc. (see the sketch after this list).

  • It starts one language server per (language, workspace_root) pair so multiple repos do not contaminate each other.

  • Absolute paths are recommended; relative paths resolve against the bridge's default_root.
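
A minimal sketch of that lookup in Python, assuming only the markers named above and guessing that default_root is the fallback when no marker is found (this is not the bridge's actual code):

from pathlib import Path

MARKERS = {".git", "package.json", "Cargo.toml"}

def find_workspace_root(file_path: str, default_root: str) -> Path:
    # Walk upward from the file's directory, stopping at the first marker.
    for parent in Path(file_path).resolve().parents:
        if any((parent / marker).exists() for marker in MARKERS):
            return parent
    # No marker found anywhere above the file: fall back to default_root (assumption).
    return Path(default_root)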

System prompt snippet

See SYSTEM_PROMPT_SNIPPET.md for a short snippet you can append to your AI system prompt so it always prefers LSP tools.

Security notes

  • Local is safest: the bridge reads your repo files and launches local binaries.

  • If you deploy it remotely, do so only where the server can access the same repository, and protect the endpoint with authentication.
