## Advanced

## Non-interactive / CI mode

Run Codex headless in pipelines. Example GitHub Action step:

```yaml
- name: Update changelog via Codex
  run: |
    npm install -g @openai/codex
    export OPENAI_API_KEY="${{ secrets.OPENAI_KEY }}"
    codex exec --full-auto "update CHANGELOG for next release"
```

## Tracing / verbose logging

Because Codex is written in Rust, it honors the `RUST_LOG` environment variable to configure its logging behavior.

The TUI defaults to `RUST_LOG=codex_core=info,codex_tui=info` and log messages are written to `~/.codex/log/codex-tui.log`, so you can leave the following running in a separate terminal to monitor log messages as they are written:

```
tail -F ~/.codex/log/codex-tui.log
```

By comparison, the non-interactive mode (`codex exec`) defaults to `RUST_LOG=error`, but messages are printed inline, so there is no need to monitor a separate file.

See the Rust documentation on [`RUST_LOG`](https://docs.rs/env_logger/latest/env_logger/#enabling-logging) for more information on the configuration options.

## Model Context Protocol (MCP)

The Codex CLI can be configured to leverage MCP servers by defining an [`mcp_servers`](config.md#mcp_servers) section in `~/.codex/config.toml`. It is intended to mirror how tools such as Claude and Cursor define `mcpServers` in their respective JSON config files, though the Codex format is slightly different since it uses TOML rather than JSON, e.g.:

```toml
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.server-name]
command = "npx"
args = ["-y", "mcp-server"]
env = { "API_KEY" = "value" }
```

> [!TIP]
> It is somewhat experimental, but the Codex CLI can also be run as an MCP _server_ via `codex mcp`. If you launch it with an MCP client such as `npx @modelcontextprotocol/inspector codex mcp` and send it a `tools/list` request, you will see that there is only one tool, `codex`, that accepts a grab-bag of inputs, including a catch-all `config` map for anything you might want to override. Feel free to play around with it and provide feedback via GitHub issues.
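If you want to drive `codex mcp` programmatically rather than through the inspector, the following is a minimal sketch assuming the TypeScript MCP SDK (`@modelcontextprotocol/sdk`): it spawns `codex mcp` over stdio, lists its tools, and calls the single `codex` tool. The `prompt` argument and the `model` key passed through the catch-all `config` map are illustrative assumptions, not a documented schema; check the `tools/list` response for the authoritative inputs.

```typescript
// ES module; requires `npm install @modelcontextprotocol/sdk`.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn `codex mcp` as a child process and communicate with it over stdio.
const transport = new StdioClientTransport({ command: "codex", args: ["mcp"] });
const client = new Client({ name: "codex-mcp-demo", version: "0.1.0" });
await client.connect(transport);

// tools/list: per the tip above, the only tool exposed is `codex`.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

// tools/call: the `prompt` argument and the `config` override shown here are
// assumptions for illustration; inspect the listed tool's inputSchema to confirm.
const result = await client.callTool({
  name: "codex",
  arguments: {
    prompt: "update CHANGELOG for next release",
    config: { model: "o4-mini" },
  },
});
console.log(JSON.stringify(result, null, 2));

await client.close();
```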
