Codex MCP Server

by cexll

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Prompts

Interactive templates invoked by user choice

Name | Description
ask-codex | Execute Codex CLI with optional changeMode
batch-codex | Execute multiple atomic Codex tasks in batch mode for efficient automation
ping | Echo test message with structured response
Help | Receive help information
version | Get version information for Codex CLI and MCP server
brainstorm | Create structured brainstorming with chosen methodology and analysis
fetch-chunk | Fetch the next chunk of a response
timeout-test | Test the timeout prevention system by running a long operation

Resources

Contextual data attached and managed by the client

Name | Description

No resources

Tools

Functions exposed to the LLM to take actions

Name | Description
ask-codex

Execute Codex CLI with file analysis (@syntax), model selection, and safety controls. Supports changeMode.

batch-codex

Delegate multiple atomic tasks to Codex for batch processing. Ideal for repetitive operations, mass refactoring, and automated code transformations.

ping

Echo test message with structured response.

Help

Receive help information.

version

Display version and system information.

brainstorm

Generate creative ideas using structured frameworks with domain context and feasibility analysis.

fetch-chunk

Retrieves cached chunks from a changeMode response. Use this to get subsequent chunks after receiving a partial changeMode response.

timeout-test

Test timeout prevention by running for a specified duration.
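
Tools such as ask-codex are invoked through the standard MCP `tools/call` JSON-RPC method. A minimal sketch of such a request follows; the `prompt` argument name is an assumption for illustration — the actual argument schema is defined by the server's tool listing.

```shell
# Sketch of an MCP tools/call request for the ask-codex tool.
# The "prompt" argument name is hypothetical; consult the tool's schema.
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"ask-codex","arguments":{"prompt":"Review @src/main.ts for bugs"}}}'

# Validate that the message is well-formed JSON and inspect the target tool name.
echo "$REQUEST" | python3 -c 'import json,sys; print(json.load(sys.stdin)["params"]["name"])'
# → ask-codex
```

The same envelope works for any tool in the table above; only `params.name` and `params.arguments` change.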

MCP directory API

We provide all the information about MCP servers through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cexll/codex-mcp-server'
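
The endpoint returns JSON, so piping it through a pretty-printer makes the response easier to read. A small sketch; `python3 -m json.tool` is used here, and `jq .` works equally well if installed.

```shell
# Fetch the directory entry for this server and pretty-print the JSON response.
curl -s 'https://glama.ai/api/mcp/v1/servers/cexll/codex-mcp-server' \
  | python3 -m json.tool
```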

If you have feedback or need assistance with the MCP directory API, please join our Discord server.