
Codex MCP Server

by cexll

ask-codex

Execute code analysis and editing tasks using file references with @ syntax, model selection, and safety controls. Supports automated refactoring with structured change tracking.

Instructions

Execute Codex CLI with file analysis (@syntax), model selection, and safety controls. Supports changeMode.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| prompt | Yes | Task or question. Use @ to include files (e.g., '@largefile.ts explain'). | — |
| model | No | Model: gpt-5-codex, gpt-5, o3, o4-mini, codex-1, codex-mini-latest, gpt-4.1. | gpt-5-codex |
| sandbox | No | Quick automation mode: enables workspace-write + on-failure approval. Alias for fullAuto. | false |
| fullAuto | No | Full automation mode. | — |
| approvalPolicy | No | Approval policy: never, on-request, on-failure, untrusted. | — |
| approval | No | Approval policy: untrusted, on-failure, on-request, never. | — |
| sandboxMode | No | Access level: read-only, workspace-write, danger-full-access. | — |
| yolo | No | ⚠️ Bypass all safety controls (dangerous). | — |
| cd | No | Working directory. | — |
| workingDir | No | Working directory for execution. | — |
| changeMode | No | Return structured OLD/NEW edits for refactoring. | false |
| chunkIndex | No | Chunk index (1-based). | — |
| chunkCacheKey | No | Cache key for continuation. | — |
| image | No | Optional image file path(s) to include with the prompt. | — |
| config | No | Configuration overrides as a 'key=value' string or an object. | — |
| profile | No | Configuration profile to use from ~/.codex/config.toml. | — |
| timeout | No | Maximum execution time in milliseconds. | — |
| includeThinking | No | Include the reasoning/thinking section in the response. | true |
| includeMetadata | No | Include configuration metadata in the response. | true |
| search | No | Enable web search by activating the web_search_request feature flag. Requires network access; automatically sets sandbox to workspace-write if not specified. | — |
| oss | No | Use a local Ollama server (shorthand for -c model_provider=oss). Requires Ollama running locally; automatically sets sandbox to workspace-write if not specified. | — |
| enableFeatures | No | Enable feature flags (repeatable). Equivalent to -c features.<name>=true. | — |
| disableFeatures | No | Disable feature flags (repeatable). Equivalent to -c features.<name>=false. | — |
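As an illustration, a minimal tool-call payload for ask-codex might look like the following (the file path and task text are hypothetical; only `prompt` is required):

```json
{
  "name": "ask-codex",
  "arguments": {
    "prompt": "@src/parser.ts explain the error handling and suggest a refactor",
    "model": "gpt-5-codex",
    "sandboxMode": "read-only",
    "changeMode": true
  }
}
```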

Input Schema (JSON Schema)

{
  "properties": {
    "approval": { "description": "Approval policy: untrusted, on-failure, on-request, never", "type": "string" },
    "approvalPolicy": { "description": "Approval: never, on-request, on-failure, untrusted", "enum": ["never", "on-request", "on-failure", "untrusted"], "type": "string" },
    "cd": { "description": "Working directory", "type": "string" },
    "changeMode": { "default": false, "description": "Return structured OLD/NEW edits for refactoring", "type": "boolean" },
    "chunkCacheKey": { "description": "Cache key for continuation", "type": "string" },
    "chunkIndex": { "description": "Chunk index (1-based)", "minimum": 1, "type": "number" },
    "config": { "anyOf": [{ "type": "string" }, { "additionalProperties": {}, "type": "object" }], "description": "Configuration overrides as 'key=value' string or object" },
    "disableFeatures": { "description": "Disable feature flags (repeatable). Equivalent to -c features.<name>=false", "items": { "type": "string" }, "type": "array" },
    "enableFeatures": { "description": "Enable feature flags (repeatable). Equivalent to -c features.<name>=true", "items": { "type": "string" }, "type": "array" },
    "fullAuto": { "description": "Full automation mode", "type": "boolean" },
    "image": { "anyOf": [{ "type": "string" }, { "items": { "type": "string" }, "type": "array" }], "description": "Optional image file path(s) to include with the prompt" },
    "includeMetadata": { "default": true, "description": "Include configuration metadata in response", "type": "boolean" },
    "includeThinking": { "default": true, "description": "Include reasoning/thinking section in response", "type": "boolean" },
    "model": { "description": "Model: gpt-5-codex, gpt-5, o3, o4-mini, codex-1, codex-mini-latest, gpt-4.1. Default: gpt-5-codex", "type": "string" },
    "oss": { "description": "Use local Ollama server (convenience for -c model_provider=oss). Requires Ollama running locally. Automatically sets sandbox to workspace-write if not specified.", "type": "boolean" },
    "profile": { "description": "Configuration profile to use from ~/.codex/config.toml", "type": "string" },
    "prompt": { "description": "Task or question. Use @ to include files (e.g., '@largefile.ts explain').", "minLength": 1, "type": "string" },
    "sandbox": { "default": false, "description": "Quick automation mode: enables workspace-write + on-failure approval. Alias for fullAuto.", "type": "boolean" },
    "sandboxMode": { "description": "Access: read-only, workspace-write, danger-full-access", "enum": ["read-only", "workspace-write", "danger-full-access"], "type": "string" },
    "search": { "description": "Enable web search by activating web_search_request feature flag. Requires network access - automatically sets sandbox to workspace-write if not specified.", "type": "boolean" },
    "timeout": { "description": "Maximum execution time in milliseconds (optional)", "type": "number" },
    "workingDir": { "description": "Working directory for execution", "type": "string" },
    "yolo": { "description": "⚠️ Bypass all safety (dangerous)", "type": "boolean" }
  },
  "required": ["prompt"],
  "type": "object"
}
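Because the schema above is standard JSON Schema, a client can pre-validate arguments before sending the tool call. A minimal sketch with hand-rolled checks for the required `prompt` field and the `sandboxMode`/`approvalPolicy` enums (illustrative only, not a full JSON Schema validator):

```python
# Pre-validate ask-codex arguments against a few constraints from the
# published schema. Illustrative sketch, not a full JSON Schema validator.

SANDBOX_MODES = {"read-only", "workspace-write", "danger-full-access"}
APPROVAL_POLICIES = {"never", "on-request", "on-failure", "untrusted"}

def validate_args(args: dict) -> list[str]:
    """Return a list of validation errors; an empty list means valid."""
    errors = []
    prompt = args.get("prompt")
    if not isinstance(prompt, str) or len(prompt) < 1:
        errors.append("prompt is required and must be a non-empty string")
    mode = args.get("sandboxMode")
    if mode is not None and mode not in SANDBOX_MODES:
        errors.append(f"sandboxMode must be one of {sorted(SANDBOX_MODES)}")
    policy = args.get("approvalPolicy")
    if policy is not None and policy not in APPROVAL_POLICIES:
        errors.append(f"approvalPolicy must be one of {sorted(APPROVAL_POLICIES)}")
    chunk = args.get("chunkIndex")
    if chunk is not None and (not isinstance(chunk, int) or chunk < 1):
        errors.append("chunkIndex must be an integer >= 1")
    return errors

# A well-formed call passes:
validate_args({"prompt": "@main.py explain", "sandboxMode": "read-only"})  # → []
```

Catching a typo in an enum value client-side this way gives a clearer error than a rejected tool call.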

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cexll/codex-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.