Enables the server to clone repositories and manage specific branches to facilitate automated code review and analysis.
Supports interaction with GitHub-hosted repositories for processing source code and executing prompt-based development workflows.
Wraps OpenAI's Codex CLI to perform code analysis, suggest improvements, and execute developer-focused tasks based on repository context.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Agentic Developer MCP Analyze the src folder in https://github.com/psf/requests for performance improvements".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Agentic Developer MCP
This project wraps OpenAI's Codex CLI as an MCP (Model Context Protocol) server, making it accessible through the TeaBranch/open-responses-server middleware.
This engine may be replaced with OpenCode or Amazon Strands
Requirements
Node 22 (`nvm install 22.15.1`, then `nvm use 22.15.1`) is required for Codex.
Overview
The setup consists of three main components:
Codex CLI: OpenAI's command-line interface for interacting with Codex.
MCP Wrapper Server: A Node.js Express server that forwards MCP requests to the Codex CLI and returns the results as MCP-formatted responses.
open-responses-server: A middleware service that provides Responses API compatibility and MCP support.
Installation
Using Docker (Recommended)
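Assuming the repository provides a docker-compose.yml (an assumption here; adjust to the project's actual compose file), the stack can be brought up with:

```bash
# Build and start both services in the background
docker compose up -d --build
```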
This will start:
Codex MCP wrapper on port 8080
open-responses-server on port 3000
Manual Installation
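A rough sketch of the manual steps, assuming standard npm tooling (the wrapper's start script name is an assumption; check package.json):

```bash
# Install the Codex CLI globally (requires Node 22)
npm install -g @openai/codex

# Install the wrapper's dependencies and start the MCP wrapper server
npm install
npm start

# open-responses-server is installed and started separately; see its repository
# (TeaBranch/open-responses-server) for instructions
```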
Usage
You can run the MCP server using either stdio or SSE transport:
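For example, assuming the mcp_server/server.py entry point mentioned under Development (the transport flags shown are assumptions; consult the server's help output for the actual options):

```bash
# stdio transport: the MCP client spawns the server and talks over stdin/stdout
python -m mcp_server.server

# SSE transport: the server listens on an HTTP port and streams events to clients
python -m mcp_server.server --transport sse --port 8080
```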
Tool Documentation
run_codex
Clones a repository, checks out a specific branch (optional), navigates to a specific folder (optional), and runs Codex with the given request.
Parameters
repository (required): Git repository URL
branch (optional): Git branch to check out
folder (optional): Folder within the repository to focus on
request (required): Codex request/prompt to run
Example
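A hypothetical tools/call payload (the repository URL, branch, and request text are illustrative only):

```json
{
  "name": "run_codex",
  "arguments": {
    "repository": "https://github.com/psf/requests",
    "branch": "main",
    "folder": "src",
    "request": "Review this folder and suggest performance improvements"
  }
}
```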
clone_and_write_prompt
Clones a repository, reads the system prompt from .agent/system.md, parses modelId from .agent/agent.json, writes the request to a .prompt file, and invokes the Codex CLI with the extracted model.
Parameters
repository (required): Git repository URL
request (required): Prompt text to run through Codex
folder (optional, default /): Subfolder within the repository to operate in
Example
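A hypothetical tools/call payload (the values are illustrative only):

```json
{
  "name": "clone_and_write_prompt",
  "arguments": {
    "repository": "https://github.com/psf/requests",
    "folder": "/",
    "request": "Summarize the module layout and flag any obvious refactoring targets"
  }
}
```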
MCPS Configuration
Place an mcps.json file under the .agent/ directory to register available MCP tools. Codex will load this configuration automatically.
Example .agent/mcps.json:
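A plausible shape, following the common MCP servers-by-name layout (the exact schema this project expects is an assumption, and the filesystem server below is only an illustration):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
    }
  }
}
```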
Development
This project uses the MCP Python SDK to implement an MCP server. The primary implementation is in mcp_server/server.py.
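As a rough sketch of the approach (not the project's actual server.py), a tool such as run_codex could be registered with the MCP Python SDK roughly like this; the Codex CLI invocation used here is an assumption:

```python
# Illustrative sketch only; see mcp_server/server.py for the real implementation.
import subprocess
import tempfile

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("agentic-developer")


@mcp.tool()
def run_codex(repository: str, request: str, branch: str = "", folder: str = "") -> str:
    """Clone a repository, optionally check out a branch, and run a Codex prompt."""
    workdir = tempfile.mkdtemp(prefix="codex-")
    clone_cmd = ["git", "clone"]
    if branch:
        clone_cmd += ["--branch", branch]
    subprocess.run(clone_cmd + [repository, workdir], check=True)

    target = f"{workdir}/{folder}" if folder else workdir
    # Run the Codex CLI non-interactively in the target folder
    # ("codex exec" is an assumption about the CLI invocation).
    result = subprocess.run(
        ["codex", "exec", request], cwd=target, capture_output=True, text=True
    )
    return result.stdout


if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```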
License
MIT