# MCP Coding Assistant with support for OpenAI + other LLM Providers
A powerful Python recreation of Claude Code with enhanced real-time visualization, cost management, and Model Context Protocol (MCP) server capabilities. This tool provides a natural language interface for software development tasks with support for multiple LLM providers.
## Key Features

- Multi-Provider Support: Works with OpenAI, Anthropic, and other LLM providers
- Model Context Protocol Integration:
  - Run as an MCP server for use with Claude Desktop and other clients
  - Connect to any MCP server with the built-in MCP client
  - Multi-agent synchronization for complex problem solving
- Real-Time Tool Visualization: See tool execution progress and results in real time
- Cost Management: Track token usage and expenses with budget controls
- Comprehensive Tool Suite: File operations, search, command execution, and more
- Enhanced UI: Rich terminal interface with progress indicators and syntax highlighting
- Context Optimization: Smart conversation compaction and memory management
- Agent Coordination: Specialized agents with different roles can collaborate on tasks
## Installation

- Clone this repository
- Install dependencies
- Create a `.env` file with your API keys (see the sketch below)
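The original install commands were not preserved; here is a minimal sketch, assuming a standard Python layout with a `requirements.txt` (the repository URL is a placeholder):

```bash
git clone https://github.com/<your-account>/claude-code-python.git
cd claude-code-python
pip install -r requirements.txt

# Create the .env file; add keys only for the providers you use
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
EOF
```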
## Usage

### CLI Mode

Run the CLI with the default provider (determined from available API keys), specify a provider and model explicitly, or set a budget limit to manage costs:
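The original commands are missing; this is a plausible sketch, assuming a `claude.py` entry point and illustrative flag names:

```bash
# Default provider, inferred from whichever API keys are set
python claude.py

# Explicit provider and model
python claude.py --provider openai --model gpt-4o

# Budget limit to manage costs
python claude.py --budget 5.00
```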
### MCP Server Mode

Run as a Model Context Protocol server; you can start in development mode with the MCP Inspector, configure the host and port, specify additional dependencies, or load environment variables from a file:
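A plausible sketch, assuming a `serve` subcommand; the flag names mirror the options documented for the OpenAI Code Assistant server below:

```bash
python claude.py serve
python claude.py serve --dev                       # development mode with the MCP Inspector
python claude.py serve --host 0.0.0.0 --port 8000
python claude.py serve --dependencies pandas numpy
python claude.py serve --env-file .env
```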
### MCP Client Mode

Connect to an MCP server using Claude as the reasoning engine, optionally specifying a Claude model; the included example server is a good first target:
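A plausible sketch (the `mcp-client` subcommand, server path, and model name are assumptions):

```bash
python claude.py mcp-client path/to/server.py
python claude.py mcp-client path/to/server.py --model claude-3-5-sonnet-latest
python claude.py mcp-client examples/echo_server.py
```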
### Multi-Agent MCP Mode

Launch a multi-agent client with synchronized agents, optionally using a custom agent configuration file; for example, against the echo server:
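A plausible sketch (the subcommand, paths, and config flag are assumptions):

```bash
python claude.py mcp-multi-agent examples/echo_server.py
python claude.py mcp-multi-agent examples/echo_server.py --config agents.yaml
```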
## Available Tools
- View: Read files with optional line limits
- Edit: Modify files with precise text replacement
- Replace: Create or overwrite files
- GlobTool: Find files by pattern matching
- GrepTool: Search file contents using regex
- LS: List directory contents
- Bash: Execute shell commands
## Chat Commands
- /help: Show available commands
- /compact: Compress conversation history to save tokens
- /version: Show version information
- /providers: List available LLM providers
- /cost: Show cost and usage information
- /budget [amount]: Set a budget limit
- /quit, /exit: Exit the application
## Architecture

Claude Code Python Edition is built with a modular architecture.
## Using with Model Context Protocol

### Using Claude Code as an MCP Server

Once the MCP server is running, you can connect to it from Claude Desktop or other MCP-compatible clients:

- Install and run the MCP server (see the sketch after this list)
- Open the configuration page in your browser
- Follow the instructions to configure Claude Desktop, including:
  - Copy the JSON configuration
  - Download the auto-configured JSON file
  - Step-by-step setup instructions
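A plausible sequence; the entry point and the configuration URL are assumptions:

```bash
python claude.py serve --host 127.0.0.1 --port 8000
# Then open the configuration page in a browser, e.g. http://127.0.0.1:8000/config,
# and follow the Claude Desktop instructions shown there.
```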
### Using Claude Code as an MCP Client

To connect to any MCP server using Claude Code:

- Ensure your Anthropic API key is in the environment or in the `.env` file
- Start the MCP server you want to connect to
- Connect using the MCP client (see the sketch after this list)
- Type queries in the interactive chat interface
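A plausible sketch (the subcommand and server path are assumptions):

```bash
export ANTHROPIC_API_KEY=sk-ant-...
python claude.py mcp-client path/to/server.py
```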
### Using Multi-Agent Mode

For complex tasks, the multi-agent mode allows multiple specialized agents to collaborate:

- Create an agent configuration file or use the provided example
- Start your MCP server
- Launch the multi-agent client (see the sketch after this list)
- Use the command interface to interact with multiple agents:
  - Type a message to broadcast to all agents
  - Use `/talk Agent_Name message` for direct communication
  - Use `/agents` to see all available agents
  - Use `/history` to view the conversation history
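A plausible sketch (the subcommand, server path, and config file name are assumptions):

```bash
python claude.py mcp-multi-agent path/to/server.py --config examples/agents.yaml
```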
## Contributing
- Fork the repository
- Create a feature branch
- Implement your changes with tests
- Submit a pull request
## License
MIT
## Acknowledgments
This project is inspired by Anthropic's Claude Code CLI tool, reimplemented in Python with additional features for enhanced visibility, cost management, and MCP server capabilities.

# OpenAI Code Assistant
A powerful command-line and API-based coding assistant that uses OpenAI APIs with function calling and streaming.
## Features
- Interactive CLI for coding assistance
- Web API for integration with other applications
- Model Context Protocol (MCP) server implementation
- Replication support for high availability
- Tool-based architecture for extensibility
- Reinforcement learning for tool optimization
- Web client for browser-based interaction
## Installation

- Clone the repository
- Install dependencies
- Set your OpenAI API key (see the sketch below)
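The original commands are missing; a minimal sketch (repository URL and dependency file name are placeholders):

```bash
git clone https://github.com/<your-account>/openai-code-assistant.git
cd openai-code-assistant
pip install -r requirements.txt
export OPENAI_API_KEY=sk-...
```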
## Usage

### CLI Mode

Run the assistant in interactive CLI mode:
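A plausible invocation, assuming an `assistant.py` entry point (the flags are the ones documented below):

```bash
python assistant.py --model gpt-4o --temperature 0 --verbose
```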
Options:

- `--model`, `-m`: Specify the model to use (default: gpt-4o)
- `--temperature`, `-t`: Set temperature for response generation (default: 0)
- `--verbose`, `-v`: Enable verbose output with additional information
- `--enable-rl/--disable-rl`: Enable/disable reinforcement learning for tool optimization
- `--rl-update`: Manually trigger an update of the RL model
### API Server Mode

Run the assistant as an API server:
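A plausible invocation under the same `assistant.py` assumption, with a hypothetical `serve` subcommand (the flags are documented below):

```bash
python assistant.py serve --host 127.0.0.1 --port 8000 --workers 1
```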
Options:

- `--host`: Host address to bind to (default: 127.0.0.1)
- `--port`, `-p`: Port to listen on (default: 8000)
- `--workers`, `-w`: Number of worker processes (default: 1)
- `--enable-replication`: Enable replication across instances
- `--primary/--secondary`: Whether this is a primary or secondary instance
- `--peer`: Peer instances to replicate with (host:port); can be specified multiple times
### MCP Server Mode

Run the assistant as a Model Context Protocol (MCP) server:
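A plausible invocation (the `mcp-serve` subcommand is an assumption; the flags are documented below):

```bash
python assistant.py mcp-serve --host 127.0.0.1 --port 8000 --dev
```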
Options:

- `--host`: Host address to bind to (default: 127.0.0.1)
- `--port`, `-p`: Port to listen on (default: 8000)
- `--dev`: Enable development mode with additional logging
- `--dependencies`: Additional Python dependencies to install
- `--env-file`: Path to .env file with environment variables
### MCP Client Mode

Connect to an MCP server using the assistant as the reasoning engine:
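A plausible invocation (the `mcp-client` subcommand is an assumption; the flags are documented below):

```bash
python assistant.py mcp-client --host 127.0.0.1 --port 8000 --model gpt-4o
```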
Options:

- `--model`, `-m`: Model to use for reasoning (default: gpt-4o)
- `--host`: Host address for the MCP server (default: 127.0.0.1)
- `--port`, `-p`: Port for the MCP server (default: 8000)
### Deployment Script

For easier deployment, use the provided script; a flag enables replication (see the sketch below):
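A plausible sketch; the script name and its flag are assumptions, since the original example is missing:

```bash
./deploy.sh
./deploy.sh --enable-replication
```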
### Web Client

To use the web client, open `web-client.html` in your browser. Make sure the API server is running.
## API Endpoints

### Standard API Endpoints

- `POST /conversation`: Create a new conversation
- `POST /conversation/{conversation_id}/message`: Send a message to a conversation
- `POST /conversation/{conversation_id}/message/stream`: Stream a message response
- `GET /conversation/{conversation_id}`: Get conversation details
- `DELETE /conversation/{conversation_id}`: Delete a conversation
- `GET /health`: Health check endpoint
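A minimal interaction sketch with `curl`, assuming the server is on 127.0.0.1:8000; the JSON field names are assumptions:

```bash
# Create a conversation (returns an id)
curl -X POST http://127.0.0.1:8000/conversation

# Send a message to it
curl -X POST http://127.0.0.1:8000/conversation/<conversation_id>/message \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello"}'
```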
### MCP Protocol Endpoints

- `GET /`: Health check (MCP protocol)
- `POST /context`: Get context for a prompt template
- `GET /prompts`: List available prompt templates
- `GET /prompts/{prompt_id}`: Get a specific prompt template
- `POST /prompts`: Create a new prompt template
- `PUT /prompts/{prompt_id}`: Update an existing prompt template
- `DELETE /prompts/{prompt_id}`: Delete a prompt template
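A sketch of the prompt-template endpoints under the same host assumption; the request body for `/context` is an assumption:

```bash
# List prompt templates
curl http://127.0.0.1:8000/prompts

# Get context for a template
curl -X POST http://127.0.0.1:8000/context \
  -H "Content-Type: application/json" \
  -d '{"prompt_id": "<prompt_id>", "variables": {}}'
```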
## Replication

The replication system allows running multiple instances of the assistant with synchronized state. This provides:

- High availability
- Load balancing
- Fault tolerance

To set up replication:

- Start a primary instance with `--enable-replication`
- Start secondary instances with `--enable-replication --secondary --peer [primary-host:port]`
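A sketch under the same `assistant.py` entry-point assumption as above; the replication flags themselves are the documented ones:

```bash
# Primary instance
python assistant.py serve --port 8000 --enable-replication --primary

# Secondary instance, replicating from the primary
python assistant.py serve --port 8001 --enable-replication --secondary --peer 127.0.0.1:8000
```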
## Tools
The assistant includes various tools:
- Weather: Get current weather for a location
- View: Read files from the filesystem
- Edit: Edit files
- Replace: Write files
- Bash: Execute bash commands
- GlobTool: File pattern matching
- GrepTool: Content search
- LS: List directory contents
- JinaSearch: Web search using Jina.ai
- JinaFactCheck: Fact checking using Jina.ai
- JinaReadURL: Read and summarize webpages
## CLI Commands

- `/help`: Show help message
- `/compact`: Compact the conversation to reduce token usage
- `/status`: Show token usage and session information
- `/config`: Show current configuration settings
- `/rl-status`: Show RL tool optimizer status (if enabled)
- `/rl-update`: Update the RL model manually (if enabled)
- `/rl-stats`: Show tool usage statistics (if enabled)