MCP Server
A modern AI service proxy built with Cloudflare Workers and the Hono framework, supporting multiple AI providers, including Anthropic Claude and OpenAI.
Features
Multi-provider AI service integration (Anthropic Claude, OpenAI)
Built on Cloudflare Workers for global edge deployment
Fast and efficient request handling with the Hono framework
Type-safe implementation with TypeScript
CORS support for cross-origin requests
Health check and provider info endpoints
Prerequisites
Node.js (LTS version recommended)
npm or pnpm package manager
Cloudflare account for deployment
API keys for supported AI providers
Installation
Clone the repository
Install dependencies (example commands for both steps below):
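A minimal sketch of both steps; the repository URL and directory name are placeholders, and npm is assumed (pnpm works equivalently):

```bash
# Clone the repository (placeholder URL; substitute the actual repo)
git clone <repository-url>
cd <repository-directory>

# Install dependencies (use `pnpm install` if you prefer pnpm)
npm install
```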
Environment Setup
Copy the example environment file:
Configure your environment variables in .env with your API keys and preferences (see the example below)
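A sketch of both steps; the variable names shown are illustrative assumptions, not confirmed contents of the project's .env.example:

```bash
# Copy the example environment file
cp .env.example .env

# Then edit .env with your provider keys, e.g. (hypothetical names):
# ANTHROPIC_API_KEY=sk-ant-...
# OPENAI_API_KEY=sk-...
```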
Development
Start the development server:
The server will start in development mode with hot reloading enabled.
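A likely invocation, assuming a conventional "dev" script in package.json that wraps wrangler dev for local Workers development:

```bash
# Start the local dev server with hot reloading (assumes a "dev" script)
npm run dev
```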
Deployment
Deploy to Cloudflare Workers:
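A typical deployment flow for a Workers project, assuming Wrangler is configured for this repo (wrangler.toml) and you have a Cloudflare account:

```bash
# Authenticate once, then publish the Worker to Cloudflare's edge
npx wrangler login
npx wrangler deploy
```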
API Endpoints
Health Check
GET /health
Returns server status and configuration
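For example (the workers.dev hostname is a placeholder for your deployment):

```bash
curl https://<your-worker>.workers.dev/health
```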
Provider Info
GET /api/provider
Returns current AI provider and model configuration
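Similarly (same placeholder hostname):

```bash
curl https://<your-worker>.workers.dev/api/provider
```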
MCP API
POST /api/mcp
Main endpoint for AI service requests
Accepts a JSON payload with context, query, and options
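A hedged request sketch; the field values and the contents of "options" are assumptions based on the description above, not a confirmed schema:

```bash
curl -X POST https://<your-worker>.workers.dev/api/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "context": "You are a helpful assistant.",
    "query": "Summarize this text in one sentence.",
    "options": { "provider": "anthropic" }
  }'
```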
Project Structure
License
MIT