MCP Server
The MCP Server is an AI service proxy built on Cloudflare Workers that routes requests to multiple AI providers through a unified API.
Core Capabilities
Multi-provider AI integration: Proxy requests to Anthropic Claude and OpenAI through a single endpoint
Edge deployment: Global, low-latency responses via Cloudflare Workers
Type-safe: Implemented in TypeScript with the Hono framework
Cross-origin support: Built-in CORS for web application integration
API Endpoints
GET /health — Check server status and configuration
GET /api/provider — Retrieve current AI provider and model configuration
POST /api/mcp — Send AI service requests with context, query, and options
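As a sketch of how a client might call POST /api/mcp: the payload fields below (context, query, options) come from the endpoint description above, but the option names (provider, model) are assumptions for illustration, not a documented schema.

```typescript
// Hypothetical request payload for POST /api/mcp, based on the
// "context, query, and options" fields described above.
interface McpRequest {
  context: string;
  query: string;
  options?: {
    provider?: "anthropic" | "openai"; // assumed option names
    model?: string;
  };
}

// Build the JSON body for a request to the proxy.
function buildMcpRequest(
  query: string,
  context = "",
  options?: McpRequest["options"]
): string {
  const payload: McpRequest = { context, query, ...(options ? { options } : {}) };
  return JSON.stringify(payload);
}

const body = buildMcpRequest(
  "Summarize this document",
  "AI ethics notes",
  { provider: "anthropic" }
);
console.log(body);
```

The same body would then be sent with fetch to the /api/mcp endpoint with a Content-Type of application/json.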
Available Tools (accessible via /api/mcp)
api-client — Handle interactions with external APIs
data-processor — Process and transform data through the AI pipeline
file-handler — Manage file-related operations
my-tool — Flexible custom message processing
example_tool — General-purpose message processing
Provides global edge deployment for the MCP server, offering low-latency proxying of AI requests through Cloudflare's distributed network.
Enables routing requests to OpenAI's models through the MCP server, providing access to OpenAI's AI capabilities via a unified proxy interface.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@MCP Server ask Claude to summarize this document about AI ethics".
That's it! The server will respond to your query, and you can continue using it as needed.
MCP Server
A modern AI service proxy built with Cloudflare Workers and Hono framework, supporting multiple AI providers including Anthropic Claude and OpenAI.
Features
Multi-provider AI service integration (Anthropic Claude, OpenAI)
Built on Cloudflare Workers for global edge deployment
Fast and efficient request handling with Hono framework
Type-safe implementation with TypeScript
CORS support for cross-origin requests
Health check and provider info endpoints
Related MCP server: Remote MCP Server
Prerequisites
Node.js (LTS version recommended)
npm or pnpm package manager
Cloudflare account for deployment
API keys for supported AI providers
Installation
Clone the repository
Install dependencies:
pnpm install
Environment Setup
Copy the example environment file:
cp .env.example .env
Configure your environment variables in .env with your API keys and preferences.
Development
Start the development server:
pnpm run dev
The server will start in development mode with hot reloading enabled.
Deployment
Deploy to Cloudflare Workers:
pnpm run deploy
API Endpoints
Health Check
GET /health
Returns server status and configuration.
Provider Info
GET /api/provider
Returns current AI provider and model configuration.
MCP API
POST /api/mcp
Main endpoint for AI service requests.
Accepts a JSON payload with context, query, and options.
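Before using the server's reply, a client can validate the JSON it receives. The response shape below (a result string plus the provider that handled it) is an assumption for illustration, not the server's documented schema:

```typescript
// Assumed response shape for POST /api/mcp; the real schema may differ.
interface McpResponse {
  result: string;
  provider: string;
}

// Narrowing type guard so unknown JSON can be used safely.
function isMcpResponse(value: unknown): value is McpResponse {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.result === "string" && typeof v.provider === "string";
}

// Example: parse and check a raw response body.
const raw = '{"result":"Hello from Claude","provider":"anthropic"}';
const data: unknown = JSON.parse(raw);
if (isMcpResponse(data)) {
  console.log(`${data.provider}: ${data.result}`);
}
```

Guarding like this keeps TypeScript's type checking meaningful at the boundary where untyped network data enters the client.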
Project Structure
├── src/
│   ├── controllers/     # Request handlers
│   ├── models/          # Type definitions
│   ├── services/        # AI service implementations
│   └── index.ts         # Main application entry
├── public/              # Static assets
└── wrangler.jsonc       # Cloudflare Workers configuration
License
MIT
MCP directory API
We provide all the information about MCP servers via our MCP API.
curl -X GET 'https://glama.ai/api/mcp/v1/servers/quang-pham-dev/my-mcp-server'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.