OpenAI SDK Knowledge MCP Server (unofficial)
An MCP server that knows the OpenAI API inside and out. 100% TypeScript, built with the OpenAI Agents SDK, Hono, Cloudflare Workers, and Drizzle ORM. Powered by RAG and ready to answer your technical questions.
Developer Highlights
Cloudflare stack: Fully leverages Cloudflare Workers, Queues, D1, Vectorize, and AI Gateway.
Streamable HTTP MCP Server: Compatible with any MCP clients.
ChatGPT Deep Research connector: Meets ChatGPT's Deep Research connector requirements.
Always updated: Continuously fetches OpenAI repos and community forums for new content.
Rapidly built with AI: Developed hand in hand with various AI coding tools.
Related MCP server: Deep Thinking Assistant
Streamable HTTP MCP Server
Serve the endpoints at a publicly accessible URL (e.g., via ngrok or Cloudflare Tunnel) so MCP clients can connect. You can generate an access token on the top page:
For example, you can add this MCP server to Cursor:
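A remote-server entry in Cursor's `.cursor/mcp.json` could look like the following sketch (the token value is a placeholder, and the exact schema may vary between Cursor versions):

```json
{
  "mcpServers": {
    "openai-sdk-knowledge": {
      "url": "https://openai-sdk-knowledge.org/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_TOKEN>"
      }
    }
  }
}
```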
Beyond Cursor, you can use this MCP server with any other tool that supports MCP server connections.
OpenAI Responses API's Hosted MCP Server Tool
You can pass https://openai-sdk-knowledge.org/mcp along with a valid API token:
Then, you can call the tool in the conversation with the Responses API agent:
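As a sketch, the request to the Responses API could be shaped as below. The model name, server label, and input text are illustrative assumptions; the `type: "mcp"` tool with `server_url` and `require_approval` follows the hosted MCP tool shape in OpenAI's documentation.

```typescript
// Hosted MCP tool pointing at this server; <MCP_TOKEN> is a placeholder for
// a token generated on the top page.
const mcpTool = {
  type: "mcp",
  server_label: "openai-sdk-knowledge",
  server_url: "https://openai-sdk-knowledge.org/mcp",
  headers: { Authorization: "Bearer <MCP_TOKEN>" },
  require_approval: "never",
};

// Request payload for the Responses API (model and input are examples).
const request = {
  model: "gpt-4.1",
  tools: [mcpTool],
  input: "How do I stream responses with the OpenAI Responses API?",
};

// With the openai npm package installed, the actual call would be:
//   import OpenAI from "openai";
//   const client = new OpenAI();
//   const response = await client.responses.create(request);
//   console.log(response.output_text);
console.log(JSON.stringify(request, null, 2));
```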
ChatGPT Deep Research MCP Connector
For the ChatGPT Deep Research custom connector, use the same URL. When the ChatGPT server accesses this app's MCP endpoint, the server exposes search and fetch tools as well (see the documentation for details).
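To illustrate the contract those tools satisfy, here is a minimal in-memory sketch of the search/fetch shape a Deep Research connector exposes. The result fields follow OpenAI's connector requirements; the function names, sample document, and matching logic are illustrative, not this server's actual implementation.

```typescript
// Result shapes for the Deep Research search/fetch contract (illustrative).
interface SearchResult {
  id: string;
  title: string;
  url: string;
}

interface FetchResult {
  id: string;
  title: string;
  text: string;
  url: string;
}

// A tiny in-memory stand-in for the server's RAG index.
const docs: FetchResult[] = [
  {
    id: "1",
    title: "Responses API streaming",
    text: "Use stream: true to receive incremental events ...",
    url: "https://example.com/doc/1",
  },
];

// search: return lightweight hits the client can later fetch by id.
function search(query: string): SearchResult[] {
  return docs
    .filter((d) => d.title.toLowerCase().includes(query.toLowerCase()))
    .map(({ id, title, url }) => ({ id, title, url }));
}

// fetchDoc: resolve a hit id to the full document text.
function fetchDoc(id: string): FetchResult | undefined {
  return docs.find((d) => d.id === id);
}

console.log(search("streaming"));
```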
Run Locally
Once the dev server is running, you can open http://localhost:8787 and see how it works.
Requirements: Node.js 22+ and API keys (OpenAI, GitHub)
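A typical Wrangler-based local setup looks like the following; the script names and variable names are assumptions, so check the repository for the exact commands.

```shell
# Assumed setup; exact commands and variable names may differ in the repo.
npm install

# Provide API keys as Wrangler local dev vars (names are assumptions):
echo 'OPENAI_API_KEY="<your OpenAI key>"' >> .dev.vars
echo 'GITHUB_TOKEN="<your GitHub token>"' >> .dev.vars

# Start the local dev server (Wrangler serves on http://localhost:8787 by default)
npx wrangler dev
```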
Architecture
This app is essentially a simple web app running on Cloudflare Workers. It provides MCP-protocol-compatible endpoints as well as a web user interface. For the RAG data pipeline, it collects data from its sources, generates asynchronous tasks, and enqueues them into Cloudflare Queues for background processing.
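As a sketch of that pipeline step, assuming a hypothetical task shape (the real worker's types and queue binding are not shown here), the collect-and-enqueue pattern looks roughly like this:

```typescript
// Hypothetical task shape for one collected item.
interface IngestionTask {
  source: string;
  url: string;
}

// Loosely mirrors a Cloudflare Queues producer binding (whose send() actually
// returns a Promise); simplified to a synchronous call for this demo.
interface TaskQueue {
  send(task: IngestionTask): void;
}

// Enqueue one task per collected item; a queue consumer processes them later.
function enqueueCollectedItems(source: string, urls: string[], queue: TaskQueue): number {
  for (const url of urls) {
    queue.send({ source, url });
  }
  return urls.length;
}

// In-memory queue standing in for the Worker's queue binding (e.g. env.MY_QUEUE).
const sent: IngestionTask[] = [];
const demoQueue: TaskQueue = {
  send: (task) => {
    sent.push(task);
  },
};

enqueueCollectedItems("github", [
  "https://github.com/openai/openai-node",
  "https://github.com/openai/openai-python",
], demoQueue);

console.log(`${sent.length} tasks enqueued`); // 2 tasks enqueued
```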
License
MIT