OpenAI SDK Knowledge MCP Server (unofficial)
An MCP server that knows the OpenAI API inside and out. 100% TypeScript built with OpenAI Agents SDK, Hono, Cloudflare Workers, and Drizzle ORM. Powered by RAG and ready to answer your technical questions.
Developer Highlights
- Cloudflare stack: Fully leverages Cloudflare Workers, Queues, D1, Vectorize, and AI Gateway.
- Streamable HTTP MCP Server: Compatible with any MCP clients.
- ChatGPT Deep Research connector: Meets ChatGPT's Deep Research connector requirements.
- Always updated: Continuously fetches OpenAI repos and community forums for new content.
- Rapidly built with AI: Developed hand in hand with various AI coding tools.
Streamable HTTP MCP Server
Serve the endpoints for MCP clients via a publicly accessible URL (e.g., ngrok, Cloudflare Tunnel). You can generate an access token on the top page.
For example, you can add this MCP server to Cursor:
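A typical entry in Cursor's `mcp.json` might look like the following sketch; the server name and the token value are placeholders you choose yourself (generate the token on the top page):

```json
{
  "mcpServers": {
    "openai-sdk-knowledge": {
      "url": "https://openai-sdk-knowledge.org/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_TOKEN"
      }
    }
  }
}
```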
Not only Cursor—you can use this MCP server with any other tools supporting MCP server connections.
OpenAI Responses API's Hosted MCP Server Tool
You can pass https://openai-sdk-knowledge.org/mcp along with a valid API token:
Then, you can call the tool in the conversation with the Responses API agent:
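As a sketch, the request body for the Responses API's hosted MCP tool could be assembled like this in TypeScript; the `server_label`, model name, and token are placeholders, and the actual HTTP call is shown only in a comment:

```typescript
// Register this server as a hosted MCP tool for the Responses API.
// The tool uses the Responses API's "mcp" tool type; label and token
// below are placeholders, not values from this project.
const mcpTool = {
  type: "mcp",
  server_label: "openai-sdk-knowledge", // any label you choose
  server_url: "https://openai-sdk-knowledge.org/mcp",
  headers: { Authorization: "Bearer YOUR_API_TOKEN" }, // token from the top page
};

const requestBody = {
  model: "gpt-4.1", // any Responses-API-capable model
  tools: [mcpTool],
  input: "How do I stream responses with the OpenAI Node SDK?",
};

// To send it, POST to the Responses API (or use the official `openai`
// package's client.responses.create with the same body):
// await fetch("https://api.openai.com/v1/responses", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
//   },
//   body: JSON.stringify(requestBody),
// });

console.log(requestBody.tools[0].server_url);
```

The model then decides when to call the server's tools during the conversation; no tool-dispatch code is needed on your side.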
ChatGPT Deep Research MCP Connector
Also, for the ChatGPT Deep Research custom connector, use the same URL. When the ChatGPT server accesses this app's MCP server endpoint, it returns search and fetch tools as well (see the documentation for details).
Run Locally
Requirements: Node.js 22+ and API keys (OpenAI, GitHub)
Once the dev server is running, you can access http://localhost:8787 and see how it works.
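Assuming the standard Wrangler workflow for a Cloudflare Workers project, local setup might look like the following; the repository URL and variable names are assumptions, not the repo's documented steps:

```shell
# Clone and install dependencies (repository URL is a placeholder)
git clone https://github.com/your-org/openai-sdk-knowledge.git
cd openai-sdk-knowledge
npm install

# Provide the required API keys as local dev vars (names are assumptions)
echo 'OPENAI_API_KEY=sk-...' >> .dev.vars
echo 'GITHUB_TOKEN=ghp_...' >> .dev.vars

# Start the local dev server (wrangler serves on port 8787 by default)
npx wrangler dev
```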
Architecture
This app is essentially a simple web app running on Cloudflare Workers. It provides MCP-protocol-compatible server endpoints as well as a web user interface. For the RAG data pipeline, it collects data from its sources, generates asynchronous ingestion tasks, and enqueues them into Cloudflare Queues for background processing.
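As a sketch (not the app's actual code), a producer/consumer pair for such a queue-based pipeline could look like this; the task shape, source list, and processing step are assumptions for illustration, and the Cloudflare Queues bindings are replaced with minimal stand-in interfaces so the example is self-contained:

```typescript
// Minimal sketch of a Workers-style queue pipeline: a producer enqueues
// one ingestion task per source, and a batch consumer processes them.
type IngestTask = { source: "github" | "forum"; url: string };

// Stand-ins for the Cloudflare Queues bindings.
interface Queue<T> { send(msg: T): Promise<void>; }
interface MessageBatch<T> { messages: { body: T; ack(): void }[]; }

const processed: string[] = [];

// Producer: collect sources and enqueue one task per item.
async function enqueueSources(queue: Queue<IngestTask>): Promise<void> {
  const sources: IngestTask[] = [
    { source: "github", url: "https://github.com/openai/openai-node" },
    { source: "forum", url: "https://community.openai.com/latest" },
  ];
  for (const task of sources) await queue.send(task);
}

// Consumer: on Workers this logic would live in the exported `queue()` handler.
async function handleBatch(batch: MessageBatch<IngestTask>): Promise<void> {
  for (const msg of batch.messages) {
    processed.push(msg.body.url); // real code would chunk, embed, and store the content
    msg.ack();
  }
}

// Wire the two together with an in-memory queue for demonstration.
async function demo(): Promise<string[]> {
  const buffer: IngestTask[] = [];
  await enqueueSources({ send: async (m) => { buffer.push(m); } });
  await handleBatch({ messages: buffer.map((body) => ({ body, ack() {} })) });
  return processed;
}

demo().then((p) => console.log(p.length)); // 2
```

Decoupling collection from processing this way keeps the HTTP-facing Worker fast while ingestion runs in the background.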
License
MIT