Why this server?
This server is designed to interact with OpenAI assistants, so it can expose a GPT action as an MCP server, allowing OpenAI assistants to be created and used through the Model Context Protocol.
Why this server?
This server acts as a proxy for any API with an OpenAPI v3.1 specification. If the GPT action is described with an OpenAPI spec (e.g. a Swagger file), this is a great fit, as it will expose the API's operations as MCP tools.
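To make the proxy idea concrete, here is a minimal sketch of an OpenAPI 3.1 document for a hypothetical GPT action, and how a proxy might enumerate its operations to register them as MCP tools. Every path, parameter, and `operationId` below is invented for illustration; the real proxy's registration logic is an assumption.

```python
# Minimal OpenAPI 3.1 description of a hypothetical GPT action.
# All names here are illustrative, not taken from any real API.
spec = {
    "openapi": "3.1.0",
    "info": {"title": "Example GPT Action", "version": "1.0.0"},
    "paths": {
        "/search": {
            "get": {
                "operationId": "search",  # would become the MCP tool name
                "parameters": [
                    {
                        "name": "q",
                        "in": "query",
                        "required": True,
                        "schema": {"type": "string"},
                    }
                ],
                "responses": {"200": {"description": "Search results"}},
            }
        }
    },
}

# Enumerate (path, method, operationId) triples the way a spec-driven
# proxy might when deciding which MCP tools to expose.
tools = [
    (path, method, op["operationId"])
    for path, ops in spec["paths"].items()
    for method, op in ops.items()
]
print(tools)  # [('/search', 'get', 'search')]
```

Because the spec is machine-readable, no per-endpoint glue code is needed: each operation maps to one tool.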
Why this server?
This server provides a way to send requests to various AI models, including OpenAI's. If you are looking for a direct way to use an OpenAI model through MCP, this is a good fit.
Why this server?
Similar to the Unichat MCP Server, this TypeScript implementation allows sending requests to various AI models, including OpenAI's. This server supports both STDIO and SSE transports.
Why this server?
This server provides a way to use OpenAI's models directly from Claude, which can serve as a path to a GPT model via MCP.
Why this server?
This server facilitates interaction with AWS Bedrock-enabled tools using the Model Context Protocol. If the GPT action is deployed as a Bedrock tool, this can be used.
Why this server?
This server enables interaction with remote MCP servers using SSE instead of STDIO. If the GPT action is already exposed by another MCP server, this can be used to make it available remotely.
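The difference between the two transports is the framing: STDIO exchanges newline-delimited JSON over pipes, while SSE streams text events over HTTP. As a rough sketch, parsing an SSE stream looks like the following; the `endpoint` event and session URL shown are hypothetical examples of the kind of handshake an SSE-based MCP server might send, and real SSE also handles comment lines, `id`/`retry` fields, and incremental chunks that this toy parser ignores.

```python
def parse_sse(stream: str):
    """Very simplified SSE parser: yields (event, data) pairs.

    Events are blank-line-separated blocks of "field: value" lines.
    This sketch only handles the "event" and "data" fields.
    """
    for block in stream.split("\n\n"):
        event, data = "message", []  # per the SSE spec, default event type is "message"
        for line in block.splitlines():
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data.append(line[len("data:"):].strip())
        if data:
            yield event, "\n".join(data)

# Hypothetical first event from an SSE MCP server telling the client
# where to POST its messages (the sessionId is invented):
raw = "event: endpoint\ndata: /messages?sessionId=abc123\n\n"
print(list(parse_sse(raw)))  # [('endpoint', '/messages?sessionId=abc123')]
```

Because SSE runs over plain HTTP, a remote server can be reached anywhere a URL can, which is what makes this proxying approach possible.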
Why this server?
This is an MCP server for sharing context across different AI models. If the GPT action is part of an agentic setup, this could be helpful.
Why this server?
This MCP server uses the Pera1 service to extract code from GitHub repositories. If the GPT action is part of a codebase on GitHub, this might help to quickly expose its functionality.
Why this server?
This server integrates Claude with any OpenAI SDK-compatible chat completion API, including OpenAI, Perplexity, Groq, xAI, and PyroPrompts.
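"OpenAI SDK compatible" means the request shape stays the same across providers, so only the base URL and model name change. A minimal sketch of building such a request (the helper function is invented for illustration, and only the OpenAI base URL shown is a known value; other providers publish their own):

```python
import json


def chat_request(base_url: str, model: str, prompt: str) -> tuple[str, str]:
    """Build the endpoint URL and JSON body for one chat completion call.

    Hypothetical helper: any provider mirroring the OpenAI chat
    completions API accepts this same body at its own base URL.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return f"{base_url}/chat/completions", json.dumps(body)


# Swapping providers changes only base_url and model, not the body shape:
url, body = chat_request("https://api.openai.com/v1", "gpt-4o", "hi")
print(url)  # https://api.openai.com/v1/chat/completions
```

This is why a single proxy server can front many model vendors: the compatibility contract lives in the request format, not in provider-specific client code.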