any-chat-completions-mcp MCP Server
Integrate Claude with Any OpenAI SDK Compatible Chat Completion API - OpenAI, Perplexity, Groq, xAI, PyroPrompts and more.
This implements a Model Context Protocol (MCP) server. Learn more: https://modelcontextprotocol.io

This is a TypeScript-based MCP server that integrates with any OpenAI SDK compatible Chat Completions API. It exposes a single tool, `chat`, which relays a question to the configured AI chat provider.
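Conceptually, the relay is just the official `openai` npm client pointed at a configurable base URL. A minimal sketch (the `AI_CHAT_*` environment variable names here are illustrative, not necessarily the exact ones the server reads):

```typescript
import OpenAI from "openai";

// Point the OpenAI SDK at any compatible provider via its base URL.
// The AI_CHAT_* names are illustrative placeholders for the server's config.
const client = new OpenAI({
  apiKey: process.env.AI_CHAT_KEY,
  baseURL: process.env.AI_CHAT_BASE_URL, // e.g. "https://api.perplexity.ai"
});

// The `chat` tool forwards a question and returns the provider's reply.
async function chat(question: string): Promise<string> {
  const completion = await client.chat.completions.create({
    model: process.env.AI_CHAT_MODEL ?? "gpt-4o",
    messages: [{ role: "user", content: question }],
  });
  return completion.choices[0]?.message?.content ?? "";
}
```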
Development
Install dependencies:
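```sh
npm install
```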
Build the server:
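```sh
npm run build
```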
For development with auto-rebuild:
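```sh
npm run watch
```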
Installation
To add OpenAI to Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
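For example (a sketch: the build path and API key are placeholders, and the `AI_CHAT_*` variable names illustrate how the provider name, model, and base URL are configured):

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": ["/path/to/any-chat-completions-mcp/build/index.js"],
      "env": {
        "AI_CHAT_KEY": "your-openai-key",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```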
You can add multiple providers by referencing the same MCP server multiple times, but with different env arguments:
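For example, with OpenAI, Perplexity, and Groq (keys and paths are placeholders; the Perplexity and Groq model names and base URLs are examples, so check each provider's docs):

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": ["/path/to/any-chat-completions-mcp/build/index.js"],
      "env": {
        "AI_CHAT_KEY": "your-openai-key",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    },
    "chat-perplexity": {
      "command": "node",
      "args": ["/path/to/any-chat-completions-mcp/build/index.js"],
      "env": {
        "AI_CHAT_KEY": "your-perplexity-key",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "llama-3.1-sonar-small-128k-online",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "chat-groq": {
      "command": "node",
      "args": ["/path/to/any-chat-completions-mcp/build/index.js"],
      "env": {
        "AI_CHAT_KEY": "your-groq-key",
        "AI_CHAT_NAME": "Groq",
        "AI_CHAT_MODEL": "llama-3.1-8b-instant",
        "AI_CHAT_BASE_URL": "https://api.groq.com/openai/v1"
      }
    }
  }
}
```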
With these three configured, you'll see a separate tool for each provider in the Claude Desktop Home, and you can chat with the other LLMs directly, with their replies appearing inline in the conversation.
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
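```sh
npm run inspector
```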
The Inspector will provide a URL to access debugging tools in your browser.
Acknowledgements
- Obviously the modelcontextprotocol and Anthropic team for the MCP Specification and integration into Claude Desktop. https://modelcontextprotocol.io/introduction
- PyroPrompts for sponsoring this project. Use code `CLAUDEANYCHAT` for 20 free automation credits on PyroPrompts.