Provides built-in integration with OpenAI models for answering user queries, automatically calling tools defined in the MCP server whenever the language model needs them.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@MCP-123 create a tool that calculates the average of three numbers".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
🚀 Features
Ultra-minimal setup: Start a server or client in 2 lines.
Easy tool creation: Write normal functions in your tools.py file (no decorators or special wrappers needed) and they are automatically included as tools your MCP server can use.
OpenAI integration: The client uses your OpenAI API key to answer questions, calling tools as needed.
Related MCP server: MCP Python Tutorial
🖥️ Quickstart
1. Install Requirements
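The exact install command isn't shown in this section; assuming a standard Python setup with a requirements.txt in the repository root (an assumption, adjust to the actual repo layout), installation would be:

```bash
# Assumed: requirements.txt lists the project's dependencies (e.g. fastmcp, openai).
pip install -r requirements.txt
```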
2. Create Your Tools
Define your functions in tools.py. No decorators are needed; every top-level function is automatically added to your MCP server as a tool. For example:
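A minimal tools.py might look like this (the function names below are illustrative, not part of the project):

```python
# tools.py: every top-level function defined here is exposed as an MCP tool.

def add(a: float, b: float) -> float:
    """Return the sum of two numbers."""
    return a + b

def average(a: float, b: float, c: float) -> float:
    """Return the average of three numbers."""
    return (a + b + c) / 3
```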
3. Start the MCP Server (2 lines)
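The project's actual import path and function names aren't shown in this section, so the snippet below is a hypothetical sketch of what the 2-line start could look like:

```python
from mcp123 import server          # hypothetical import path

server.run("tools.py", port=8000)  # assumed signature: serve tools.py over HTTP
```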
4. Set up the MCP Client (2 lines)
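Continuing the hypothetical API from step 3, a 2-line client setup, assuming the client takes the server URL and reads OPENAI_API_KEY from the environment:

```python
from mcp123 import client                      # hypothetical import path

c = client.MCPClient("http://localhost:8000")  # assumed constructor; uses OPENAI_API_KEY
```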
5. Use the MCP Client
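Again with an assumed ask method (the real method name may differ):

```python
# The client sends the prompt to OpenAI and calls server tools as needed.
answer = c.ask("What is the average of 3, 7, and 11?")
print(answer)
```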
6. Close the MCP Client when you are done
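```python
c.close()  # assumed method; releases the connection to the server
```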
📝 How It Works
Server: Loads all top-level functions from tools.py and exposes them as MCP tools via HTTP.
Client: Discovers available tools, sends prompts to OpenAI, and automatically calls tools if needed.
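The auto-discovery step can be pictured as plain module introspection. A minimal sketch of how it could work (not the project's actual code), using importlib and inspect:

```python
import importlib.util
import inspect

def load_tools(path: str = "tools.py") -> dict:
    """Collect every top-level function defined in the given module."""
    spec = importlib.util.spec_from_file_location("tools", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return {
        name: fn
        for name, fn in inspect.getmembers(module, inspect.isfunction)
        if fn.__module__ == "tools"  # skip functions imported from elsewhere
    }
```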
🛠️ Example Output
When you run the client, you’ll see:
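The exact log format depends on the implementation; an illustrative run (tool names and values hypothetical) might look like:

```
Discovered tools: ['add', 'average']
Prompt: What is the average of 3, 7, and 11?
Calling tool 'average' with arguments {'a': 3, 'b': 7, 'c': 11}
Answer: The average of 3, 7, and 11 is 7.
```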
🔑 Requirements
Python 3.11+
OpenAI API key (for the client)
📢 Why MCP123?
Zero boilerplate: No need to write schemas or wrappers—just write functions.
LLM-native: Designed for seamless LLM tool use.
Extensible: Add more tools by simply adding functions.
🤝 Credits
Built with FastMCP
Inspired by the Model Context Protocol (MCP)
📬 Feedback & Contributions
Pull requests and issues are welcome!