Provides built-in integration with OpenAI models for answering questions and processing user queries, allowing tools defined in the MCP server to be automatically called when needed by the language model.
🚀 Features
- Ultra-minimal setup: Start a server or client in 2 lines.
- Easy tool creation: Write normal functions in your `tools.py` file (no decorators or special wrappers needed) and they are automatically included as tools your MCP server can use.
- OpenAI integration: The client uses your OpenAI API key to answer questions, calling tools as needed.
🖥️ Quickstart
1. Install Requirements
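A rough sketch of the install step; the PyPI package name below is an assumption, so substitute the actual package name or install from a cloned copy of the repository:

```bash
# Assumption: the package is published on PyPI as "mcp123"; adjust if you install from source.
pip install mcp123

# Or, from a cloned copy of the repository:
pip install -r requirements.txt
```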
2. Create Your Tools
Define your functions in `tools.py`. No decorators are needed; every top-level function is automatically added to your MCP server as a tool. For example:
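The function names below are purely illustrative; any plain functions of your own will do:

```python
# tools.py — plain functions, no decorators; each top-level function becomes an MCP tool.

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


def reverse_text(text: str) -> str:
    """Reverse a string."""
    return text[::-1]
```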
3. Start the MCP Server (2 lines)
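A minimal sketch of what those two lines might look like. The `mcp123.server` module, the `run()` function, its arguments, and the default port are assumptions about the package's API, not documented names:

```python
from mcp123 import server                 # assumed import path

server.run("tools.py", port=8000)          # assumed signature: load tools.py and expose its functions over HTTP
```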
4. Set up the MCP Client (2 lines)
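Again as a sketch: this assumes a `Client` class that takes the server URL and your OpenAI API key; the import path and constructor arguments are guesses, not the documented API:

```python
from mcp123 import Client                                            # assumed import path

client = Client("http://localhost:8000", openai_api_key="sk-...")    # assumed constructor; URL matches the server sketch above
```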
5. Use the MCP Client
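A hypothetical prompt round-trip; the `ask()` method name is an assumption:

```python
answer = client.ask("What is 2 + 3?")   # assumed method; the model may decide to call the add() tool from tools.py
print(answer)
```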
6. Close the MCP Client when you are done
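Assuming a `close()` method that releases the underlying HTTP connection:

```python
client.close()   # assumed cleanup method
```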
📝 How It Works
- Server: Loads all top-level functions from `tools.py` and exposes them as MCP tools over HTTP.
- Client: Discovers the available tools, sends prompts to OpenAI, and automatically calls tools when the model needs them.
🛠️ Example Output
When you run the client, you will see the model's answer printed, along with any tool calls it made along the way.
🔑 Requirements
- Python 3.11+
- OpenAI API key (for the client)
📢 Why MCP123?
- Zero boilerplate: No need to write schemas or wrappers—just write functions.
- LLM-native: Designed for seamless LLM tool use.
- Extensible: Add more tools by simply adding functions.
🤝 Credits
- Built with FastMCP
- Inspired by the Model Context Protocol (MCP)
📬 Feedback & Contributions
Pull requests and issues are welcome!