Provides built-in integration with OpenAI models for answering questions and processing user queries, allowing tools defined in the MCP server to be automatically called when needed by the language model.
Features
Ultra-minimal setup: Start a server or client in 2 lines.
Easy tool creation: Write normal functions in your tools.py file, with no decorators or special wrappers needed, and they are automatically included as tools your MCP server can use.
OpenAI integration: The client uses your OpenAI API key to answer questions, calling tools as needed.
Quickstart
1. Install Requirements
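Assuming the package is published on PyPI under the name mcp123 (an assumption based on the project name), installation would be:

```shell
pip install mcp123
```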
2. Create Your Tools
Define your functions in tools.py. No decorators are needed; they are automatically added to your MCP server as tools. For example:
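A minimal tools.py might look like this (the functions below are illustrative examples, not part of MCP123 itself):

```python
# tools.py - plain top-level functions, no decorators or wrappers.
# Each function defined here is picked up by the MCP server as a callable tool.

def add(a: float, b: float) -> float:
    """Add two numbers and return the result."""
    return a + b

def reverse_text(text: str) -> str:
    """Return the input text reversed."""
    return text[::-1]
```

Docstrings and type hints are ordinary Python; an LLM client can use them to decide when and how to call each tool.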
3. Start the MCP Server (2 lines)
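The package's exact entry points aren't shown in this section, so the two lines below are a hypothetical sketch: the import path, function name, and parameters are assumptions, not MCP123's documented API.

```python
from mcp123 import server          # assumed import path
server.run("tools.py", port=8000)  # assumed signature: tools file plus HTTP port
```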
4. Set up the MCP Client (2 lines)
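Again as a hypothetical sketch (the class name and constructor signature are assumptions), connecting a client to the server from step 3 might look like:

```python
from mcp123 import Client                 # assumed import path and class name
client = Client("http://localhost:8000")  # assumed: URL of the server from step 3
```

The OpenAI API key would presumably be read from the OPENAI_API_KEY environment variable or passed in explicitly.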
5. Use the MCP Client
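A hypothetical usage sketch, assuming the client object from step 4 and an assumed ask method:

```python
answer = client.ask("What is 2 + 3?")  # 'ask' is an assumed method name;
print(answer)                          # the model may call a tool such as add() to answer
```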
6. Close the MCP Client when you are done
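Cleanup is sketched the same way; the method name is an assumption:

```python
client.close()  # assumed method; releases the HTTP connection to the server
```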
How It Works
Server: Loads all top-level functions from tools.py and exposes them as MCP tools via HTTP.
Client: Discovers available tools, sends prompts to OpenAI, and automatically calls tools if needed.
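MCP123's internals aren't reproduced here, but the server-side loading step can be illustrated with the standard library. This is a sketch of the general technique, not the package's actual code:

```python
import importlib.util
import inspect

def load_tools(path: str) -> dict:
    """Load a Python file and return its top-level functions as a name->function dict."""
    spec = importlib.util.spec_from_file_location("tools", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    # Keep only functions defined in the module itself, skipping anything imported.
    return {
        name: fn
        for name, fn in inspect.getmembers(module, inspect.isfunction)
        if fn.__module__ == module.__name__
    }
```

Each discovered function can then be registered as an MCP tool, with its signature and docstring forming the tool's schema.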
Example Output
When you run the client, you'll see:
Requirements
Python 3.11+
OpenAI API key (for the client)
Why MCP123?
Zero boilerplate: No need to write schemas or wrappers; just write functions.
LLM-native: Designed for seamless LLM tool use.
Extensible: Add more tools by simply adding functions.
Credits
Built with FastMCP
Inspired by the Model Context Protocol (MCP)
Feedback & Contributions
Pull requests and issues are welcome.