🚀 Features

  • Ultra-minimal setup: Start a server or client in 2 lines.

  • Easy tool creation: Write normal functions in your tools.py file—no decorators or special wrappers needed—and they get included as tools that your MCP server can use automatically.

  • OpenAI integration: The client uses your OpenAI API key to answer questions, calling tools as needed.


🖥️ Quickstart

1. Install Requirements

pip install -r requirements.txt

2. Create Your Tools

Define your functions in tools.py. No decorators are needed; every top-level function is automatically added to your MCP server as a tool. For example:

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b
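
A tools.py with several tools is just several functions. This sketch assumes only what the README states (top-level functions with type hints and docstrings become tools); the `shout` helper below is an illustrative example, not part of mcp123:

```python
# Illustrative tools.py sketch: each top-level function becomes one tool.
# `shout` is a made-up example tool, not part of mcp123 itself.

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

def shout(text: str) -> str:
    """Upper-case a string."""
    return text.upper()

print(add(15, 14))    # 29
print(shout("mcp"))   # MCP
```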

3. Start the MCP Server (2 lines)

from mcp123 import server
server.run_server("tools.py", port=9999)

4. Set up the MCP Client (2 lines)

from mcp123.client import McpClient
client = McpClient("http://localhost:9999", "sk-...your OpenAI key...")

5. Use the MCP Client

answer = client.ask("Add 15 and 14.")
print("Answer:", answer)

6. Close the MCP Client when you are done

client.close()


📝 How It Works

  • Server: Loads all top-level functions from tools.py and exposes them as MCP tools via HTTP.

  • Client: Discovers available tools, sends prompts to OpenAI, and automatically calls tools if needed.
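
The two bullets above can be sketched in plain Python. This is an assumption about the mechanism, not the actual mcp123 implementation: the server side collects top-level functions from a module, and the client side simply calls whichever function the LLM selects with the arguments it supplies.

```python
# Sketch (hypothetical, not mcp123's real code): discovering top-level
# functions in a tools module and dispatching a tool call to one of them.
import inspect
import types

def discover_tools(module: types.ModuleType) -> dict:
    """Collect every function defined directly in the given module."""
    return {
        name: fn
        for name, fn in vars(module).items()
        if inspect.isfunction(fn) and fn.__module__ == module.__name__
    }

# Build a throwaway "tools" module to stand in for tools.py.
tools_mod = types.ModuleType("tools")
exec(
    "def add(a: int, b: int) -> int:\n"
    '    """Add two numbers."""\n'
    "    return a + b\n",
    tools_mod.__dict__,
)

tools = discover_tools(tools_mod)
print(sorted(tools))  # ['add']

# Dispatch step: once the LLM has chosen a tool name and arguments,
# the client just calls the matching function.
result = tools["add"](**{"a": 15, "b": 14})
print(result)  # 29
```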


🛠️ Example Output

When you run the client, you’ll see:

Tools discovered: [ ...list of tools... ]
Answer: 29

🔑 Requirements

  • Python 3.11+

  • OpenAI API key (for the client)


📢 Why MCP123?

  • Zero boilerplate: No need to write schemas or wrappers—just write functions.

  • LLM-native: Designed for seamless LLM tool use.

  • Extensible: Add more tools by simply adding functions.


🤝 Credits

  • Built with FastMCP

  • Inspired by the Model Context Protocol (MCP)


📬 Feedback & Contributions

Pull requests and issues are welcome, but only if they are in ALL-CAPS.

