MCP-123

by Tylersuard

🚀 Features

  • Ultra-minimal setup: Start a server or client in 2 lines.
  • Easy tool creation: Write normal functions in your tools.py file—no decorators or special wrappers needed—and they are automatically exposed as tools on your MCP server.
  • OpenAI integration: The client uses your OpenAI API key to answer questions, calling tools as needed.

🖥️ Quickstart

1. Install Requirements

pip install -r requirements.txt

2. Create Your Tools

Define your functions in tools.py. No decorators are needed; every top-level function is automatically added to your MCP server as a tool. For example:

```python
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b
```
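A tools.py file can hold any number of such functions. This hypothetical example adds two more tools; the type hints and docstrings are presumably what the server uses to describe each tool to the LLM, so it is worth including them:

```python
# tools.py -- hypothetical example; every top-level function becomes a tool.

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

def greet(name: str) -> str:
    """Return a friendly greeting."""
    return f"Hello, {name}!"
```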

3. Start the MCP Server (2 lines)

```python
from mcp123 import server
server.run_server("tools.py", port=9999)
```

4. Set up the MCP Client (2 lines)

```python
from mcp123.client import McpClient
client = McpClient("http://localhost:9999", "sk-...your OpenAI key...")
```

5. Use the MCP Client

```python
answer = client.ask("Add 15 and 14.")
print("Answer:", answer)
```

6. Close the MCP Client when you are done

```python
client.close()
```


📝 How It Works

  • Server: Loads all top-level functions from tools.py and exposes them as MCP tools via HTTP.
  • Client: Discovers available tools, sends prompts to OpenAI, and automatically calls tools if needed.
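The server-side discovery step can be pictured with the standard library's inspect module. This is only a sketch of the idea, not MCP-123's actual implementation:

```python
# Sketch: collect every top-level function defined in a module file,
# the way a server like MCP-123 might discover candidate tools.
# Hypothetical illustration, not the package's real code.
import importlib.util
import inspect

def load_tools(path: str) -> dict:
    """Load the module at `path` and return its top-level functions by name."""
    spec = importlib.util.spec_from_file_location("tools", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return {
        name: fn
        for name, fn in inspect.getmembers(module, inspect.isfunction)
        if fn.__module__ == module.__name__  # skip functions imported from elsewhere
    }
```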

🛠️ Example Output

When you run the client, you’ll see:

```
Tools discovered: [ ...list of tools... ]
Answer: 29
```

🔑 Requirements

  • Python 3.11+
  • OpenAI API key (for the client)

📢 Why MCP123?

  • Zero boilerplate: No need to write schemas or wrappers—just write functions.
  • LLM-native: Designed for seamless LLM tool use.
  • Extensible: Add more tools by simply adding functions.

🤝 Credits

  • Built with FastMCP
  • Inspired by the Model Context Protocol (MCP)

📬 Feedback & Contributions

Pull requests and issues are welcome, but only if they are in ALL-CAPS.



