Python MCP Server

This is a simple demonstration of a fastmcp server and client.

Setup and Installation (with uv)

This project uses uv for environment and dependency management.

  1. Create the virtual environment:

    uv venv
  2. Activate the virtual environment:

    source .venv/bin/activate
  3. Install the dependencies:

    uv pip sync pyproject.toml
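Step 3 installs exactly the dependencies declared in `pyproject.toml`. For orientation, a minimal `pyproject.toml` for this project might look like the following; the project name and version pins are illustrative assumptions, not the repository's actual file:

```toml
[project]
name = "python-mcp-server"
version = "0.1.0"
requires-python = ">=3.10"
# fastmcp is the only runtime dependency this demo needs (assumed pin).
dependencies = ["fastmcp>=2.0"]
```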

Usage (with uv)

Make sure your virtual environment is activated before running these commands.

  1. Run the MCP server:

    uv run python server.py

    The server will start on localhost:8000.

Usage for gemini

  1. Configure the Gemini CLI by adding the following configuration to ~/.gemini/settings.json:

    {
      "security": {
        "auth": { "selectedType": "oauth-personal" }
      },
      "mcpServers": {
        "serverName": {
          "command": "/home/<USER>/.local/bin/uv",
          "args": [
            "--directory",
            "/home/<USER>/code/python-mcp-server",
            "run",
            "server.py"
          ]
        }
      }
    }

    Note that the home directory must be spelled out in full; it cannot be shortened to "~".

  2. The server does not need to be running; simply run the Gemini CLI:

    gemini

    From there, you can verify that the MCP server has been picked up by running /mcp list. You should see the tools listed, similar to this:

    ℹ Configured MCP servers:

    🟢 serverName - Ready (2 tools)
       Tools:
       - addNumbers
       - greet

    💡 Tips:
       • Use /mcp desc to show server and tool descriptions
       • Use /mcp schema to show tool parameter schemas
       • Use /mcp nodesc to hide descriptions
       • Use /mcp auth <server-name> to authenticate with OAuth-enabled servers
       • Press Ctrl+T to toggle tool descriptions on/off

    In this example you will be able to ask Gemini to greet you or add numbers, as long as you provide enough context to fill in the arguments. If you do not provide arguments, Gemini will ask for more information.

    Once all the arguments are provided, Gemini asks for confirmation before the MCP tool is invoked, and only then executes the logic.


Local-only server

The server can only run on the client's local machine because it depends on local resources.

A demonstration MCP server built with fastmcp that provides basic utility tools including number addition and greeting functionality. Integrates with Gemini CLI to enable natural language interaction with simple mathematical and greeting operations.
