FastAPI CRUD MCP

by brunolnetto

A minimal CRUD API for “items,” built with FastAPI and exposed as MCP tools via FastAPI-MCP. Includes a scenario-driven client harness using PydanticAI and Rich.


🚀 Features

  • FastAPI: high-performance HTTP API
  • SQLAlchemy + Pydantic: ORM models + input/output schemas
  • FastAPI-MCP: auto-expose your endpoints as MCP tools (/mcp/tools, /mcp/events)
  • Rich CLI: beautiful, colored terminal output for scenario runs
  • Scenario Runner: client harness that drives and validates your API via PydanticAI agents
  • SQLite backend for demo; easily swap to PostgreSQL, MySQL, etc.

📦 Project Layout

.
├── backend
│   ├── server
│   │   ├── main.py        # FastAPI + FastAPI-MCP wiring
│   │   ├── models.py      # SQLAlchemy + Pydantic schemas
│   │   ├── routes.py      # CRUD endpoints
│   │   ├── crud.py        # DB operations
│   │   ├── db.py          # session & engine
│   │   └── logger.py      # stdlib logging setup
│   └── client
│       ├── scenarios.py   # Scenario definitions
│       └── main.py        # run_scenarios.py harness
├── .env                   # example environment variables
├── pyproject.toml         # Project dependencies
└── README.md              # this file

⚙️ Installation & Setup

  1. Clone & enter directory
    git clone https://github.com/yourusername/fastapi-crud-mcp.git
    cd fastapi-crud-mcp
  2. Create & activate a virtualenv
    uv venv
    source .venv/bin/activate
  3. Install dependencies
    uv sync
  4. Environment variables: copy the example and adjust if needed:
    cp .env.example .env
    MCP_HOST_URL='http://127.0.0.1:8000/mcp'
    LLM_PROVIDER='openai'
    LLM_MODEL_NAME='gpt-4o-mini'
    LLM_MODEL=${LLM_PROVIDER}:${LLM_MODEL_NAME}
    OPENAI_API_KEY=sk-proj-your-api-key-here

🏃 Running the Server

docker compose up -d --build
  • API docs: http://localhost:8000/docs
  • OpenAPI JSON: http://localhost:8000/openapi.json

🤖 Running the Scenario Client

python3 -m backend.client.main

This harness will:

  1. Load your .env settings
  2. Spin up a PydanticAI agent against MCP_HOST_URL
  3. Execute each scenario (create/list/get/update/delete)
  4. Display rich panels for prompts & outputs
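The steps above suggest a simple scenario shape. The following is an illustrative sketch only (the `Scenario` fields, `SCENARIOS` entries, and `run_all` helper are assumptions, not the contents of backend/client/scenarios.py):

```python
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Scenario:
    """One CRUD exercise the agent should perform (illustrative shape)."""
    name: str
    prompt: str
    # Validation applied to the agent's final text output.
    check: Callable[[str], bool] = field(default=lambda _: True)


# Hypothetical entries mirroring the create/list/get/update/delete flow.
SCENARIOS = [
    Scenario("create", "Create an item named 'widget'.",
             check=lambda out: "widget" in out.lower()),
    Scenario("list", "List all items."),
    Scenario("delete", "Delete the item named 'widget'."),
]


def run_all(agent_call: Callable[[str], str]) -> dict[str, bool]:
    """Drive each scenario through the agent and record pass/fail."""
    return {s.name: s.check(agent_call(s.prompt)) for s in SCENARIOS}
```

In the real harness, `agent_call` would be the PydanticAI agent invoking MCP tools at MCP_HOST_URL; structuring scenarios as data makes adding a new one a one-line change.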

🚨 Notes & Tips

  • Switch DB: edit backend/server/db.py for PostgreSQL or MySQL.
  • Add auth: protect /mcp or /api via FastAPI dependencies.
  • Extend scenarios: drop new entries into backend/client/scenarios.py.
  • Production: add Alembic for migrations, and monitor with Prometheus.
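On the "switch DB" tip: an env-driven engine makes the swap a configuration change rather than a code edit. A minimal sketch of what db.py could look like (the real file may hard-code the URL instead; `DATABASE_URL` is an assumed variable name):

```python
import os

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Point DATABASE_URL at e.g. postgresql+psycopg://user:pass@host/db to swap
# backends; the SQLite file below is only the demo default.
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./items.db")

engine = create_engine(
    DATABASE_URL,
    # SQLite connections need this flag when shared across FastAPI's threads.
    connect_args={"check_same_thread": False}
    if DATABASE_URL.startswith("sqlite")
    else {},
)
SessionLocal = sessionmaker(bind=engine, autoflush=False)
```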

🤝 Contributing

  1. Fork 🔱
  2. Create a feature branch:
    git checkout -b feature/my-feature
  3. Commit & push:
    git commit -am "Add awesome feature"
    git push origin feature/my-feature
  4. Open a PR and we’ll review!

📄 License

This project is MIT-licensed; see the LICENSE file for details.
