[![codecov](https://codecov.io/gh/brunolnetto/fastapi-crud-mcp/graph/badge.svg?token=71M0EVUD98)](https://codecov.io/gh/brunolnetto/fastapi-crud-mcp)
[![Python Version](https://img.shields.io/badge/python-3.8%2B-blue)](https://www.python.org/)
[![License: MIT](https://img.shields.io/badge/license-MIT-green)](LICENSE)

# FastAPI CRUD MCP

A minimal CRUD API for "items," built with FastAPI and exposed as MCP tools via FastAPI-MCP. Includes a scenario-driven client harness using PydanticAI and Rich.

---

## 🚀 Features

- **FastAPI**: high-performance HTTP API
- **SQLAlchemy + Pydantic**: ORM models + input/output schemas
- **FastAPI-MCP**: auto-expose your endpoints as MCP tools (`/mcp/tools`, `/mcp/events`)
- **Rich CLI**: colored, nicely formatted terminal output for scenario runs
- **Scenario Runner**: client harness that drives and validates your API via PydanticAI agents
- **SQLite backend** for the demo; easily swapped for PostgreSQL, MySQL, etc.

---

## 📦 Project Layout

```
.
├── backend
│   ├── server
│   │   ├── main.py        # FastAPI + FastAPI-MCP wiring
│   │   ├── models.py      # SQLAlchemy models + Pydantic schemas
│   │   ├── routes.py      # CRUD endpoints
│   │   ├── crud.py        # DB operations
│   │   ├── db.py          # session & engine
│   │   └── logger.py      # stdlib logging setup
│   └── client
│       ├── scenarios.py   # scenario definitions
│       └── main.py        # run_scenarios.py harness
├── .env                   # example environment variables
├── pyproject.toml         # project dependencies
└── README.md              # this file
```

---

## ⚙️ Installation & Setup

1. **Clone & enter the directory**

   ```bash
   git clone https://github.com/yourusername/fastapi-crud-mcp.git
   cd fastapi-crud-mcp
   ```

2. **Create & activate a virtualenv**

   ```bash
   uv venv
   source .venv/bin/activate
   ```

3. **Install dependencies**

   ```bash
   uv sync
   ```
4. **Environment variables**

   Copy the example file and adjust as needed:

   ```bash
   cp .env.example .env
   ```

   ```env
   MCP_HOST_URL='http://127.0.0.1:8000/mcp'
   LLM_PROVIDER='openai'
   LLM_MODEL_NAME='gpt-4o-mini'
   LLM_MODEL=${LLM_PROVIDER}:${LLM_MODEL_NAME}
   OPENAI_API_KEY=sk-proj-your-api-key-here
   ```

---

## 🏃 Running the Server

```bash
docker compose up -d --build
```

* **API docs** → `http://localhost:8000/docs`
* **OpenAPI JSON** → `http://localhost:8000/openapi.json`

---

## 🤖 Running the Scenario Client

```bash
python3 -m backend.client.main
```

This harness will:

1. Load your `.env` settings
2. Spin up a PydanticAI agent against `MCP_HOST_URL`
3. Execute each scenario (create/list/get/update/delete)
4. Display Rich panels for prompts & outputs

---

## 🚨 Notes & Tips

* **Switch DB**: edit `backend/server/db.py` for PostgreSQL or MySQL.
* **Add auth**: protect `/mcp` or `/api` via FastAPI dependencies.
* **Extend scenarios**: drop new entries into `backend/client/scenarios.py`.
* **Production**: add Alembic for migrations and monitor with Prometheus.

---

## 🤝 Contributing

1. Fork 🔱
2. Create a feature branch:

   ```bash
   git checkout -b feature/my-feature
   ```

3. Commit & push:

   ```bash
   git commit -am "Add awesome feature"
   git push origin feature/my-feature
   ```

4. Open a PR and we'll review!

---

## 📄 License

This project is MIT licensed; see the [LICENSE](LICENSE) file for details.
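The "Switch DB" tip amounts to changing the SQLAlchemy engine URL in `backend/server/db.py`. A minimal sketch of what that module might look like, assuming a standard engine-plus-`sessionmaker` setup (the names `DATABASE_URL`, `SessionLocal`, and `get_db` are illustrative, not necessarily the repo's actual ones):

```python
# Sketch of backend/server/db.py; names here are illustrative.
import os

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Default to the demo SQLite file; override via env to switch backends,
# e.g. DATABASE_URL=postgresql+psycopg2://user:pass@localhost:5432/items
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./items.db")

# SQLite needs check_same_thread=False when the session is shared
# across FastAPI's threads; other backends don't take this argument.
connect_args = (
    {"check_same_thread": False} if DATABASE_URL.startswith("sqlite") else {}
)

engine = create_engine(DATABASE_URL, connect_args=connect_args)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def get_db():
    """FastAPI dependency: yield a session and always close it afterwards."""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
```

With this shape, swapping to PostgreSQL is a one-line environment change plus installing the driver (`psycopg2-binary`); no endpoint code needs to move.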
