Backend Architect MCP Server

An expert MCP toolchain designed to act as a Backend Architect for AI agents. This server enforces a strict "Atomic Development" workflow for building Python FastAPI + Supabase backends.

🚀 Overview

The Backend Architect server guides an agent through a Plan → Prompt → Write loop, ensuring that database models, API routes, and tests are built in the correct dependency order.

Key Features

  • Atomic Development: Focuses on one component at a time.

  • Workflow Enforcement: Models → Routes → Tests (respects model dependencies).

  • Auto-Imports: Automatically updates __init__.py files for models and routes.

  • State Persistence: Maintains .mcp_state.json to track building progress (see the sketch after this list).

  • Contextual Prompts: Generates specialized system prompts for each component.

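Because the state lives in a plain JSON file, progress can be inspected between sessions. A minimal sketch, assuming the file holds a tasks list with per-task status fields; the real schema is defined by the server and may differ:

```python
import json
from pathlib import Path

# Illustration only: the field names below ("tasks", "status") are assumptions
# about .mcp_state.json; the server defines the actual schema.
state_path = Path(".mcp_state.json")
if state_path.exists():
    state = json.loads(state_path.read_text())
    pending = [t for t in state.get("tasks", []) if t.get("status") != "done"]
    print(f"{len(pending)} task(s) still pending")
```
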
🛠️ Tech Stack

  • Python 3.12

  • MCP SDK (FastMCP)

  • UV (Dependency Manager)

  • Pydantic (State Validation)

📦 Installation

Ensure you have uv installed. Then, clone the repository and install dependencies:

```bash
# Clone the repository
cd mcp_fastapi

# Install dependencies and run the server
uv run server.py
```

🛠️ Tools Reference

1. Initialization

  • initialize_project(root_path: str = "."): Scaffolds the FastAPI project structure and pyproject.toml. Defaults to the current working directory.

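For reference, this tool can be invoked like any other MCP tool from a Python client. A minimal sketch using the official mcp SDK's stdio client; the /path/to/server/directory and ./my_backend values are placeholders:

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server over stdio (adjust --project to where you cloned the repo).
server_params = StdioServerParameters(
    command="uv",
    args=["run", "--project", "/path/to/server/directory", "python", "server.py"],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Scaffold the FastAPI project in ./my_backend
            result = await session.call_tool(
                "initialize_project", {"root_path": "./my_backend"}
            )
            print(result.content)

asyncio.run(main())
```
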
2. Planning

  • save_roles_plan(roles: list): Define user roles and permissions.

  • save_database_plan(models: list): Define SQLModel schemas and relationships.

  • save_route_plan(routes: list): Define API endpoints and methods.

  • save_test_plan(tests: list): Define simulation scenarios.

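A rough sketch of what the plan payloads could look like when called from a client session (see the Initialization example above). The field names inside each dictionary are assumptions; the server defines the concrete schemas it accepts:

```python
# Illustrative payloads only; the exact fields each planning tool expects
# are determined by the server.
roles = [
    {"name": "admin", "permissions": ["read", "write", "delete"]},
    {"name": "user", "permissions": ["read"]},
]
models = [
    {"name": "User", "fields": {"id": "int", "email": "str"}, "depends_on": []},
    {"name": "Post", "fields": {"id": "int", "title": "str", "author_id": "int"},
     "depends_on": ["User"]},
]
routes = [
    {"path": "/posts", "method": "GET", "model": "Post"},
    {"path": "/posts", "method": "POST", "model": "Post", "roles": ["admin"]},
]

# Inside an active ClientSession:
# await session.call_tool("save_roles_plan", {"roles": roles})
# await session.call_tool("save_database_plan", {"models": models})
# await session.call_tool("save_route_plan", {"routes": routes})
```
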
3. Execution

  • get_next_pending_task(): The "Traffic Cop" that tells you exactly what to build next.

  • get_file_instruction(task_type: str, task_name: str): Returns a strict system prompt for the AI to follow.

  • write_component_file(type: str, name: str, content: str): Writes the code and marks the task as "done".

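A sketch of one pass through the execution tools from a client session. The "model"/"User" values and the shape of the returned content are illustrative assumptions:

```python
# Runs inside an active ClientSession (see the Initialization example above).
async def build_one_component(session, generated_code: str) -> None:
    # 1. Ask the "Traffic Cop" what to build next
    task = await session.call_tool("get_next_pending_task", {})
    print(task.content)

    # 2. Fetch the strict system prompt for that component
    prompt = await session.call_tool(
        "get_file_instruction", {"task_type": "model", "task_name": "User"}
    )
    print(prompt.content)

    # 3. Submit the code the agent produced from that prompt
    await session.call_tool(
        "write_component_file",
        {"type": "model", "name": "User", "content": generated_code},
    )
```
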
🔄 The Loop

  1. Initialize: Set up your project root.

  2. Plan: Feed the architect your schemas and endpoints.

  3. Draft: Ask get_next_pending_task() for the current objective.

  4. Learn: Get instructions via get_file_instruction().

  5. Write: Submit code via write_component_file().

  6. Repeat: Until the entire backend is architected.

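Put together, the loop can be driven until the server has nothing left to hand out. A sketch only: the JSON task payload and the completion check are assumptions, and generate_code stands in for whatever turns a system prompt into source code:

```python
import json

async def run_architect(session, generate_code) -> None:
    while True:
        task = await session.call_tool("get_next_pending_task", {})
        text = task.content[0].text if task.content else ""
        if not text.strip().startswith("{"):
            break  # assumption: a non-JSON reply means nothing is pending
        info = json.loads(text)  # assumption: the task arrives as JSON text
        prompt = await session.call_tool(
            "get_file_instruction",
            {"task_type": info["type"], "task_name": info["name"]},
        )
        code = generate_code(prompt.content[0].text)
        await session.call_tool(
            "write_component_file",
            {"type": info["type"], "name": info["name"], "content": code},
        )
```
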
⚙️ MCP Configuration

Add this to your MCP settings file (e.g., mcp_config.json or your IDE's MCP settings):

```json
{
  "mcpServers": {
    "backend-architect": {
      "command": "uv",
      "args": [
        "run",
        "--project",
        "/path/to/server/directory",
        "python",
        "server.py"
      ]
    }
  }
}
```
TIP

Use the absolute path to the directory where you cloned this repository for the --project argument. This ensures the server can find its dependencies regardless of where your AI agent is currently working.


Built with ❤️ for the AI-First Developer.
