MCP Chat Backend

This project is a serverless FastAPI backend for a chatbot that generates and executes SQL queries on a Postgres database using OpenAI's GPT models, then returns structured, UI-friendly responses. It is designed to run on AWS Lambda via AWS SAM, but can also be run locally or in Docker.

Features

  • FastAPI REST API with a single /ask endpoint

  • Uses OpenAI GPT models to generate SQL queries and summarize their results

  • Connects to a Postgres (Supabase) database

  • Returns structured JSON responses for easy frontend rendering

  • CORS enabled for frontend integration

  • Deployable to AWS Lambda (SAM), or run locally/Docker

  • Verbose logging for debugging (CloudWatch)
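
At a high level, the /ask flow is: generate SQL from the question, run it against Postgres, then summarize the rows into a structured message list. A minimal sketch of that pipeline, with the LLM and database calls stubbed out as plain callables (the real main.py wires these to OpenAI and the Postgres driver; the names here are illustrative, not the actual API):

```python
from typing import Any, Callable

def answer_question(
    question: str,
    generate_sql: Callable[[str], str],            # e.g. an OpenAI chat call
    run_query: Callable[[str], list[tuple]],       # e.g. a database cursor
    summarize: Callable[[str, list[tuple]], str],  # second LLM call over the rows
) -> dict[str, Any]:
    """Turn a natural-language question into a structured chat response."""
    sql = generate_sql(question)
    rows = run_query(sql)
    text = summarize(question, rows)
    # The frontend renders each message by its "type" field (see API Usage).
    return {"messages": [{"type": "text", "content": text}]}
```

With stub callables, `answer_question("q", lambda q: "SELECT 1", lambda s: [(1,)], lambda q, r: "one row")` returns `{"messages": [{"type": "text", "content": "one row"}]}`.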

Project Structure

├── main.py            # Main FastAPI app and Lambda handler
├── requirements.txt   # Python dependencies
├── template.yaml      # AWS SAM template for Lambda deployment
├── samconfig.toml     # AWS SAM deployment config
├── Dockerfile         # For local/Docker deployment
├── .gitignore         # Files to ignore in git
└── .env               # (Not committed) Environment variables

Setup

1. Clone the repository

git clone <your-repo-url>
cd mcp-chat-3

2. Install Python dependencies

python -m venv .venv
source .venv/bin/activate   # or .venv\Scripts\activate on Windows
pip install -r requirements.txt

3. Set up environment variables

Create a .env file (not committed to git):

OPENAI_API_KEY=your-openai-key
SUPABASE_DB_NAME=your-db
SUPABASE_DB_USER=your-user
SUPABASE_DB_PASSWORD=your-password
SUPABASE_DB_HOST=your-host
SUPABASE_DB_PORT=your-port
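
In Python these variables can be read from os.environ and assembled into a libpq-style connection string. A sketch (the variable names match the .env above; the helper name is illustrative):

```python
import os

def build_dsn(env=os.environ) -> str:
    """Assemble a Postgres DSN from the SUPABASE_DB_* variables."""
    return (
        f"dbname={env['SUPABASE_DB_NAME']} "
        f"user={env['SUPABASE_DB_USER']} "
        f"password={env['SUPABASE_DB_PASSWORD']} "
        f"host={env['SUPABASE_DB_HOST']} "
        f"port={env['SUPABASE_DB_PORT']}"
    )
```

Passing a plain dict instead of os.environ makes the helper easy to unit-test without touching the process environment.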

Running Locally

With Uvicorn

uvicorn main:app --reload --port 8080

With Docker

docker build -t mcp-chat-backend .
docker run -p 8080:8080 --env-file .env mcp-chat-backend

Deploying to AWS Lambda (SAM)

  1. Install AWS SAM CLI

  2. Build and deploy:

sam build
sam deploy --guided

  • Configure environment variables in template.yaml or via the AWS Console.

  • The API will be available at the endpoint shown after deployment (e.g. https://xxxxxx.execute-api.region.amazonaws.com/Prod/ask).
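
On Lambda, API Gateway delivers the request as a proxy event whose body is a JSON string. main.py exposes the FastAPI app through a Lambda handler; the stripped-down, hand-rolled version below is only to show the event and response shape, not the project's actual adapter:

```python
import json

def lambda_handler(event: dict, context=None) -> dict:
    """Minimal API Gateway proxy handler for POST /ask (illustrative only)."""
    body = json.loads(event.get("body") or "{}")
    question = body.get("question", "")
    # In the real app, this is where SQL generation and execution happen.
    payload = {"messages": [{"type": "text", "content": f"You asked: {question}"}]}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }
```

Note that both the incoming `body` and the outgoing `body` are strings, not dicts; forgetting to `json.dumps` the response body is a common source of 502 errors behind API Gateway.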

API Usage

POST /ask

  • Body: { "question": "your question here" }

  • Response: Structured JSON for chatbot UI, e.g.

{
  "messages": [
    {
      "type": "text",
      "content": "Sample 588 has a resistance of 1.2 ohms.",
      "entity": { "entity_type": "sample", "id": "588" }
    },
    { "type": "list", "items": ["Item 1", "Item 2"] }
  ]
}

  • See main.py for the full schema and more details.
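
Since the frontend dispatches on each message's type field, a small shape check on the client (or in tests) catches malformed responses early. A sketch covering the two message types shown above (field names taken from the example; extend it as main.py's schema grows):

```python
def valid_message(msg: dict) -> bool:
    """Check one message against the structured-response shape above."""
    if msg.get("type") == "text":
        return isinstance(msg.get("content"), str)
    if msg.get("type") == "list":
        items = msg.get("items")
        return isinstance(items, list) and all(isinstance(i, str) for i in items)
    return False  # unknown message type
```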

Environment Variables

  • OPENAI_API_KEY: Your OpenAI API key

  • SUPABASE_DB_NAME, SUPABASE_DB_USER, SUPABASE_DB_PASSWORD, SUPABASE_DB_HOST, SUPABASE_DB_PORT: Your Postgres database credentials

Development Notes

  • All logs are sent to stdout (and CloudWatch on Lambda)

  • CORS is enabled for all origins by default

  • The backend expects the frontend to handle the structured response format

License

MIT (or your license here)

