MCP Chat Backend
This project is a serverless FastAPI backend for a chatbot that generates and executes SQL queries on a Postgres database using OpenAI's GPT models, then returns structured, UI-friendly responses. It is designed to run on AWS Lambda via AWS SAM, but can also be run locally or in Docker.
Features
- FastAPI REST API with a single `/ask` endpoint
- Uses OpenAI GPT models to generate and summarize SQL queries
- Connects to a Postgres (Supabase) database
- Returns structured JSON responses for easy frontend rendering
- CORS enabled for frontend integration
- Deployable to AWS Lambda (SAM), or runnable locally or in Docker
- Verbose logging for debugging (CloudWatch)
Project Structure
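The layout below is a sketch based only on the files referenced in this README; the actual tree may contain additional modules.

```
.
├── main.py           # FastAPI app: the /ask endpoint, SQL generation, response formatting
├── requirements.txt  # Python dependencies (assumed name)
├── Dockerfile        # Container image for Docker runs (assumed)
├── template.yaml     # AWS SAM template for the Lambda deployment
└── .env              # Local environment variables (not committed to git)
```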
Setup
1. Clone the repository
2. Install Python dependencies
3. Set up environment variables: create a `.env` file (not committed to git), as shown below
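The three steps as shell commands; the repository URL is a placeholder and the dependency file name is assumed:

```bash
# 1. Clone the repository (replace the URL and directory with the actual repository)
git clone <your-repo-url>
cd <your-repo-directory>

# 2. Install Python dependencies (assumes a requirements.txt at the repo root)
pip install -r requirements.txt
```

An example `.env` using the variables described under Environment Variables below (all values are placeholders):

```
OPENAI_API_KEY=sk-...
SUPABASE_DB_NAME=postgres
SUPABASE_DB_USER=postgres
SUPABASE_DB_PASSWORD=your-db-password
SUPABASE_DB_HOST=your-project.supabase.co
SUPABASE_DB_PORT=5432
```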
Running Locally
With Uvicorn
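A minimal local run, assuming the FastAPI application object is exposed as `app` in `main.py`:

```bash
# Start a local dev server with auto-reload on port 8000
uvicorn main:app --reload --port 8000
```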
With Docker
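Assuming a `Dockerfile` at the repository root; the image name and port are arbitrary examples:

```bash
# Build the image and run it with the environment variables from .env
docker build -t mcp-chat-backend .
docker run --env-file .env -p 8000:8000 mcp-chat-backend
```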
Deploying to AWS Lambda (SAM)
- Install AWS SAM CLI
- Build and deploy with the SAM CLI (example commands below)
- Configure environment variables in `template.yaml` or via the AWS Console.
- The API will be available at the endpoint shown after deployment (e.g. `https://xxxxxx.execute-api.region.amazonaws.com/Prod/ask`).
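The standard SAM workflow (the guided deploy prompts for stack name, region, and parameters on first run):

```bash
# Package the application and deploy the stack defined in template.yaml
sam build
sam deploy --guided
```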
API Usage
POST /ask
- Body: `{ "question": "your question here" }`
- Response: structured JSON for the chatbot UI.
- See `main.py` for the full response schema and more details.
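For example, calling a locally running server with curl (swap in the API Gateway URL after a Lambda deployment):

```bash
curl -X POST http://localhost:8000/ask \
  -H "Content-Type: application/json" \
  -d '{"question": "your question here"}'
```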
Environment Variables
- `OPENAI_API_KEY`: Your OpenAI API key
- `SUPABASE_DB_NAME`, `SUPABASE_DB_USER`, `SUPABASE_DB_PASSWORD`, `SUPABASE_DB_HOST`, `SUPABASE_DB_PORT`: Your Postgres database credentials
Development Notes
- All logs are sent to stdout (and CloudWatch on Lambda)
- CORS is enabled for all origins by default
- The backend expects the frontend to handle the structured response format
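The all-origins CORS setup typically looks like the following in FastAPI; this is a sketch of the pattern rather than the exact code in `main.py`, and `allow_origins` should be tightened before exposing the API publicly:

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Allow all origins by default; restrict this list in production.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)
```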
License
MIT (or your license here)