🤖 AI Customer Support Bot - MCP Server

Python FastAPI PostgreSQL MCP License

A modern, extensible MCP server framework for building AI-powered customer support systems

Features • Quick Start • API Reference • Architecture • Contributing


🌟 Overview

A Model Context Protocol (MCP) compliant server framework built with modern Python. Designed for developers who want to create intelligent customer support systems without vendor lock-in. Clean architecture, battle-tested patterns, and ready for any AI provider.

```mermaid
graph TB
    Client[HTTP Client] --> API[API Server]
    API --> MW[Middleware Layer]
    MW --> SVC[Service Layer]
    SVC --> CTX[Context Manager]
    SVC --> AI[AI Integration]
    SVC --> DAL[Data Access Layer]
    DAL --> DB[(PostgreSQL)]
```


✨ Features

πŸ—οΈ Clean Architecture
Layered design with clear separation of concerns

πŸ“‘ MCP Compliant
Full Model Context Protocol implementation

πŸ”’ Production Ready
Auth, rate limiting, monitoring included

πŸš€ High Performance
Built on FastAPI with async support

πŸ”Œ AI Agnostic
Integrate any AI provider easily

πŸ“Š Health Monitoring
Comprehensive metrics and diagnostics

πŸ›‘οΈ Secure by Default
Token auth and input validation

πŸ“¦ Batch Processing
Handle multiple queries efficiently

🚀 Quick Start

Prerequisites

  • Python 3.8+

  • PostgreSQL

  • Your favorite AI service (OpenAI, Anthropic, etc.)

Installation

```bash
# Clone and setup
git clone https://github.com/ChiragPatankar/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server

# Create virtual environment
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Setup environment
cp .env.example .env
# Edit .env with your configuration
```

Configuration

```bash
# .env file
DATABASE_URL=postgresql://user:password@localhost/customer_support_bot
SECRET_KEY=your-super-secret-key
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_PERIOD=60
```
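As a minimal sketch of how the server might consume these settings, assuming the variables have been loaded into the process environment (e.g. via python-dotenv in `app.py`):

```python
import os

# Fall back to the documented defaults when a variable is unset
os.environ.setdefault("RATE_LIMIT_REQUESTS", "100")
os.environ.setdefault("RATE_LIMIT_PERIOD", "60")

DATABASE_URL = os.environ.get("DATABASE_URL", "")
RATE_LIMIT_REQUESTS = int(os.environ["RATE_LIMIT_REQUESTS"])
RATE_LIMIT_PERIOD = int(os.environ["RATE_LIMIT_PERIOD"])
```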

Run

```bash
# Setup database
createdb customer_support_bot

# Start server
python app.py
# 🚀 Server running at http://localhost:8000
```

📡 API Reference

Health Check

```http
GET /mcp/health
```

Process Single Query

```http
POST /mcp/process
Content-Type: application/json
X-MCP-Auth: your-token
X-MCP-Version: 1.0

{
  "query": "How do I reset my password?",
  "priority": "high"
}
```

Batch Processing

```http
POST /mcp/batch
Content-Type: application/json
X-MCP-Auth: your-token

{
  "queries": [
    "How do I reset my password?",
    "What are your business hours?"
  ]
}
```

Success Response

```json
{
  "status": "success",
  "data": {
    "response": "Generated AI response",
    "confidence": 0.95,
    "processing_time": "120ms"
  },
  "meta": {
    "request_id": "req_123456",
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```

Error Response

```json
{
  "code": "RATE_LIMIT_EXCEEDED",
  "message": "Rate limit exceeded",
  "details": {
    "retry_after": 60,
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```
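A hypothetical client sketch using only the standard library; it assumes the server from the Quick Start is listening on localhost:8000 and that `your-token` is a valid MCP auth token:

```python
import json
import urllib.request

def build_process_request(query: str, priority: str = "high") -> urllib.request.Request:
    """Build the POST /mcp/process request shown above."""
    payload = json.dumps({"query": query, "priority": priority}).encode()
    return urllib.request.Request(
        "http://localhost:8000/mcp/process",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "X-MCP-Auth": "your-token",
            "X-MCP-Version": "1.0",
        },
        method="POST",
    )

def process_query(query: str, priority: str = "high") -> dict:
    """Send the request and decode the JSON success/error body."""
    with urllib.request.urlopen(build_process_request(query, priority)) as resp:
        return json.load(resp)
```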

πŸ—οΈ Architecture

Project Structure

```
📦 AI-Customer-Support-Bot--MCP-Server
├── 🚀 app.py              # FastAPI application
├── 🗄️ database.py          # Database configuration
├── 🛡️ middleware.py        # Auth & rate limiting
├── 📋 models.py            # ORM models
├── ⚙️ mcp_config.py        # MCP protocol config
├── 📄 requirements.txt     # Dependencies
└── 📝 .env.example         # Environment template
```

Layer Responsibilities

| Layer | Purpose | Components |
|-------|---------|------------|
| API | HTTP endpoints, validation | FastAPI routes, Pydantic models |
| Middleware | Auth, rate limiting, logging | Token validation, request throttling |
| Service | Business logic, AI integration | Context management, AI orchestration |
| Data | Persistence, models | PostgreSQL, SQLAlchemy ORM |
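The API layer's validation can be sketched like this. The project itself uses Pydantic models; a stdlib dataclass approximates one here so the sketch stays dependency-free, and the set of accepted priorities is an assumption:

```python
from dataclasses import dataclass

# Assumed priority values; the real model may differ
VALID_PRIORITIES = {"low", "normal", "high"}

@dataclass
class ProcessRequest:
    query: str
    priority: str = "normal"

    def __post_init__(self) -> None:
        # Reject payloads the API layer should never pass downstream
        if not self.query.strip():
            raise ValueError("query must be a non-empty string")
        if self.priority not in VALID_PRIORITIES:
            raise ValueError("priority must be one of %s" % sorted(VALID_PRIORITIES))
```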

🔌 Extending with AI Services

Add Your AI Provider

  1. Install your AI SDK:

```bash
pip install openai  # or anthropic, cohere, etc.
```

  2. Configure environment:

```bash
# Add to .env
AI_SERVICE_API_KEY=sk-your-api-key
AI_SERVICE_MODEL=gpt-4
```

  3. Implement service integration:

```python
# In service layer
class AIService:
    async def generate_response(self, query: str, context: dict) -> str:
        ai_response = ...  # your AI integration here
        return ai_response
```
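One way to flesh out that stub, as a hypothetical sketch: inject the provider call so any SDK (OpenAI, Anthropic, Cohere, ...) can sit behind the same interface. The `history` context key and the names below are illustrative, not part of the project's API:

```python
import asyncio
from typing import Awaitable, Callable

class AIService:
    def __init__(self, provider: Callable[[str], Awaitable[str]]):
        # provider: async callable that maps a prompt to a model reply
        self._provider = provider

    async def generate_response(self, query: str, context: dict) -> str:
        # Fold any conversation history from the context into the prompt
        prompt = "\n".join(filter(None, [context.get("history", ""), query]))
        return await self._provider(prompt)

# Dummy provider standing in for a real SDK call
async def echo_provider(prompt: str) -> str:
    return "echo: " + prompt

service = AIService(echo_provider)
answer = asyncio.run(service.generate_response("How do I reset my password?", {}))
```

Swapping in a real provider only means replacing `echo_provider` with a coroutine that calls your SDK.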

🔧 Development

Running Tests

```bash
pytest tests/
```

Code Quality

```bash
# Format code
black .

# Lint
flake8

# Type checking
mypy .
```

Docker Support

```bash
# Coming soon - Docker containerization
```

📊 Monitoring & Observability

Health Metrics

  • βœ… Service uptime

  • πŸ”— Database connectivity

  • πŸ“ˆ Request rates

  • ⏱️ Response times

  • πŸ’Ύ Memory usage

Logging

Structured logging is included:

```json
{
  "timestamp": "2024-02-14T12:00:00Z",
  "level": "INFO",
  "message": "Query processed",
  "request_id": "req_123456",
  "processing_time": 120
}
```
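A standard-library sketch of that log format; the field names mirror the example above, while the formatter class itself is an assumption rather than the project's actual implementation:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        # Emit one JSON object per log line, matching the example fields
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
            "request_id": getattr(record, "request_id", None),
            "processing_time": getattr(record, "processing_time", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("mcp")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# The extra= dict supplies the custom fields the formatter reads back
logger.info("Query processed", extra={"request_id": "req_123456", "processing_time": 120})
```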

🔒 Security

Built-in Security Features

  • πŸ” Token Authentication - Secure API access

  • πŸ›‘οΈ Rate Limiting - DoS protection

  • βœ… Input Validation - SQL injection prevention

  • πŸ“ Audit Logging - Request tracking

  • πŸ”’ Environment Secrets - Secure config management

🚀 Deployment

Environment Setup

```bash
# Production environment variables
DATABASE_URL=postgresql://prod-user:password@prod-host/db
RATE_LIMIT_REQUESTS=1000
LOG_LEVEL=WARNING
```

Scaling Considerations

  • Use connection pooling for database

  • Implement Redis for rate limiting in multi-instance setups

  • Add load balancer for high availability

  • Monitor with Prometheus/Grafana

🤝 Contributing

We love contributions! Here's how to get started:

Development Setup

```bash
# Fork the repo, then:
git clone https://github.com/your-username/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server

# Create feature branch
git checkout -b feature/amazing-feature

# Make your changes
# ...

# Test your changes
pytest

# Submit PR
```

Contribution Guidelines

  • πŸ“ Write tests for new features

  • πŸ“š Update documentation

  • 🎨 Follow existing code style

  • βœ… Ensure CI passes

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Built with ❤️ by ChiragPatankar

⭐ Star this repo if you find it helpful! ⭐

Report Bug • Request Feature • Documentation
