
🤖 AI Customer Support Bot - MCP Server


A modern, extensible MCP server framework for building AI-powered customer support systems

Features · Quick Start · API Reference · Architecture · Contributing


🌟 Overview

A Model Context Protocol (MCP) compliant server framework built with modern Python. Designed for developers who want to create intelligent customer support systems without vendor lock-in. Clean architecture, battle-tested patterns, and ready for any AI provider.

✨ Features

🏗️ Clean Architecture
Layered design with clear separation of concerns

📡 MCP Compliant
Full Model Context Protocol implementation

🔒 Production Ready
Auth, rate limiting, monitoring included

🚀 High Performance
Built on FastAPI with async support

🔌 AI Agnostic
Integrate any AI provider easily

📊 Health Monitoring
Comprehensive metrics and diagnostics

🛡️ Secure by Default
Token auth and input validation

📦 Batch Processing
Handle multiple queries efficiently

🚀 Quick Start

Prerequisites

  • Python 3.8+
  • PostgreSQL
  • Your favorite AI service (OpenAI, Anthropic, etc.)

Installation

```bash
# Clone and set up
git clone https://github.com/ChiragPatankar/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server

# Create virtual environment
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Set up environment
cp .env.example .env
# Edit .env with your configuration
```

Configuration

```bash
# .env file
DATABASE_URL=postgresql://user:password@localhost/customer_support_bot
SECRET_KEY=your-super-secret-key
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_PERIOD=60
```

Run

```bash
# Set up the database
createdb customer_support_bot

# Start the server
python app.py
# 🚀 Server running at http://localhost:8000
```

📡 API Reference

Health Check

```http
GET /mcp/health
```

Process Single Query

```http
POST /mcp/process
Content-Type: application/json
X-MCP-Auth: your-token
X-MCP-Version: 1.0

{
  "query": "How do I reset my password?",
  "priority": "high"
}
```

Batch Processing

```http
POST /mcp/batch
Content-Type: application/json
X-MCP-Auth: your-token

{
  "queries": [
    "How do I reset my password?",
    "What are your business hours?"
  ]
}
```

Success Response

```json
{
  "status": "success",
  "data": {
    "response": "Generated AI response",
    "confidence": 0.95,
    "processing_time": "120ms"
  },
  "meta": {
    "request_id": "req_123456",
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```

Error Response

```json
{
  "code": "RATE_LIMIT_EXCEEDED",
  "message": "Rate limit exceeded",
  "details": {
    "retry_after": 60,
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```
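Both envelopes follow a consistent shape, so building them can be centralized. A minimal sketch of helper functions that produce these shapes (the field names mirror the examples above; the `request_id` format and helper names are illustrative, not the server's actual internals):

```python
import uuid
from datetime import datetime, timezone


def _meta() -> dict:
    """Metadata attached to every response (request_id scheme is illustrative)."""
    return {
        "request_id": f"req_{uuid.uuid4().hex[:6]}",
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
    }


def success_response(response: str, confidence: float, processing_ms: int) -> dict:
    """Build the success envelope shown above."""
    return {
        "status": "success",
        "data": {
            "response": response,
            "confidence": confidence,
            "processing_time": f"{processing_ms}ms",
        },
        "meta": _meta(),
    }


def error_response(code: str, message: str, **details) -> dict:
    """Build the error envelope shown above."""
    return {"code": code, "message": message, "details": {**details, **_meta()}}


body = success_response("Generated AI response", 0.95, 120)
err = error_response("RATE_LIMIT_EXCEEDED", "Rate limit exceeded", retry_after=60)
```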

🏗️ Architecture

Project Structure

```
📦 AI-Customer-Support-Bot--MCP-Server
├── 🚀 app.py              # FastAPI application
├── 🗄️ database.py         # Database configuration
├── 🛡️ middleware.py       # Auth & rate limiting
├── 📋 models.py           # ORM models
├── ⚙️ mcp_config.py       # MCP protocol config
├── 📄 requirements.txt    # Dependencies
└── 📝 .env.example        # Environment template
```

Layer Responsibilities

| Layer | Purpose | Components |
| --- | --- | --- |
| API | HTTP endpoints, validation | FastAPI routes, Pydantic models |
| Middleware | Auth, rate limiting, logging | Token validation, request throttling |
| Service | Business logic, AI integration | Context management, AI orchestration |
| Data | Persistence, models | PostgreSQL, SQLAlchemy ORM |
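In miniature, a request travels down through these layers in order. A pure-Python sketch of that call path (the function names, token value, and stand-in knowledge base are illustrative; the real components are FastAPI routes, middleware, and SQLAlchemy models):

```python
# Data layer: persistence (stand-in for PostgreSQL/SQLAlchemy).
_KB = {"How do I reset my password?": "Use the 'Forgot password' link on the login page."}


def fetch_context(query: str) -> dict:
    return {"kb_answer": _KB.get(query)}


# Service layer: business logic and AI orchestration.
def handle_query(query: str) -> dict:
    context = fetch_context(query)
    answer = context["kb_answer"] or "Escalating to the AI provider..."
    return {"response": answer, "confidence": 0.95}


# Middleware layer: auth check before the service runs.
def authenticated(token: str) -> bool:
    return token == "your-token"  # illustrative; the real check validates X-MCP-Auth


# API layer: the endpoint ties the layers together.
def process_endpoint(query: str, token: str) -> dict:
    if not authenticated(token):
        return {"code": "UNAUTHORIZED", "message": "Invalid token"}
    return {"status": "success", "data": handle_query(query)}


result = process_endpoint("How do I reset my password?", "your-token")
```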

🔌 Extending with AI Services

Add Your AI Provider

1. Install your AI SDK:

```bash
pip install openai  # or anthropic, cohere, etc.
```

2. Configure environment:

```bash
# Add to .env
AI_SERVICE_API_KEY=sk-your-api-key
AI_SERVICE_MODEL=gpt-4
```

3. Implement service integration:

```python
# In the service layer
class AIService:
    async def generate_response(self, query: str, context: dict) -> str:
        # Your AI integration here
        return ai_response
```
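To keep the framework AI-agnostic, the service can be defined as an abstract interface with one implementation per provider. A runnable sketch of that pattern, using a mock provider so no API key is needed (the class and method names beyond `AIService.generate_response` are assumptions):

```python
import asyncio
from abc import ABC, abstractmethod


class AIService(ABC):
    """Provider-agnostic interface; implement once per provider (OpenAI, Anthropic, ...)."""

    @abstractmethod
    async def generate_response(self, query: str, context: dict) -> str: ...


class EchoAIService(AIService):
    """Stand-in provider so this sketch runs without credentials."""

    async def generate_response(self, query: str, context: dict) -> str:
        model = context.get("model", "mock")
        return f"[{model}] answer to: {query}"


async def main() -> str:
    # Swap EchoAIService for a real provider implementation in production.
    service: AIService = EchoAIService()
    return await service.generate_response("How do I reset my password?", {"model": "gpt-4"})


answer = asyncio.run(main())
```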

🔧 Development

Running Tests

```bash
pytest tests/
```

Code Quality

```bash
# Format code
black .

# Lint
flake8

# Type checking
mypy .
```

Docker Support

```bash
# Coming soon - Docker containerization
```

📊 Monitoring & Observability

Health Metrics

  • ✅ Service uptime
  • 🔗 Database connectivity
  • 📈 Request rates
  • ⏱️ Response times
  • 💾 Memory usage
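A health endpoint reporting these metrics might assemble its payload roughly as follows (the field names and status values are illustrative, not the server's actual schema):

```python
import time

START_TIME = time.monotonic()  # captured once at process start


def health_payload(db_ok: bool) -> dict:
    """Assemble an illustrative health-check response."""
    return {
        "status": "healthy" if db_ok else "degraded",
        "uptime_seconds": round(time.monotonic() - START_TIME, 2),
        "database": "connected" if db_ok else "unreachable",
    }


payload = health_payload(db_ok=True)
```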

Logging

Structured logging is included:

```json
{
  "timestamp": "2024-02-14T12:00:00Z",
  "level": "INFO",
  "message": "Query processed",
  "request_id": "req_123456",
  "processing_time": 120
}
```
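Records in this shape can be produced with the standard library alone. A sketch using a custom `logging.Formatter` (the formatter class and logger name are assumptions, not the project's actual logging setup):

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object, matching the shape above."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "timestamp": self.formatTime(record, "%Y-%m-%dT%H:%M:%SZ"),
            "level": record.levelname,
            "message": record.getMessage(),
            "request_id": getattr(record, "request_id", None),
            "processing_time": getattr(record, "processing_time", None),
        })


# Attach the formatter during app startup.
logger = logging.getLogger("mcp")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("Query processed", extra={"request_id": "req_123456", "processing_time": 120})

# Render one record directly to inspect the output shape.
sample = json.loads(JsonFormatter().format(logging.makeLogRecord(
    {"levelname": "INFO", "msg": "Query processed",
     "request_id": "req_123456", "processing_time": 120}
)))
```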

🔒 Security

Built-in Security Features

  • 🔐 Token Authentication - Secure API access
  • 🛡️ Rate Limiting - DoS protection
  • ✅ Input Validation - SQL injection prevention
  • 📝 Audit Logging - Request tracking
  • 🔒 Environment Secrets - Secure config management
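The rate limiting configured by `RATE_LIMIT_REQUESTS` / `RATE_LIMIT_PERIOD` can be pictured as a per-client sliding window. A simplified in-memory sketch (the class name and `allow` API are assumptions; the server's actual middleware may differ, and a multi-instance deployment would back this with Redis as noted below):

```python
from __future__ import annotations

import time
from collections import defaultdict, deque


class SlidingWindowRateLimiter:
    """Allow at most `limit` requests per `period` seconds, per client token."""

    def __init__(self, limit: int = 100, period: float = 60.0):
        self.limit = limit
        self.period = period
        self._hits: dict[str, deque] = defaultdict(deque)

    def allow(self, client: str, now: float | None = None) -> bool:
        now = time.monotonic() if now is None else now
        window = self._hits[client]
        # Drop timestamps that have aged out of the window.
        while window and now - window[0] >= self.period:
            window.popleft()
        if len(window) >= self.limit:
            return False  # caller should respond 429 with a retry_after hint
        window.append(now)
        return True


limiter = SlidingWindowRateLimiter(limit=2, period=60.0)
# Two requests pass, the third in-window request is rejected,
# and a request after the window expires passes again.
results = [limiter.allow("token-a", now=t) for t in (0.0, 1.0, 2.0, 61.0)]
```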

🚀 Deployment

Environment Setup

```bash
# Production environment variables
DATABASE_URL=postgresql://prod-user:password@prod-host/db
RATE_LIMIT_REQUESTS=1000
LOG_LEVEL=WARNING
```

Scaling Considerations

  • Use connection pooling for database
  • Implement Redis for rate limiting in multi-instance setups
  • Add load balancer for high availability
  • Monitor with Prometheus/Grafana

🤝 Contributing

We love contributions! Here's how to get started:

Development Setup

```bash
# Fork the repo, then:
git clone https://github.com/your-username/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server

# Create feature branch
git checkout -b feature/amazing-feature

# Make your changes
# ...

# Test your changes
pytest

# Submit PR
```

Contribution Guidelines

  • 📝 Write tests for new features
  • 📚 Update documentation
  • 🎨 Follow existing code style
  • ✅ Ensure CI passes

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Built with ❤️ by Chirag Patankar

Star this repo if you find it helpful!

Report Bug · Request Feature · Documentation

