# 🤖 AI Customer Support Bot - MCP Server
A modern, extensible MCP server framework for building AI-powered customer support systems
Features • Quick Start • API Reference • Architecture • Contributing
## 📋 Overview
A Model Context Protocol (MCP) compliant server framework built with modern Python. Designed for developers who want to create intelligent customer support systems without vendor lock-in. Clean architecture, battle-tested patterns, and ready for any AI provider.
```mermaid
graph TB
    Client[HTTP Client] --> API[API Server]
    API --> MW[Middleware Layer]
    MW --> SVC[Service Layer]
    SVC --> CTX[Context Manager]
    SVC --> AI[AI Integration]
    SVC --> DAL[Data Access Layer]
    DAL --> DB[(PostgreSQL)]
```
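The flow in the diagram can be sketched as plain functions (a toy illustration with hypothetical names, not the project's actual code):

```python
# Toy sketch of the request flow from the diagram above.
# All names and the token value are illustrative, not the project's real API.

def middleware(request: dict) -> dict:
    # Middleware layer: authenticate the request before it reaches a service
    if request.get("token") != "your-token":
        raise PermissionError("invalid token")
    return request

def data_access(query: str) -> dict:
    # Data access layer: would query PostgreSQL in the real server
    return {"context": f"history for: {query}"}

def service(request: dict) -> dict:
    # Service layer: combine stored context with (stubbed) AI integration
    ctx = data_access(request["query"])
    return {"response": f"AI answer using {ctx['context']}"}

def api_server(request: dict) -> dict:
    # API server: entry point that chains the layers together
    return service(middleware(request))

print(api_server({"query": "reset password", "token": "your-token"}))
```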
## ✨ Features
- **🏗️ Clean Architecture**: Layered design with clear separation of concerns
- **📡 MCP Compliant**: Full Model Context Protocol implementation
- **🚀 Production Ready**: Auth, rate limiting, monitoring included
- **⚡ High Performance**: Built on FastAPI with async support
- **🔌 AI Agnostic**: Integrate any AI provider easily
- **📊 Health Monitoring**: Comprehensive metrics and diagnostics
- **🛡️ Secure by Default**: Token auth and input validation
- **📦 Batch Processing**: Handle multiple queries efficiently
## 🚀 Quick Start
### Prerequisites

- Python 3.8+
- PostgreSQL
- Your favorite AI service (OpenAI, Anthropic, etc.)
### Installation
```bash
# Clone and setup
git clone https://github.com/ChiragPatankar/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server

# Create virtual environment
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Setup environment
cp .env.example .env
# Edit .env with your configuration
```

### Configuration
```env
# .env file
DATABASE_URL=postgresql://user:password@localhost/customer_support_bot
SECRET_KEY=your-super-secret-key
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_PERIOD=60
```

### Run
```bash
# Setup database
createdb customer_support_bot

# Start server
python app.py
# 🚀 Server running at http://localhost:8000
```

## 📡 API Reference
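As a quick smoke test of the endpoints below, a request can be built with only the standard library (a hedged sketch: paths and header names come from this reference, while the base URL and token are placeholders from the Quick Start):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # placeholder from the Quick Start above

def build_process_request(query: str, token: str, priority: str = "normal") -> urllib.request.Request:
    """Build a POST /mcp/process request with the headers from the API reference."""
    body = json.dumps({"query": query, "priority": priority}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/mcp/process",
        data=body,
        headers={
            "Content-Type": "application/json",
            "X-MCP-Auth": token,
            "X-MCP-Version": "1.0",
        },
        method="POST",
    )

# With the server running:
# with urllib.request.urlopen(build_process_request("How do I reset my password?", "your-token")) as resp:
#     print(json.load(resp))
```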
### Health Check
```http
GET /mcp/health
```

### Process Single Query
```http
POST /mcp/process
Content-Type: application/json
X-MCP-Auth: your-token
X-MCP-Version: 1.0

{
  "query": "How do I reset my password?",
  "priority": "high"
}
```

### Batch Processing
```http
POST /mcp/batch
Content-Type: application/json
X-MCP-Auth: your-token

{
  "queries": [
    "How do I reset my password?",
    "What are your business hours?"
  ]
}
```

### Success Response
```json
{
  "status": "success",
  "data": {
    "response": "Generated AI response",
    "confidence": 0.95,
    "processing_time": "120ms"
  },
  "meta": {
    "request_id": "req_123456",
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```

### Error Response
```json
{
  "code": "RATE_LIMIT_EXCEEDED",
  "message": "Rate limit exceeded",
  "details": {
    "retry_after": 60,
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```

## 🏗️ Architecture
### Project Structure
```
📦 AI-Customer-Support-Bot--MCP-Server
├── 🚀 app.py            # FastAPI application
├── 🗄️ database.py       # Database configuration
├── 🛡️ middleware.py     # Auth & rate limiting
├── 📊 models.py         # ORM models
├── ⚙️ mcp_config.py     # MCP protocol config
├── 📋 requirements.txt  # Dependencies
└── 📄 .env.example      # Environment template
```

### Layer Responsibilities
| Layer | Purpose | Components |
|-------|---------|------------|
| API | HTTP endpoints, validation | FastAPI routes, Pydantic models |
| Middleware | Auth, rate limiting, logging | Token validation, request throttling |
| Service | Business logic, AI integration | Context management, AI orchestration |
| Data | Persistence, models | PostgreSQL, SQLAlchemy ORM |
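To illustrate the API layer's validation duty, here is a minimal stand-in using a stdlib dataclass (the real project uses Pydantic models, as the table notes; the field names mirror the `/mcp/process` payload):

```python
from dataclasses import dataclass

VALID_PRIORITIES = {"low", "normal", "high"}

@dataclass
class QueryRequest:
    """Validated request body for /mcp/process (illustrative stand-in for a Pydantic model)."""
    query: str
    priority: str = "normal"

    def __post_init__(self) -> None:
        if not self.query.strip():
            raise ValueError("query must be non-empty")
        if self.priority not in VALID_PRIORITIES:
            raise ValueError(f"priority must be one of {sorted(VALID_PRIORITIES)}")
```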
## 🔌 Extending with AI Services
### Add Your AI Provider
1. Install your AI SDK:

```bash
pip install openai  # or anthropic, cohere, etc.
```

2. Configure environment:

```env
# Add to .env
AI_SERVICE_API_KEY=sk-your-api-key
AI_SERVICE_MODEL=gpt-4
```

3. Implement the service integration:
```python
# In the service layer
class AIService:
    async def generate_response(self, query: str, context: dict) -> str:
        # Call your AI provider here and return its text output
        ai_response = ...  # e.g. await client.complete(query, context)
        return ai_response
```

## 🔧 Development
### Running Tests
```bash
pytest tests/
```

### Code Quality
```bash
# Format code
black .

# Lint
flake8

# Type checking
mypy .
```

### Docker Support

Coming soon: Docker containerization.

## 📊 Monitoring & Observability
### Health Metrics
- ✅ Service uptime
- 🗄️ Database connectivity
- 📈 Request rates
- ⏱️ Response times
- 💾 Memory usage
### Logging
Structured logging is included; each request produces a JSON record:

```json
{
  "timestamp": "2024-02-14T12:00:00Z",
  "level": "INFO",
  "message": "Query processed",
  "request_id": "req_123456",
  "processing_time": 120
}
```

## 🔒 Security
### Built-in Security Features
- 🔐 **Token Authentication** - Secure API access
- 🛡️ **Rate Limiting** - DoS protection
- ✅ **Input Validation** - SQL injection prevention
- 📝 **Audit Logging** - Request tracking
- 🔒 **Environment Secrets** - Secure config management
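A minimal sketch of how the token check might look (a hypothetical helper, not the project's actual `middleware.py`; the token value is a placeholder loaded from the environment in practice):

```python
import hmac

EXPECTED_TOKEN = "your-super-secret-key"  # placeholder; load from the environment in practice

def verify_mcp_auth(headers: dict) -> bool:
    """Compare the X-MCP-Auth header against the configured token in constant time."""
    supplied = headers.get("X-MCP-Auth", "")
    return hmac.compare_digest(supplied, EXPECTED_TOKEN)
```

`hmac.compare_digest` avoids early-exit string comparison, which would otherwise leak the token's matching prefix length through response timing.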
## 🚀 Deployment
### Environment Setup
```env
# Production environment variables
DATABASE_URL=postgresql://prod-user:password@prod-host/db
RATE_LIMIT_REQUESTS=1000
LOG_LEVEL=WARNING
```

### Scaling Considerations
- Use connection pooling for the database
- Implement Redis for rate limiting in multi-instance setups
- Add a load balancer for high availability
- Monitor with Prometheus/Grafana
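The rate-limiting point above can be illustrated with a sliding-window limiter (a single-process sketch; in multi-instance deployments the per-client timestamp store would live in Redis so that all instances share one window):

```python
import time
from collections import defaultdict, deque
from typing import Optional

class SlidingWindowLimiter:
    """Allow at most `limit` requests per `period` seconds, per client."""

    def __init__(self, limit: int = 100, period: float = 60.0):
        self.limit = limit
        self.period = period
        self.hits = defaultdict(deque)  # client_id -> timestamps of recent requests

    def allow(self, client_id: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        window = self.hits[client_id]
        # Drop timestamps that have fallen out of the window
        while window and now - window[0] >= self.period:
            window.popleft()
        if len(window) >= self.limit:
            return False
        window.append(now)
        return True
```

The `limit` and `period` defaults mirror the `RATE_LIMIT_REQUESTS` and `RATE_LIMIT_PERIOD` settings from the configuration section.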
## 🤝 Contributing
We love contributions! Here's how to get started:
### Development Setup
```bash
# Fork the repo, then:
git clone https://github.com/your-username/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server

# Create feature branch
git checkout -b feature/amazing-feature

# Make your changes
# ...

# Test your changes
pytest

# Submit PR
```

### Contribution Guidelines
- 📝 Write tests for new features
- 📚 Update documentation
- 🎨 Follow existing code style
- ✅ Ensure CI passes
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ by
⭐ Star this repo if you find it helpful! ⭐
Report Bug • Request Feature • Documentation