# AI Customer Support Bot - MCP Server
<div align="center">





*A modern, extensible MCP server framework for building AI-powered customer support systems*
[Features](#features) • [Quick Start](#quick-start) • [API Reference](#api-reference) • [Architecture](#architecture) • [Contributing](#contributing)
</div>
---
## Overview
A **Model Context Protocol (MCP)** compliant server framework built with modern Python, designed for developers who want to create intelligent customer support systems without vendor lock-in. It combines a clean, layered architecture with battle-tested patterns and works with any AI provider.
```mermaid
graph TB
Client[HTTP Client] --> API[API Server]
API --> MW[Middleware Layer]
MW --> SVC[Service Layer]
SVC --> CTX[Context Manager]
SVC --> AI[AI Integration]
SVC --> DAL[Data Access Layer]
DAL --> DB[(PostgreSQL)]
```
## Features
<table>
<tr>
<td>

**Clean Architecture**<br>
Layered design with clear separation of concerns

**MCP Compliant**<br>
Full Model Context Protocol implementation

</td>
<td>

**Production Ready**<br>
Auth, rate limiting, monitoring included

**High Performance**<br>
Built on FastAPI with async support

</td>
</tr>
<tr>
<td>

**AI Agnostic**<br>
Integrate any AI provider easily

**Health Monitoring**<br>
Comprehensive metrics and diagnostics

</td>
<td>

**Secure by Default**<br>
Token auth and input validation

**Batch Processing**<br>
Handle multiple queries efficiently

</td>
</tr>
</table>
## Quick Start
### Prerequisites
- Python 3.8+
- PostgreSQL
- Your favorite AI service (OpenAI, Anthropic, etc.)
### Installation
```bash
# Clone and setup
git clone https://github.com/ChiragPatankar/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server
# Create virtual environment
python -m venv venv
source venv/bin/activate # Windows: venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
# Setup environment
cp .env.example .env
# Edit .env with your configuration
```
### Configuration
```bash
# .env file
DATABASE_URL=postgresql://user:password@localhost/customer_support_bot
SECRET_KEY=your-super-secret-key
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_PERIOD=60
```
### Run
```bash
# Setup database
createdb customer_support_bot
# Start server
python app.py
# Server running at http://localhost:8000
```
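
Once the server is up, a quick smoke test against the health endpoint (documented in the API reference below) might look like this; `httpx` is just one choice of HTTP client and is not necessarily bundled with the project:

```python
# Quick check that the server answers on the health endpoint.
import httpx

print(httpx.get("http://localhost:8000/mcp/health").json())
```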
## API Reference
<details>
<summary><strong>Core Endpoints</strong></summary>

### Health Check
```http
GET /mcp/health
```
### Process Single Query
```http
POST /mcp/process
Content-Type: application/json
X-MCP-Auth: your-token
X-MCP-Version: 1.0

{
  "query": "How do I reset my password?",
  "priority": "high"
}
```
### Batch Processing
```http
POST /mcp/batch
Content-Type: application/json
X-MCP-Auth: your-token

{
  "queries": [
    "How do I reset my password?",
    "What are your business hours?"
  ]
}
```
</details>
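
For reference, calling the processing endpoint from Python could look like the following; the token value is a placeholder and `httpx` is only one possible client:

```python
# Example client call for POST /mcp/process; the auth token is a placeholder.
import httpx

response = httpx.post(
    "http://localhost:8000/mcp/process",
    headers={"X-MCP-Auth": "your-token", "X-MCP-Version": "1.0"},
    json={"query": "How do I reset my password?", "priority": "high"},
)
response.raise_for_status()
print(response.json())
```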
<details>
<summary><strong>Response Format</strong></summary>

### Success Response
```json
{
  "status": "success",
  "data": {
    "response": "Generated AI response",
    "confidence": 0.95,
    "processing_time": "120ms"
  },
  "meta": {
    "request_id": "req_123456",
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```
### Error Response
```json
{
  "code": "RATE_LIMIT_EXCEEDED",
  "message": "Rate limit exceeded",
  "details": {
    "retry_after": 60,
    "timestamp": "2024-02-14T12:00:00Z"
  }
}
```
</details>
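
The success envelope maps naturally onto Pydantic models. The classes below are purely illustrative (they mirror the JSON fields above) and are not the models defined in `models.py`:

```python
# Illustrative Pydantic models for the success envelope shown above.
from pydantic import BaseModel

class ResponseData(BaseModel):
    response: str
    confidence: float
    processing_time: str

class ResponseMeta(BaseModel):
    request_id: str
    timestamp: str

class SuccessResponse(BaseModel):
    status: str
    data: ResponseData
    meta: ResponseMeta
```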
## Architecture
### Project Structure
```
AI-Customer-Support-Bot--MCP-Server
├── app.py              # FastAPI application
├── database.py         # Database configuration
├── middleware.py       # Auth & rate limiting
├── models.py           # ORM models
├── mcp_config.py       # MCP protocol config
├── requirements.txt    # Dependencies
└── .env.example        # Environment template
```
### Layer Responsibilities
| Layer | Purpose | Components |
|-------|---------|------------|
| **API** | HTTP endpoints, validation | FastAPI routes, Pydantic models |
| **Middleware** | Auth, rate limiting, logging | Token validation, request throttling |
| **Service** | Business logic, AI integration | Context management, AI orchestration |
| **Data** | Persistence, models | PostgreSQL, SQLAlchemy ORM |
## Extending with AI Services
### Add Your AI Provider
1. **Install your AI SDK:**
```bash
pip install openai # or anthropic, cohere, etc.
```
2. **Configure environment:**
```bash
# Add to .env
AI_SERVICE_API_KEY=sk-your-api-key
AI_SERVICE_MODEL=gpt-4
```
3. **Implement service integration:**
```python
# In the service layer
class AIService:
    async def generate_response(self, query: str, context: dict) -> str:
        # Call your AI provider here and return its reply
        ...
```
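
As a concrete illustration, a provider integration built on the `openai` v1 SDK might look like the sketch below. The class, model name, and prompt wiring are assumptions for the example, not code from this repository:

```python
# Sketch of an AI provider integration using the openai (v1.x) SDK.
import os
from openai import AsyncOpenAI

class OpenAIService:
    def __init__(self) -> None:
        self.client = AsyncOpenAI(api_key=os.environ["AI_SERVICE_API_KEY"])
        self.model = os.environ.get("AI_SERVICE_MODEL", "gpt-4")

    async def generate_response(self, query: str, context: dict) -> str:
        completion = await self.client.chat.completions.create(
            model=self.model,
            messages=[
                {"role": "system", "content": "You are a helpful customer support agent."},
                {"role": "user", "content": f"Context: {context}\n\nQuestion: {query}"},
            ],
        )
        return completion.choices[0].message.content or ""
```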
## Development
### Running Tests
```bash
pytest tests/
```
### Code Quality
```bash
# Format code
black .
# Lint
flake8
# Type checking
mypy .
```
### Docker Support
```dockerfile
# Coming soon - Docker containerization
```
## Monitoring & Observability
### Health Metrics
- Service uptime
- Database connectivity
- Request rates
- Response times
- Memory usage
### Logging
Structured logging is included; a log record looks like this:

```json
{
  "timestamp": "2024-02-14T12:00:00Z",
  "level": "INFO",
  "message": "Query processed",
  "request_id": "req_123456",
  "processing_time": 120
}
```
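
One way to emit records in this shape with the standard library is sketched below; this is illustrative only and not the project's actual logging setup:

```python
# Minimal JSON log formatter using only the standard library.
import json
import logging
from datetime import datetime, timezone

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "level": record.levelname,
            "message": record.getMessage(),
            "request_id": getattr(record, "request_id", None),
            "processing_time": getattr(record, "processing_time", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("mcp")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("Query processed", extra={"request_id": "req_123456", "processing_time": 120})
```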
## Security
### Built-in Security Features
- **Token Authentication** - Secure API access
- **Rate Limiting** - DoS protection
- **Input Validation** - SQL injection prevention
- **Audit Logging** - Request tracking
- **Environment Secrets** - Secure config management
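
To illustrate how the `X-MCP-Auth` header check could be wired up, here is a FastAPI middleware sketch. The environment variable name and error payload are assumptions for the example, not the code in `middleware.py`:

```python
# Sketch of header-based token authentication as FastAPI middleware.
import os
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()

@app.middleware("http")
async def mcp_auth(request: Request, call_next):
    # Protect /mcp endpoints except the public health check.
    if request.url.path.startswith("/mcp") and request.url.path != "/mcp/health":
        token = request.headers.get("X-MCP-Auth")
        if token != os.environ.get("MCP_AUTH_TOKEN"):  # hypothetical env var
            return JSONResponse(
                status_code=401,
                content={"code": "UNAUTHORIZED", "message": "Invalid or missing X-MCP-Auth token"},
            )
    return await call_next(request)
```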
## Deployment
### Environment Setup
```bash
# Production environment variables
DATABASE_URL=postgresql://prod-user:password@prod-host/db
RATE_LIMIT_REQUESTS=1000
LOG_LEVEL=WARNING
```
### Scaling Considerations
- Use connection pooling for the database (see the sketch after this list)
- Implement Redis for rate limiting in multi-instance setups
- Add load balancer for high availability
- Monitor with Prometheus/Grafana
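
For the connection-pooling point above, a SQLAlchemy engine configured with an explicit pool might look like this; the pool sizes are illustrative, not values taken from `database.py`:

```python
# Sketch of SQLAlchemy connection pooling for production deployments.
import os
from sqlalchemy import create_engine

engine = create_engine(
    os.environ["DATABASE_URL"],
    pool_size=10,        # persistent connections kept open
    max_overflow=20,     # extra connections allowed under burst load
    pool_pre_ping=True,  # drop dead connections before use
    pool_recycle=1800,   # recycle connections every 30 minutes
)
```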
## Contributing
We love contributions! Here's how to get started:
### Development Setup
```bash
# Fork the repo, then:
git clone https://github.com/your-username/AI-Customer-Support-Bot--MCP-Server.git
cd AI-Customer-Support-Bot--MCP-Server
# Create feature branch
git checkout -b feature/amazing-feature
# Make your changes
# ...
# Test your changes
pytest
# Submit PR
```
### Contribution Guidelines
- Write tests for new features
- Update documentation
- Follow existing code style
- Ensure CI passes
## License
This project is licensed under the **MIT License** - see the [LICENSE](LICENSE) file for details.
---
<div align="center">
**Built with ❤️ by [Chirag Patankar](https://github.com/ChiragPatankar)**
⭐ **Star this repo if you find it helpful!** ⭐
[Report Bug](https://github.com/ChiragPatankar/AI-Customer-Support-Bot--MCP-Server/issues) • [Request Feature](https://github.com/ChiragPatankar/AI-Customer-Support-Bot--MCP-Server/issues) • [Documentation](https://github.com/ChiragPatankar/AI-Customer-Support-Bot--MCP-Server/wiki)
</div>