AI Customer Support Bot - MCP Server
A Model Context Protocol (MCP) server that provides AI-powered customer support using Cursor AI and Glama.ai integration.
Features
Real-time context fetching from Glama.ai
AI-powered response generation with Cursor AI
Batch processing support
Priority queuing
Rate limiting
User interaction tracking
Health monitoring
MCP protocol compliance
Prerequisites
Python 3.8+
PostgreSQL database
Glama.ai API key
Cursor AI API key
Installation
Clone the repository:
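For example (the URL below is a placeholder for the actual repository URL):

```bash
git clone <repository-url>
cd <repository-directory>
```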
Create and activate a virtual environment:
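On Linux or macOS, for example:

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```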
Install dependencies:
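Assuming the project ships a requirements.txt:

```bash
pip install -r requirements.txt
```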
Create a .env file based on .env.example:
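For example:

```bash
cp .env.example .env
```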
Configure your .env file with your credentials:
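The exact variable names are defined in .env.example; the entries below are illustrative assumptions based on the services this README mentions:

```env
# Illustrative only; follow .env.example for the real variable names
GLAMA_API_KEY=your-glama-api-key
CURSOR_API_KEY=your-cursor-api-key
DATABASE_URL=postgresql://user:password@localhost:5432/ai_support_bot
```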
Set up the database:
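A minimal sketch, assuming the database name matches the DATABASE_URL in your .env and that the application (or its migration step) creates the schema:

```bash
# Create the PostgreSQL database (the name is an assumption)
createdb ai_support_bot
```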
Running the Server
Start the server:
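Assuming app.py is the entry point (based on the project structure referenced later in this README):

```bash
# Adjust if the project uses uvicorn or another runner
python app.py
```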
The server will be available at http://localhost:8000
API Endpoints
1. Root Endpoint
Returns basic server information.
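For example, with the X-MCP-Auth header described in the Security section:

```bash
curl -H "X-MCP-Auth: <your-token>" http://localhost:8000/
```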
2. MCP Version
Returns supported MCP protocol versions.
3. Capabilities
Returns server capabilities and supported features.
4. Process Request
Process a single query with context.
Example request:
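The endpoint path and field names below are illustrative assumptions inferred from the features listed above, not the server's exact schema:

```bash
curl -X POST http://localhost:8000/process \
  -H "Content-Type: application/json" \
  -H "X-MCP-Auth: <your-token>" \
  -d '{
        "query": "How do I reset my password?",
        "mcp_version": "1.0",
        "priority": "high"
      }'
```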
5. Batch Processing
Process multiple queries in a single request.
Example request:
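As above, the path and field names are assumptions for illustration:

```bash
curl -X POST http://localhost:8000/batch \
  -H "Content-Type: application/json" \
  -H "X-MCP-Auth: <your-token>" \
  -d '{
        "queries": [
          "How do I reset my password?",
          "Where can I find my invoices?"
        ],
        "mcp_version": "1.0"
      }'
```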
6. Health Check
Check server health and service status.
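For example (the /health path is an assumption):

```bash
curl http://localhost:8000/health
```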
Rate Limiting
The server implements rate limiting with the following defaults:
100 requests per 60 seconds
Rate limit information is included in the health check endpoint
Rate limit exceeded responses include reset time
Error Handling
The server returns structured error responses in the following format:
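The exact schema is defined by the server; a representative shape (field names are assumptions) looks like:

```json
{
  "error": {
    "code": "RATE_LIMIT_EXCEEDED",
    "message": "Rate limit exceeded. Please try again later.",
    "details": {}
  }
}
```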
Common error codes:
RATE_LIMIT_EXCEEDED: Rate limit exceeded
UNSUPPORTED_MCP_VERSION: Unsupported MCP version
PROCESSING_ERROR: Error processing request
CONTEXT_FETCH_ERROR: Error fetching context from Glama.ai
BATCH_PROCESSING_ERROR: Error processing batch request
Development
Project Structure
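The authoritative layout is in the repository; a partial sketch based on the files referenced elsewhere in this README (requirements.txt and the comments are assumptions):

```
.
├── app.py           # Server entry point and MCP endpoints
├── mcp_config.py    # Configuration options
├── models.py        # Data models
├── .env.example     # Environment variable template
└── requirements.txt # Python dependencies
```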
Adding New Features
Update mcp_config.py with new configuration options
Add new models in models.py if needed
Create new endpoints in app.py
Update the capabilities endpoint to reflect new features
Security
All MCP endpoints require authentication via the X-MCP-Auth header
Rate limiting is implemented to prevent abuse
Database credentials should be kept secure
API keys should never be committed to version control
Monitoring
The server provides health check endpoints for monitoring:
Service status
Rate limit usage
Connected services
Processing times
Contributing
Fork the repository
Create a feature branch
Commit your changes
Push to the branch
Create a Pull Request
Flowchart
(Flowchart image omitted; see the repository for the diagram.)
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For support, please create an issue in the repository or contact the development team.