AI Customer Support Bot - MCP Server
A Model Context Protocol (MCP) server that provides AI-powered customer support using Cursor AI and Glama.ai integration.
Features
- Real-time context fetching from Glama.ai
- AI-powered response generation with Cursor AI
- Batch processing support
- Priority queuing
- Rate limiting
- User interaction tracking
- Health monitoring
- MCP protocol compliance
Prerequisites
- Python 3.8+
- PostgreSQL database
- Glama.ai API key
- Cursor AI API key
Installation
- Clone the repository:
- Create and activate a virtual environment:
- Install dependencies:
- Create a `.env` file based on `.env.example`:
- Configure your `.env` file with your credentials (a sample is sketched below):
- Set up the database:
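A minimal sketch of what the `.env` file might contain. The variable names below are assumptions based on the services this server talks to; defer to the keys actually listed in `.env.example`:

```
# Hypothetical keys -- defer to .env.example for the real names
GLAMA_API_KEY=your-glama-api-key
CURSOR_API_KEY=your-cursor-api-key
DATABASE_URL=postgresql://user:password@localhost:5432/support_bot
MCP_AUTH_TOKEN=shared-secret-sent-in-the-X-MCP-Auth-header
```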
Running the Server
Start the server:
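The exact launch command depends on how the project is packaged; assuming an ASGI app object in `app.py` served with Uvicorn, it would look something like:

```
uvicorn app:app --host 0.0.0.0 --port 8000
```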
The server will be available at http://localhost:8000
API Endpoints
1. Root Endpoint
Returns basic server information.
2. MCP Version
Returns supported MCP protocol versions.
3. Capabilities
Returns server capabilities and supported features.
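A quick way to exercise these three informational endpoints from Python. The paths `/`, `/mcp/version`, and `/mcp/capabilities` are assumptions; adjust them to the routes actually defined in `app.py`:

```python
import requests

BASE_URL = "http://localhost:8000"

# Hypothetical route paths -- adjust to the routes defined in app.py
for path in ("/", "/mcp/version", "/mcp/capabilities"):
    response = requests.get(f"{BASE_URL}{path}")
    response.raise_for_status()
    print(path, response.json())
```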
4. Process Request
Process a single query with context.
Example request:
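A sketch of what a request might look like, assuming a `/mcp/process` route and the field names shown; the path, auth token, and payload shape should be checked against `app.py`:

```python
import requests

# Hypothetical endpoint path, auth token, and field names
response = requests.post(
    "http://localhost:8000/mcp/process",
    headers={"X-MCP-Auth": "your-auth-token"},
    json={
        "query": "How do I reset my password?",
        "user_id": "user-123",
        "priority": "high",
        "mcp_version": "1.0",
    },
)
print(response.json())
```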
5. Batch Processing
Process multiple queries in a single request.
Example request:
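A sketch under the same assumptions, posting several queries at once to a hypothetical `/mcp/batch` route:

```python
import requests

# Hypothetical endpoint path and payload shape -- adjust to app.py
response = requests.post(
    "http://localhost:8000/mcp/batch",
    headers={"X-MCP-Auth": "your-auth-token"},
    json={
        "queries": [
            {"query": "Where can I find my invoices?", "user_id": "user-123"},
            {"query": "How do I cancel my subscription?", "user_id": "user-456"},
        ],
        "mcp_version": "1.0",
    },
)
print(response.json())
```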
6. Health Check
Check server health and service status.
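For example, assuming the health route is `/health` (the path and response keys are assumptions):

```python
import requests

# Hypothetical path and response keys -- adjust to app.py
health = requests.get("http://localhost:8000/health").json()
print(health.get("status"), health.get("rate_limit"), health.get("services"))
```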
Rate Limiting
The server implements rate limiting with the following defaults:
- 100 requests per 60 seconds
- Rate limit information is included in the health check endpoint
- Rate limit exceeded responses include the reset time (see the client-side sketch below)
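A minimal client-side sketch of reacting to a rate-limit response; the 429 status code and `reset_in` field are assumptions about how the server reports the limit:

```python
import time
import requests

response = requests.post(
    "http://localhost:8000/mcp/process",
    headers={"X-MCP-Auth": "your-auth-token"},
    json={"query": "Hello", "user_id": "user-123", "mcp_version": "1.0"},
)

# Assumes a 429 status and an error payload carrying the reset time in seconds
if response.status_code == 429:
    wait_seconds = response.json().get("error", {}).get("reset_in", 60)
    time.sleep(wait_seconds)
```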
Error Handling
The server returns structured error responses in the following format:
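The exact shape is defined by the server; a plausible sketch (the field names here are assumptions) is:

```json
{
  "error": {
    "code": "RATE_LIMIT_EXCEEDED",
    "message": "Rate limit exceeded. Try again later.",
    "details": {
      "reset_in": 42
    }
  }
}
```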
Common error codes:
- `RATE_LIMIT_EXCEEDED`: Rate limit exceeded
- `UNSUPPORTED_MCP_VERSION`: Unsupported MCP version
- `PROCESSING_ERROR`: Error processing request
- `CONTEXT_FETCH_ERROR`: Error fetching context from Glama.ai
- `BATCH_PROCESSING_ERROR`: Error processing batch request
Development
Project Structure
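The original layout is not captured here; based on the files referenced elsewhere in this README, it is roughly:

```
.
├── app.py            # server entry point and MCP endpoints
├── models.py         # database models
├── mcp_config.py     # configuration options
├── .env.example      # template for environment variables
└── requirements.txt  # Python dependencies (assumed)
```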
Adding New Features
- Update `mcp_config.py` with new configuration options
- Add new models in `models.py` if needed
- Create new endpoints in `app.py` (see the sketch after this list)
- Update the capabilities endpoint to reflect new features
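As an illustration of the last two steps, a new endpoint might look roughly like this, assuming the server is a FastAPI app; the route name, payload handling, and auth check are hypothetical:

```python
from typing import Optional

from fastapi import FastAPI, Header, HTTPException

app = FastAPI()  # in the real project this lives in app.py

# Hypothetical new endpoint -- route name and auth handling are illustrative only
@app.post("/mcp/feedback")
async def submit_feedback(payload: dict, x_mcp_auth: Optional[str] = Header(None)):
    if not x_mcp_auth:
        raise HTTPException(status_code=401, detail="Missing X-MCP-Auth header")
    # ...persist feedback, then advertise it via the capabilities endpoint
    return {"status": "accepted"}
```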
Security
- All MCP endpoints require authentication via the `X-MCP-Auth` header
- Rate limiting is implemented to prevent abuse
- Database credentials should be kept secure
- API keys should never be committed to version control
Monitoring
The server provides health check endpoints for monitoring:
- Service status
- Rate limit usage
- Connected services
- Processing times
Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
License
[Your License Here]
Support
For support, please contact [Your Contact Information]