# Azure AI MCP Server
A mission-critical Model Context Protocol (MCP) server providing comprehensive Azure AI services integration with enterprise-grade reliability, observability, and chaos engineering capabilities.
## Features
### Core Azure AI Services
- **Azure OpenAI**: Chat completions, embeddings, and text generation
- **Cognitive Services Text Analytics**: Sentiment analysis, entity recognition, key phrase extraction
- **Computer Vision**: Image analysis, object detection, OCR
- **Face API**: Face detection, recognition, and analysis
- **Azure Storage**: Blob storage integration for data persistence
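Each of these services is exposed to clients as MCP tools. As a minimal sketch of what a call might look like using the official MCP TypeScript SDK (the tool name `azure_openai_chat_completion` and its argument shape are illustrative assumptions, not the server's confirmed schema):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio and connect an MCP client to it.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the tools the server advertises, then call one.
// The tool name and arguments below are hypothetical.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "azure_openai_chat_completion",
  arguments: { messages: [{ role: "user", content: "Hello!" }] },
});
console.log(result.content);
```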
### Mission-Critical Capabilities
- **High Availability**: Multi-region deployment with automatic failover
- **Observability**: Comprehensive logging, metrics, and distributed tracing
- **Security**: Azure AD integration, API key management, and encryption at rest/transit
- **Rate Limiting**: Intelligent throttling and backpressure handling
- **Retry Logic**: Exponential backoff with circuit breaker patterns (see the sketch after this list)
- **Chaos Engineering**: Built-in chaos testing with Azure Chaos Studio
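The retry behaviour mentioned above could look roughly like the sketch below. The helper name `withRetry` and the thresholds are illustrative assumptions, not the server's actual implementation:

```typescript
// Hypothetical retry helper: exponential backoff plus a simple circuit breaker.
let consecutiveFailures = 0;
let circuitOpenUntil = 0;

export async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 200,
): Promise<T> {
  if (Date.now() < circuitOpenUntil) {
    throw new Error("Circuit breaker open: skipping call to Azure service");
  }
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      const result = await fn();
      consecutiveFailures = 0; // a success resets the breaker
      return result;
    } catch (err) {
      consecutiveFailures++;
      if (consecutiveFailures >= 10) {
        circuitOpenUntil = Date.now() + 30_000; // open the breaker for 30s
      }
      if (attempt === maxAttempts) throw err;
      // Exponential backoff with jitter: ~200ms, 400ms, 800ms, ...
      const delay = baseDelayMs * 2 ** (attempt - 1) + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw new Error("unreachable");
}
```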
### DevOps & CI/CD
- **Infrastructure as Code**: Terraform modules for all environments
- **Multi-Environment**: Integration, E2E, and Production pipelines
- **Container Support**: Docker containerization with health checks
- **Monitoring**: Azure Monitor, Application Insights integration
- **Security Scanning**: Automated vulnerability assessments
## Prerequisites
- Node.js 18+
- Azure subscription with appropriate permissions
- Azure CLI installed and configured
- Terraform 1.5+
- Docker (optional, for containerized deployment)
## Installation
### 1. Clone and Setup
```bash
git clone https://github.com/caiotk/nexguideai-azure-ai-mcp-server.git
cd nexguideai-azure-ai-mcp-server
npm install
```
### 2. Environment Configuration
Copy the environment template and configure your Azure credentials:
```bash
cp .env.example .env
```
Required environment variables:
```env
# Azure OpenAI
AZURE_OPENAI_ENDPOINT=https://your-openai.openai.azure.com/
AZURE_OPENAI_API_KEY=your-api-key
# Azure Cognitive Services
AZURE_COGNITIVE_SERVICES_ENDPOINT=https://your-region.api.cognitive.microsoft.com/
AZURE_COGNITIVE_SERVICES_KEY=your-key
# Azure Storage
AZURE_STORAGE_CONNECTION_STRING=your-connection-string
# Azure AD (for production)
AZURE_TENANT_ID=your-tenant-id
AZURE_CLIENT_ID=your-client-id
AZURE_CLIENT_SECRET=your-client-secret
# Monitoring
AZURE_APPLICATION_INSIGHTS_CONNECTION_STRING=your-connection-string
LOG_LEVEL=info
```
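The server should fail fast at startup if any of these variables are missing. A minimal sketch of such a check (the `loadConfig` helper is an assumption for illustration, not necessarily how this repository structures its configuration):

```typescript
// Hypothetical startup validation for required environment variables.
const REQUIRED_VARS = [
  "AZURE_OPENAI_ENDPOINT",
  "AZURE_OPENAI_API_KEY",
  "AZURE_COGNITIVE_SERVICES_ENDPOINT",
  "AZURE_COGNITIVE_SERVICES_KEY",
  "AZURE_STORAGE_CONNECTION_STRING",
] as const;

export function loadConfig(): Record<(typeof REQUIRED_VARS)[number], string> {
  const missing = REQUIRED_VARS.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
  return Object.fromEntries(
    REQUIRED_VARS.map((name) => [name, process.env[name] as string]),
  ) as Record<(typeof REQUIRED_VARS)[number], string>;
}
```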
### 3. Build and Run
```bash
npm run build
npm start
```
## Architecture
### System Overview
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   MCP Client    │◄───►│  Azure AI MCP   │◄───►│ Azure Services  │
│                 │     │     Server      │     │                 │
└─────────────────┘     └─────────────────┘     └─────────────────┘
                                 │
                                 ▼
                        ┌─────────────────┐
                        │  Observability  │
                        │  & Monitoring   │
                        └─────────────────┘
```
### Component Architecture
- **API Layer**: MCP protocol implementation with request validation
- **Service Layer**: Azure AI service integrations with retry logic
- **Infrastructure Layer**: Terraform modules for cloud resources
- **Observability Layer**: Logging, metrics, and distributed tracing
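To illustrate this layering (the interface and function names below are hypothetical), the API layer can depend on a narrow service contract rather than on Azure SDK clients directly, so retry logic and telemetry stay in the service layer:

```typescript
// Hypothetical service-layer contract consumed by the MCP API layer.
export interface TextAnalyticsService {
  analyzeSentiment(text: string): Promise<{ sentiment: string; confidence: number }>;
  extractKeyPhrases(text: string): Promise<string[]>;
}

// The API layer validates the incoming MCP request, then delegates to the
// service layer, which wraps the Azure SDK call in retry/telemetry logic.
export async function handleSentimentTool(
  args: unknown,
  service: TextAnalyticsService,
) {
  if (typeof args !== "object" || args === null || typeof (args as { text?: unknown }).text !== "string") {
    throw new Error("Invalid arguments: expected { text: string }");
  }
  const { text } = args as { text: string };
  const { sentiment, confidence } = await service.analyzeSentiment(text);
  return { content: [{ type: "text", text: `${sentiment} (${confidence})` }] };
}
```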
## CI/CD Pipeline
### Environments
1. **Integration (INT)**: Feature testing and integration validation
2. **End-to-End (E2E)**: Full system testing with chaos engineering
3. **Production (PROD)**: Live environment with blue-green deployment
### Pipeline Stages
```
Build → Test → Security Scan → Deploy INT → E2E Tests → Chaos Tests → Deploy PROD
```
### Deployment Strategy
- **Blue-Green Deployment**: Zero-downtime deployments
- **Canary Releases**: Gradual traffic shifting for risk mitigation
- **Automated Rollback**: Automatic rollback on health check failures
## Testing Strategy
### Test Pyramid
- **Unit Tests**: Individual component testing (Jest)
- **Integration Tests**: Service integration validation
- **E2E Tests**: Full workflow testing
- **Chaos Tests**: Resilience and failure scenario testing
### Chaos Engineering
Integrated with Azure Chaos Studio for:
- **Service Disruption**: Simulated Azure service outages
- **Network Latency**: Increased response times
- **Resource Exhaustion**: CPU/Memory pressure testing
- **Dependency Failures**: External service failures
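Basic resilience behaviour can also be exercised locally before running full Chaos Studio experiments. A minimal Jest sketch (the flaky fake dependency and the `withRetry` helper path are illustrative assumptions):

```typescript
// Hypothetical Jest test: a dependency that fails twice before succeeding
// should still produce a result when wrapped in retry logic.
import { withRetry } from "../src/retry";

test("recovers from transient dependency failures", async () => {
  let calls = 0;
  const flakyAzureCall = async () => {
    calls++;
    if (calls < 3) throw new Error("503 Service Unavailable");
    return "ok";
  };

  await expect(withRetry(flakyAzureCall, 5, 1)).resolves.toBe("ok");
  expect(calls).toBe(3);
});
```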
## Monitoring & Observability
### Metrics
- **Performance**: Response times, throughput, error rates
- **Business**: API usage, feature adoption, cost optimization
- **Infrastructure**: Resource utilization, availability
### Logging
- **Structured Logging**: JSON format with correlation IDs
- **Log Levels**: ERROR, WARN, INFO, DEBUG
- **Centralized**: Azure Log Analytics integration
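A structured log line with a correlation ID might be produced as in the sketch below (the `log` helper is illustrative; the repository may use a logging library instead):

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical structured logger: one JSON object per line, carrying a
// correlation ID so a single MCP request can be traced across log entries.
type Level = "error" | "warn" | "info" | "debug";

export function log(
  level: Level,
  message: string,
  correlationId: string,
  fields: Record<string, unknown> = {},
) {
  console.log(JSON.stringify({
    timestamp: new Date().toISOString(),
    level,
    message,
    correlationId,
    ...fields,
  }));
}

// Usage: assign a correlation ID when a request arrives, reuse it downstream.
const correlationId = randomUUID();
log("info", "chat completion requested", correlationId, { tool: "azure_openai_chat_completion" });
```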
### Alerting
- **SLA Monitoring**: 99.9% availability target
- **Error Rate Thresholds**: >1% error rate alerts
- **Performance Degradation**: Response time anomalies
## Security
### Authentication & Authorization
- **Azure AD Integration**: Enterprise identity management
- **API Key Management**: Secure key rotation and storage
- **RBAC**: Role-based access control
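In production, API keys can be replaced with Azure AD tokens. A minimal sketch using `@azure/identity` (the token scope shown is the standard Cognitive Services scope; how this server wires the credential in is an assumption):

```typescript
import { DefaultAzureCredential } from "@azure/identity";

// DefaultAzureCredential picks up AZURE_TENANT_ID / AZURE_CLIENT_ID /
// AZURE_CLIENT_SECRET locally, and managed identity when running in Azure.
const credential = new DefaultAzureCredential();

// Acquire a bearer token for Azure Cognitive Services / Azure OpenAI.
const token = await credential.getToken("https://cognitiveservices.azure.com/.default");

// The token is then sent as an Authorization header instead of an API key.
const headers = { Authorization: `Bearer ${token.token}` };
```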
### Data Protection
- **Encryption at Rest**: Azure Storage encryption
- **Encryption in Transit**: TLS 1.3 for all communications
- **Data Residency**: Configurable data location compliance
### Security Scanning
- **Dependency Scanning**: Automated vulnerability detection
- **SAST**: Static application security testing
- **Container Scanning**: Docker image vulnerability assessment
## Deployment
### Local Development
```bash
npm run dev
```
### Docker Deployment
```bash
npm run docker:build
npm run docker:run
```
### Terraform Deployment
```bash
cd terraform/environments/prod
terraform init
terraform plan
terraform apply
```
## Performance
### Benchmarks
- **Latency**: P95 < 500ms for chat completions
- **Throughput**: 1000+ requests/minute sustained
- **Availability**: 99.9% uptime SLA
### Optimization
- **Connection Pooling**: Efficient Azure service connections
- **Caching**: Intelligent response caching strategies
- **Rate Limiting**: Adaptive throttling based on service limits
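The throttling idea can be illustrated with a simple token bucket; the capacity and refill rate below are placeholders, not the server's tuned values:

```typescript
// Hypothetical token-bucket limiter: requests drain tokens, which are refilled
// at a steady rate so traffic stays under the Azure service limits.
class TokenBucket {
  private tokens: number;
  private lastRefill = Date.now();

  constructor(private capacity: number, private refillPerSecond: number) {
    this.tokens = capacity;
  }

  tryAcquire(): boolean {
    const now = Date.now();
    const elapsedSeconds = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.refillPerSecond);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Example: roughly 1000 requests/minute sustained ≈ 16.7 requests/second.
const limiter = new TokenBucket(50, 1000 / 60);
if (!limiter.tryAcquire()) {
  // Back-pressure: ask the caller to retry later instead of overloading Azure.
  throw new Error("429 Too Many Requests (local throttle)");
}
```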
## Contributing
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
### Development Guidelines
- Follow TypeScript strict mode
- Maintain 90%+ test coverage
- Use conventional commits
- Update documentation for new features
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Support
- **Documentation**: [docs/](./docs/)
- **Issues**: [GitHub Issues](https://github.com/caiotk/nexguideai-azure-ai-mcp-server/issues)
- **Discussions**: [GitHub Discussions](https://github.com/caiotk/nexguideai-azure-ai-mcp-server/discussions)
## Roadmap
- [ ] Multi-model support (GPT-4, Claude, Gemini)
- [ ] Advanced caching strategies
- [ ] GraphQL API support
- [ ] Kubernetes deployment manifests
- [ ] Advanced chaos engineering scenarios
- [ ] Cost optimization recommendations
---
**Built with ❤️ by NexGuide AI**