Provides a deployment target for the MCP server through a minimal Terraform configuration, allowing the container to be deployed as a scalable, serverless compute service.
Enables containerization of the MCP server using a provided Dockerfile, making it portable and deployable across different environments.
Implements the Model Context Protocol server using FastAPI framework, providing a discoverable, versioned API for exposing herd data.
Offers test capabilities for the MCP server implementation, allowing verification of functionality through automated testing.
Uses SQLite as the database backend for storing herd data, with configurable database path via environment variables.
Provides infrastructure-as-code configuration for deploying the MCP server to AWS, including creating an ECR repository for the container image.
Uses YAML for the MCP discovery file (model_context.yaml) which enables API discovery and interaction through the provided agent.
MCP Proof of Concept
This repository contains a sophisticated Model Context Protocol (MCP) server implemented with FastAPI. The project demonstrates advanced API design patterns including OAuth2 authentication, dynamic API discovery, and comprehensive herd management capabilities that can be deployed to AWS Fargate.
Features
Authentication & Security
OAuth2 with JWT: Bearer token authentication system
User Management: Built-in user database with role-based access
Protected Endpoints: Secure API operations requiring authentication
Development Mode: Backwards compatibility with simple token authentication
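As an illustration of the bearer-token idea behind these features, here is a stdlib-only sketch of signing and verifying a token with HMAC. It is a simplified stand-in for the project's actual JWT implementation; `SECRET_KEY`, the claim names, and the token format are assumptions for the example.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET_KEY = b"change-me"  # assumption: a real server loads this from config


def create_token(username: str, ttl: int = 3600) -> str:
    """Sign a small claims payload the way a minimal JWT-like scheme would."""
    payload = json.dumps({"sub": username, "exp": int(time.time()) + ttl}).encode()
    sig = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(payload).decode()
            + "." + base64.urlsafe_b64encode(sig).decode())


def verify_token(token: str):
    """Return the claims if the signature checks out and the token is unexpired."""
    payload_b64, sig_b64 = token.split(".")
    payload = base64.urlsafe_b64decode(payload_b64)
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, base64.urlsafe_b64decode(sig_b64)):
        return None  # tampered token
    claims = json.loads(payload)
    return claims if claims["exp"] > time.time() else None  # None if expired


token = create_token("johndoe")
print(verify_token(token)["sub"])  # johndoe
```

The real server issues JWTs through `POST /api/v1/token`; the sketch only shows why a signed bearer token can be verified without a database lookup.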
Enhanced MCP Agent
Dynamic API Discovery: Automatically discovers endpoints from OpenAPI metadata
Universal Operation Support: Execute any discovered API operation
Fallback Compatibility: Graceful degradation to static YAML configuration
10x More Operations: Comprehensive endpoint coverage compared to the static configuration
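Dynamic discovery boils down to walking the `paths` object of the server's OpenAPI document and collecting every operation. A minimal sketch (the sample spec fragment is illustrative, not the server's real schema):

```python
# Walk an OpenAPI document and collect every operation an agent could call.
HTTP_METHODS = {"get", "post", "put", "delete", "patch"}


def discover_operations(spec: dict) -> list:
    ops = []
    for path, methods in spec.get("paths", {}).items():
        for method, meta in methods.items():
            if method in HTTP_METHODS:
                ops.append({
                    "operation_id": meta.get("operationId", f"{method}_{path}"),
                    "method": method.upper(),
                    "path": path,
                    "summary": meta.get("summary", ""),
                })
    return ops


# Illustrative spec fragment (an assumption, not the server's actual /openapi.json):
sample_spec = {
    "paths": {
        "/api/v1/herd": {
            "get": {"operationId": "list_herds", "summary": "List herds"},
            "post": {"operationId": "create_herd", "summary": "Create new herd"},
        }
    }
}

for op in discover_operations(sample_spec):
    print(op["method"], op["path"], "->", op["operation_id"])
```

The fallback path described above would skip this walk and load the static `model_context.yaml` instead.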
OpenAI Integration
GPT-4o-mini Model: Fast, cost-effective AI powered by OpenAI's latest mini model
Intelligent Conversations: Natural language interface with advanced reasoning
Smart MCP Operations: AI understands user intent and executes appropriate MCP operations
Conversational Context: Maintains conversation history for better interactions
Model Flexibility: Switch between GPT-4o-mini, GPT-4o, and other models
API Endpoints: RESTful endpoints for chat and intelligent query capabilities
Interactive Interfaces
Interactive CLI: Full-featured command-line interface with streaming responses
Web Interface: Modern web-based chat interface for browser interaction
Unified Database: Both interfaces use the same MCP database and authentication
Session Management: User-specific conversation saving and restoration
Real-time Streaming: Live response streaming for immediate feedback
Tab Completion: Smart completion for commands and tool names
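The real-time streaming behavior described above can be sketched as a consumer of chunked output. Here a plain list stands in for the model's streamed deltas; the function name is illustrative, not the CLI's actual API.

```python
# Sketch of chunked streaming: each text delta is printed the moment it
# arrives instead of waiting for the complete reply.
def stream_reply(chunks) -> str:
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)  # live feedback per delta
        parts.append(chunk)
    print()
    return "".join(parts)


reply = stream_reply(["The herd ", "count is ", "42."])
# reply == "The herd count is 42."
```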
Comprehensive API
CRUD Operations: Full herd management (Create, Read, Update, Delete)
Search & Filtering: Search herds by name with flexible parameters
Pagination: Efficient data retrieval with skip/limit controls
Statistics: Real-time herd analytics and reporting
Health Monitoring: System health checks and database status
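The skip/limit pagination and name-search semantics above can be sketched in memory (the field names are illustrative, not the server's exact schema):

```python
# In-memory sketch of the pagination and search controls the API exposes.
def paginate(items: list, skip: int = 0, limit: int = 10) -> list:
    """Return one page: skip the first `skip` items, then take up to `limit`."""
    return items[skip: skip + limit]


def search_by_name(herds: list, query: str) -> list:
    """Case-insensitive substring match on the herd name."""
    q = query.lower()
    return [h for h in herds if q in h["name"].lower()]


herds = [{"id": i, "name": f"Herd {i}"} for i in range(1, 26)]

page_two = paginate(herds, skip=10, limit=10)  # herds 11 through 20
matches = search_by_name(herds, "herd 2")      # "Herd 2" plus "Herd 20".."Herd 25"
```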
Advanced Operations
MCP Execute: Protected operation execution endpoint
Broadcast System: Message broadcasting to connected clients
Model Listing: Available MCP models and capabilities
Real-time Analytics: Live system statistics and performance metrics
Related MCP server: MyAIServ MCP Server
Quick Start
Install dependencies:

    pip install -r requirements.txt

Configure OpenAI (Optional):

    export OPENAI_API_KEY="your-openai-api-key-here"

Seed the database:

    python -m app.seed

Start the server:

    uvicorn app.main:app --reload

Get an access token:

    curl -X POST "http://localhost:8000/api/v1/token" \
      -H "Content-Type: application/x-www-form-urlencoded" \
      -d "username=johndoe&password=secret"

Try the interactive agents:

    # Interactive CLI (uses MCP database authentication)
    python interactive_agent.py

    # Web interface (in a separate terminal, same database)
    python web_interface.py
    # Visit http://localhost:8080 and log in with johndoe/secret

    # Demo all features
    python demo_agent_usage.py

    # Test unified database access
    python test_unified_database.py
The database path can be configured via the DATABASE_PATH environment variable.
If not set, it defaults to mcp.db in the working directory.
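A sketch of that lookup (the function name is illustrative; the fallback matches the default described above):

```python
import os

# DATABASE_PATH wins if set; otherwise fall back to mcp.db in the working directory.
def resolve_database_path() -> str:
    return os.environ.get("DATABASE_PATH", "mcp.db")


os.environ["DATABASE_PATH"] = "/tmp/custom-mcp.db"
print(resolve_database_path())  # /tmp/custom-mcp.db

del os.environ["DATABASE_PATH"]
print(resolve_database_path())  # mcp.db
```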
AI Model Configuration
The system now uses GPT-4o-mini as the default model for optimal performance and cost:
Current Model: GPT-4o-mini
Performance: ~2x faster than GPT-4 for most tasks
Cost: Significantly lower cost per token
Quality: High-quality responses for interactive chat and MCP operations
Availability: Latest OpenAI model with improved efficiency
Model Switching
CLI: Use /model <name> to switch models
Available models: gpt-4o-mini, gpt-4o, gpt-4-turbo, gpt-3.5-turbo
Session persistence: Model choice is saved across CLI sessions
Web interface: Uses gpt-4o-mini by default
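Session-persistent model choice can be sketched as a small settings file. The real CLI's file location and format are not documented here, so both are assumptions in this example.

```python
import json
import tempfile
from pathlib import Path

# Assumed session-file location for the sketch; the real CLI may differ.
SESSION_FILE = Path(tempfile.gettempdir()) / "mcp_agent_session.json"

AVAILABLE_MODELS = {"gpt-4o-mini", "gpt-4o", "gpt-4-turbo", "gpt-3.5-turbo"}


def set_model(name: str) -> None:
    """Validate the model name, then persist it for future sessions."""
    if name not in AVAILABLE_MODELS:
        raise ValueError(f"unknown model: {name}")
    SESSION_FILE.write_text(json.dumps({"model": name}))


def get_model(default: str = "gpt-4o-mini") -> str:
    """Return the saved model choice, falling back to the default."""
    if SESSION_FILE.exists():
        return json.loads(SESSION_FILE.read_text()).get("model", default)
    return default


set_model("gpt-4o")
print(get_model())  # gpt-4o
```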
Model Recommendations
gpt-4o-mini: Best for interactive chat and general tasks (default)
gpt-4o: Use for complex reasoning tasks
gpt-3.5-turbo: Fallback for basic interactions
Unified Database System
Both the CLI and web interfaces now use the same MCP database for authentication and operations:
Shared Authentication
Both interfaces authenticate against the same user database
Same JWT token system for both CLI and web
Consistent user management across interfaces
Database Consistency
User accounts: Shared between CLI and web interface
MCP operations: Both use identical API endpoints
Tool discovery: Same tool set available in both interfaces
Conversation history: Interface-specific but user-isolated
Cross-Interface Benefits
Login with johndoe/secret or alice/wonderland in both interfaces
MCP operations executed through the web interface affect the same database as the CLI
Consistent tool availability and functionality
Unified session management per user
Authentication
Available Users
johndoe / secret - Active user account
alice / wonderland - Active user account
OAuth2 Flow
Enhanced Agent Usage
Dynamic Discovery Mode (Recommended)
Static Configuration Mode
Enhanced Agent Demo
Experience the full capabilities:
This demonstrates:
Dynamic vs static discovery comparison
All available operations execution
Error handling and fallback mechanisms
Real-time herd management operations
Interactive Agent Demos
Experience the AI-powered interactive agents:
CLI Interface:
Features: Real-time chat, streaming responses, command system, tab completion
Web Interface:
Features: Modern web UI, MCP database authentication, real-time messaging
Feature Demo:
This demonstrates:
OpenAI integration and chat capabilities
Intelligent MCP operation execution
Natural language query processing
API endpoint usage examples
API Operations
Core Herd Management
GET /api/v1/herd - List herds with pagination
POST /api/v1/herd - Create new herd
GET /api/v1/herd/{id} - Get specific herd
PUT /api/v1/herd/{id} - Update herd
DELETE /api/v1/herd/{id} - Delete herd
Advanced Features
GET /api/v1/herd/search/name - Search herds by name
GET /api/v1/herd/stats - Get herd statistics
GET /api/v1/health - System health check
Protected MCP Operations
POST /api/v1/mcp/execute - Execute MCP operations
POST /api/v1/mcp/broadcast - Broadcast messages
GET /api/v1/mcp/models - List available models
OpenAI Agent Endpoints
POST /api/v1/agent/chat - Direct chat with the OpenAI agent
POST /api/v1/agent/query - Intelligent MCP queries with natural language
GET /api/v1/agent/capabilities - Agent capabilities and configuration
GET /api/v1/agent/tools - Available MCP tools
GET /api/v1/agent/status - Agent operational status
Authentication Endpoints
POST /api/v1/token - OAuth2 token login
GET /api/v1/users/me - Current user info
GET /api/v1/users/me/profile - Detailed profile
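As a sketch of calling the protected endpoints from Python with only the standard library: the request below is built but not sent, and the token value is a placeholder for one returned by POST /api/v1/token.

```python
import urllib.request

# Attach a bearer token to a request against a protected endpoint.
def authed_request(url: str, token: str) -> urllib.request.Request:
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    return req


req = authed_request("http://localhost:8000/api/v1/users/me", "ACCESS_TOKEN")
print(req.get_header("Authorization"))  # Bearer ACCESS_TOKEN
# urllib.request.urlopen(req) would send it once the server is running.
```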
Testing
Container Deployment
Infrastructure
The terraform/ directory provides AWS deployment configuration:
ECR Repository: Container image storage
Fargate Service: Serverless container deployment
Load Balancer: High availability and scaling
Documentation
Enhanced Agent Guide - Comprehensive agent capabilities and usage
Architecture Overview - System design and patterns
OpenAPI Docs: Available at /docs when the server is running
Project Structure
The MCP discovery file is available at model_context.yaml for static configuration.