MCP Proof of Concept
This repository contains a sophisticated Model Context Protocol (MCP) server implemented with FastAPI. The project demonstrates advanced API design patterns including OAuth2 authentication, dynamic API discovery, and comprehensive herd management capabilities that can be deployed to AWS Fargate.
Features
🔐 Authentication & Security
- OAuth2 with JWT: Bearer token authentication system
- User Management: Built-in user database with role-based access
- Protected Endpoints: Secure API operations requiring authentication
- Development Mode: Backwards compatibility with simple token authentication
🔍 Enhanced MCP Agent
- Dynamic API Discovery: Automatically discovers endpoints from OpenAPI metadata
- Universal Operation Support: Execute any discovered API operation
- Fallback Compatibility: Graceful degradation to static YAML configuration
- 10x More Operations: Far broader endpoint coverage than the static configuration
🤖 OpenAI Integration
- GPT-4o-mini Model: Fast, cost-effective AI powered by OpenAI's latest mini model
- Intelligent Conversations: Natural language interface with advanced reasoning
- Smart MCP Operations: AI understands user intent and executes appropriate MCP operations
- Conversational Context: Maintains conversation history for better interactions
- Model Flexibility: Switch between GPT-4o-mini, GPT-4o, and other models
- API Endpoints: RESTful endpoints for chat and intelligent query capabilities
💬 Interactive Interfaces
- Interactive CLI: Full-featured command-line interface with streaming responses
- Web Interface: Modern web-based chat interface for browser interaction
- Unified Database: Both interfaces use the same MCP database and authentication
- Session Management: User-specific conversation saving and restoration
- Real-time Streaming: Live response streaming for immediate feedback
- Tab Completion: Smart completion for commands and tool names
📊 Comprehensive API
- CRUD Operations: Full herd management (Create, Read, Update, Delete)
- Search & Filtering: Search herds by name with flexible parameters
- Pagination: Efficient data retrieval with skip/limit controls
- Statistics: Real-time herd analytics and reporting
- Health Monitoring: System health checks and database status
🚀 Advanced Operations
- MCP Execute: Protected operation execution endpoint
- Broadcast System: Message broadcasting to connected clients
- Model Listing: Available MCP models and capabilities
- Real-time Analytics: Live system statistics and performance metrics
Quick Start
- Install dependencies:
- Configure OpenAI (Optional):
- Seed the database:
- Start the server:
- Get an access token:
- Try the interactive agents:
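The last two steps can be sketched as a small client. This is a hedged sketch, not the project's own client: the `/api/v1/token` and `/api/v1/herd` paths come from the API reference below, but the port (8000), the OAuth2 password-grant form fields, and the `access_token` response field are assumptions based on FastAPI/OAuth2 conventions.

```python
# Hedged sketch of the quick-start flow: build the token request, then an
# authorized, paginated list request. Paths come from the API reference in
# this README; port and field names are assumptions.
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed default uvicorn port


def build_token_request(username: str, password: str) -> urllib.request.Request:
    """OAuth2 password grant: form-encoded POST to the token endpoint."""
    body = urllib.parse.urlencode(
        {"username": username, "password": password, "grant_type": "password"}
    ).encode()
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/token",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )


def build_list_request(token: str, skip: int = 0, limit: int = 10) -> urllib.request.Request:
    """Authorized GET using the skip/limit pagination described below."""
    query = urllib.parse.urlencode({"skip": skip, "limit": limit})
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/herd?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )
```

With the server running, passing `build_token_request("johndoe", "secret")` to `urllib.request.urlopen` should return a JSON body whose `access_token` field (standard OAuth2 naming, assumed here) authorizes the second request.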
The database path can be configured via the DATABASE_PATH environment variable. If not set, it defaults to mcp.db in the working directory.
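The lookup described above amounts to a one-line environment check; the function name here is illustrative, not the server's actual symbol:

```python
import os


def resolve_database_path() -> str:
    # DATABASE_PATH wins when set; otherwise fall back to mcp.db
    # relative to the current working directory.
    return os.environ.get("DATABASE_PATH", "mcp.db")
```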
AI Model Configuration
The system now uses GPT-4o-mini as the default model for optimal performance and cost:
🤖 Current Model: GPT-4o-mini
- Performance: ~2x faster than GPT-4 for most tasks
- Cost: Significantly lower cost per token
- Quality: High-quality responses for interactive chat and MCP operations
- Availability: Latest OpenAI model with improved efficiency
🔄 Model Switching
- CLI: Use /model <name> to switch models
- Available models: gpt-4o-mini, gpt-4o, gpt-4-turbo, gpt-3.5-turbo
- Session persistent: Model choice persists across CLI sessions
- Web interface: Uses gpt-4o-mini by default
💡 Model Recommendations
- gpt-4o-mini: Best for interactive chat and general tasks (default)
- gpt-4o: Use for complex reasoning tasks
- gpt-3.5-turbo: Fallback for basic interactions
Unified Database System
Both the CLI and web interfaces now use the same MCP database for authentication and operations:
🔑 Shared Authentication
- Both interfaces authenticate against the same user database
- Same JWT token system for both CLI and web
- Consistent user management across interfaces
💾 Database Consistency
- User accounts: Shared between CLI and web interface
- MCP operations: Both use identical API endpoints
- Tool discovery: Same tool set available in both interfaces
- Conversation history: Interface-specific but user-isolated
🔄 Cross-Interface Benefits
- Login with johndoe/secret or alice/wonderland in both interfaces
- MCP operations executed through the web interface affect the same database as the CLI
- Consistent tool availability and functionality
- Unified session management per user
Authentication
Available Users
- johndoe / secret - Active user account
- alice / wonderland - Active user account
OAuth2 Flow
Enhanced Agent Usage
Dynamic Discovery Mode (Recommended)
Static Configuration Mode
Enhanced Agent Demo
Experience the full capabilities:
This demonstrates:
- Dynamic vs static discovery comparison
- All available operations execution
- Error handling and fallback mechanisms
- Real-time herd management operations
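Dynamic discovery works by reading the server's OpenAPI document instead of a static YAML file. A minimal sketch of flattening an OpenAPI `paths` object into an operation table follows; in the real server the spec would come from the FastAPI-generated `/openapi.json`, and the sample fragment here is purely illustrative:

```python
# Flatten an OpenAPI "paths" object into (operation_id, METHOD, path)
# records - the core of dynamic discovery from API metadata.
from typing import Iterator, Tuple

HTTP_METHODS = {"get", "post", "put", "delete", "patch"}


def discover_operations(spec: dict) -> Iterator[Tuple[str, str, str]]:
    for path, item in spec.get("paths", {}).items():
        for method, op in item.items():
            if method in HTTP_METHODS:
                # Fall back to a synthetic id when operationId is absent.
                op_id = op.get("operationId", f"{method}_{path}")
                yield op_id, method.upper(), path


# Tiny illustrative spec fragment (not the server's real schema).
SAMPLE_SPEC = {
    "paths": {
        "/api/v1/herd": {
            "get": {"operationId": "list_herds"},
            "post": {"operationId": "create_herd"},
        },
        "/api/v1/health": {"get": {"operationId": "health_check"}},
    }
}
```

Each discovered record can then be executed generically, which is why this mode exposes many more operations than a hand-maintained YAML list.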
Interactive Agent Demos
Experience the AI-powered interactive agents:
CLI Interface:
Features: Real-time chat, streaming responses, command system, tab completion
Web Interface:
Features: Modern web UI, MCP database authentication, real-time messaging
Feature Demo:
This demonstrates:
- OpenAI integration and chat capabilities
- Intelligent MCP operation execution
- Natural language query processing
- API endpoint usage examples
API Operations
Core Herd Management
- GET /api/v1/herd - List herds with pagination
- POST /api/v1/herd - Create a new herd
- GET /api/v1/herd/{id} - Get a specific herd
- PUT /api/v1/herd/{id} - Update a herd
- DELETE /api/v1/herd/{id} - Delete a herd
Advanced Features
- GET /api/v1/herd/search/name - Search herds by name
- GET /api/v1/herd/stats - Get herd statistics
- GET /api/v1/health - System health check
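As a hedged example of the search route's query string: the `name` parameter is assumed from the route's purpose, and skip/limit reuse the pagination controls described earlier; check /docs for the real schema.

```python
import urllib.parse


def search_url(base: str, name: str, skip: int = 0, limit: int = 10) -> str:
    """Build the search-by-name URL with assumed query parameters."""
    query = urllib.parse.urlencode({"name": name, "skip": skip, "limit": limit})
    return f"{base}/api/v1/herd/search/name?{query}"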
Protected MCP Operations
- POST /api/v1/mcp/execute - Execute MCP operations
- POST /api/v1/mcp/broadcast - Broadcast messages
- GET /api/v1/mcp/models - List available models
OpenAI Agent Endpoints
- POST /api/v1/agent/chat - Direct chat with the OpenAI agent
- POST /api/v1/agent/query - Intelligent MCP queries in natural language
- GET /api/v1/agent/capabilities - Agent capabilities and configuration
- GET /api/v1/agent/tools - Available MCP tools
- GET /api/v1/agent/status - Agent operational status
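A natural-language query to the agent endpoint might be built as below. The request field name `query` is an assumption (consult /docs for the actual schema), and the helper itself is hypothetical:

```python
import json
import urllib.request


def build_agent_query(base: str, token: str, question: str) -> urllib.request.Request:
    """Authorized JSON POST to the agent query endpoint (field name assumed)."""
    return urllib.request.Request(
        f"{base}/api/v1/agent/query",
        data=json.dumps({"query": question}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```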
Authentication Endpoints
- POST /api/v1/token - OAuth2 token login
- GET /api/v1/users/me - Current user info
- GET /api/v1/users/me/profile - Detailed profile
Testing
Container Deployment
Infrastructure
The terraform/ directory provides the AWS deployment configuration:
- ECR Repository: Container image storage
- Fargate Service: Serverless container deployment
- Load Balancer: High availability and scaling
Documentation
- Enhanced Agent Guide - Comprehensive agent capabilities and usage
- Architecture Overview - System design and patterns
- OpenAPI Docs: Available at /docs when the server is running
Project Structure
The MCP discovery file is available at model_context.yaml for static configuration.