# MCP Agentic AI Server Project
[Python 3.12+](https://www.python.org/downloads/)
[Flask](https://flask.palletsprojects.com/)
[Streamlit](https://streamlit.io/)
[Google Gemini](https://ai.google.dev/)
[License](LICENSE)
> **A comprehensive Model Context Protocol (MCP) implementation featuring dual AI server architecture, real-time monitoring, and an interactive dashboard.**
## Project Overview
This project demonstrates a production-ready **MCP (Model Context Protocol) Agentic AI Server** system with:
- **Custom MCP Server** - Task-based AI processing with tool integration
- **Public MCP Server** - Direct AI query processing
- **Interactive Dashboard** - Real-time monitoring and user interface
- **Live Statistics** - Performance metrics and analytics
- **Extensible Tools** - Modular tool framework for custom functionality
## Architecture
```
┌──────────────────────────────────────────────┐
│             Streamlit Dashboard              │
│                 (Port 8501)                  │
└───────────────────────┬──────────────────────┘
                        │
          ┌─────────────┴─────────────┐
          ▼                           ▼
┌───────────────────┐       ┌───────────────────┐
│    Custom MCP     │       │    Public MCP     │
│    (Port 8000)    │       │    (Port 8001)    │
│                   │       │                   │
│ • Task Creation   │       │ • Direct Queries  │
│ • Tool Integration│       │ • Simple AI Chat  │
│ • Async Processing│       │ • Real-time Stats │
└─────────┬─────────┘       └─────────┬─────────┘
          │                           │
          └─────────────┬─────────────┘
                        ▼
               ┌─────────────────┐
               │     Google      │
               │   Gemini API    │
               └─────────────────┘
```
## Quick Start
### Prerequisites
- **Python 3.12+** (Conda environment recommended)
- **Google Gemini API Key** ([Get one here](https://ai.google.dev/))
- **Git** for cloning the repository
### 1. Clone & Setup
```bash
# Clone the repository
git clone <repository-url>
cd mcp_server_project
# Create and activate a Conda environment (recommended)
conda create -n mcp_env python=3.12
conda activate mcp_env
# Install dependencies
pip install -r requirements.txt
```
### 2. Environment Configuration
Create a `.env` file in the project root:
```env
GEMINI_API_KEY=your_gemini_api_key_here
```
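The servers read `GEMINI_API_KEY` from the environment at startup. As an illustration of what that loading amounts to, here is a stdlib-only sketch (the project itself likely uses a package such as `python-dotenv`; the function names here are illustrative, not the project's API):

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: reads KEY=VALUE lines, ignoring blanks and # comments.
    Existing environment variables are not overwritten."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())

def get_gemini_key() -> str:
    """Return the Gemini API key, failing loudly if it was never configured."""
    load_env()
    key = os.environ.get("GEMINI_API_KEY")
    if not key:
        raise RuntimeError("GEMINI_API_KEY is not set; add it to .env")
    return key
```

Failing fast with a clear error here is preferable to letting the first Gemini request fail with an opaque authentication error.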
### 3. Run the Application
Open **4 terminals** and run the following commands:
#### Terminal 1: Custom MCP Server
```bash
cd mcp-agentic-ai
python -m custom_mcp.server
```
_Server will start on http://localhost:8000_
#### Terminal 2: Public MCP Server
```bash
cd mcp-agentic-ai
python -m public_mcp.server_public
```
_Server will start on http://localhost:8001_
#### Terminal 3: Streamlit Dashboard
```bash
cd mcp-agentic-ai/streamlit_demo
streamlit run app.py
```
_Dashboard will open at http://localhost:8501_
#### Terminal 4: Test the APIs
```bash
# Test Custom MCP Server
curl -X POST http://localhost:8000/task \
-H "Content-Type: application/json" \
-d '{"input":"Hello World","tools":["sample_tool"]}'
# Test Public MCP Server
curl -X POST http://localhost:8001/ask \
-H "Content-Type: application/json" \
-d '{"query":"What is artificial intelligence?"}'
```
## Features
### Custom MCP Server Features
- **Asynchronous Task Processing** - Create tasks with unique IDs
- **Tool Integration Framework** - Extensible tool system
- **Performance Monitoring** - Real-time statistics tracking
- **Error Handling** - Robust error management and logging
### Public MCP Server Features
- **Direct AI Queries** - Instant responses from Gemini
- **Simple API** - Easy-to-use REST endpoints
- **Statistics Tracking** - Performance metrics and analytics
- **High Availability** - Designed for concurrent requests
### Dashboard Features
- **Modern UI Design** - Glassmorphism effects and animations
- **Real-time Updates** - Live statistics and performance metrics
- **Responsive Design** - Mobile-friendly interface
- **Interactive Forms** - Easy server selection and input handling
## API Documentation
### Custom MCP Server (Port 8000)
#### Create Task
```http
POST /task
Content-Type: application/json
{
"input": "Your task description",
"tools": ["sample_tool"]
}
Response: {"task_id": "uuid-string"}
```
#### Execute Task
```http
POST /task/{task_id}/run
Response: {
"task_id": "uuid-string",
"output": "AI generated response"
}
```
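The create-then-run flow above can be sketched as an in-memory task store. This is a simplified illustration of the lifecycle, not the actual `mcp_controller.py` implementation; the uppercase transform stands in for the real tool pipeline and Gemini call:

```python
import uuid

class TaskStore:
    """In-memory sketch of the /task and /task/{task_id}/run lifecycle."""

    def __init__(self):
        self.tasks: dict[str, dict] = {}

    def create(self, input_text: str, tools: list[str]) -> str:
        """POST /task: store the request and return a unique task id."""
        task_id = str(uuid.uuid4())
        self.tasks[task_id] = {"input": input_text, "tools": tools, "output": None}
        return task_id

    def run(self, task_id: str) -> dict:
        """POST /task/{task_id}/run: process the stored task and return the result."""
        task = self.tasks[task_id]
        text = task["input"]
        if "sample_tool" in task["tools"]:
            text = text.upper()  # placeholder for the real tool + Gemini call
        task["output"] = text
        return {"task_id": task_id, "output": task["output"]}
```

Usage mirrors the two HTTP calls: `tid = store.create("Hello World", ["sample_tool"])`, then `store.run(tid)` returns the `{"task_id": ..., "output": ...}` payload shown above.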
#### Get Statistics
```http
GET /stats
Response: {
"queries_processed": 42,
"response_time": 1.23,
"success_rate": 95.5,
"uptime": 120.5
}
```
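The fields returned by `/stats` can be derived from a small rolling tracker. A minimal sketch of how such metrics are commonly computed (an assumption about this server's internals, not its actual code):

```python
import time

class StatsTracker:
    """Accumulates the metrics exposed by GET /stats."""

    def __init__(self):
        self.start_time = time.time()
        self.total = 0
        self.successes = 0
        self.total_response_time = 0.0

    def record(self, response_time: float, success: bool) -> None:
        """Call once per handled request."""
        self.total += 1
        self.successes += int(success)
        self.total_response_time += response_time

    def snapshot(self) -> dict:
        """Build the /stats response body."""
        avg = self.total_response_time / self.total if self.total else 0.0
        rate = 100.0 * self.successes / self.total if self.total else 0.0
        return {
            "queries_processed": self.total,
            "response_time": round(avg, 2),      # mean seconds per query
            "success_rate": round(rate, 1),      # percentage
            "uptime": round(time.time() - self.start_time, 1),  # seconds
        }
```

The same shape works for the Public MCP server's `/stats`, with `todays_queries` added as a counter that resets at midnight.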
### Public MCP Server (Port 8001)
#### Ask Question
```http
POST /ask
Content-Type: application/json
{
"query": "Your question here"
}
Response: {"response": "AI generated answer"}
```
#### Get Statistics
```http
GET /stats
Response: {
"queries_processed": 15,
"response_time": 0.89,
"success_rate": 100.0,
"todays_queries": 15
}
```
## Project Structure
```
mcp_server_project/
├── README.md                      # This file
├── requirements.txt               # Python dependencies
├── .env                           # Environment variables
│
├── mcp-agentic-ai/                # Main application
│   ├── custom_mcp/                # Custom MCP server
│   │   ├── server.py              # Flask server (Port 8000)
│   │   ├── mcp_controller.py      # Business logic
│   │   └── tools/                 # Custom tools
│   │       └── sample_tool.py     # Example tool
│   │
│   ├── public_mcp/                # Public MCP server
│   │   ├── server_public.py       # Flask server (Port 8001)
│   │   └── agent_config.yaml      # AI configuration
│   │
│   └── streamlit_demo/            # Interactive dashboard
│       └── app.py                 # Streamlit app (Port 8501)
│
└── documentation/                 # Comprehensive docs
    ├── documentation.md           # Main documentation
    ├── workflows.md               # Mermaid workflows
    ├── designs.md                 # Architecture diagrams
    └── tech-stack.md              # Technology details
```
## Development
### Adding Custom Tools
1. Create a new tool file in `mcp-agentic-ai/custom_mcp/tools/`:
```python
# my_custom_tool.py
import logging
def my_custom_tool(text: str) -> str:
"""
Your custom tool implementation
"""
logging.info(f"Processing: {text}")
# Your logic here
result = text.upper() # Example transformation
return result
```
2. Import and use in `mcp_controller.py`:
```python
from custom_mcp.tools.my_custom_tool import my_custom_tool
# Add to the run method
if "my_custom_tool" in task["tools"]:
text = my_custom_tool(text)
```
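The `if "my_custom_tool" in task["tools"]` pattern works, but it means editing the controller for every new tool. One common way to keep the controller untouched is a registry that maps tool names to functions; a sketch under that assumption (the names here are illustrative, not the project's API):

```python
from typing import Callable

# Maps tool names (as sent in the request's "tools" field) to implementations.
TOOL_REGISTRY: dict[str, Callable[[str], str]] = {}

def register_tool(name: str):
    """Decorator that adds a tool function to the registry under `name`."""
    def decorator(func: Callable[[str], str]) -> Callable[[str], str]:
        TOOL_REGISTRY[name] = func
        return func
    return decorator

@register_tool("my_custom_tool")
def my_custom_tool(text: str) -> str:
    """Example tool: uppercase the input."""
    return text.upper()

def apply_tools(text: str, tools: list[str]) -> str:
    """Run each requested tool in order, silently skipping unknown names."""
    for name in tools:
        tool = TOOL_REGISTRY.get(name)
        if tool is not None:
            text = tool(text)
    return text
```

With this in place, the controller's per-tool `if` branches collapse into a single `text = apply_tools(text, task["tools"])` call, and new tools only need the `@register_tool` decorator.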
### Extending the Dashboard
The Streamlit dashboard can be customized by modifying `streamlit_demo/app.py`:
- Add new UI components
- Implement additional statistics
- Create new visualizations
- Add export functionality
## Documentation
Comprehensive documentation is available in the `documentation/` folder:
- **[Main Documentation](documentation/documentation.md)** - Complete project guide (1500+ lines)
- **[Workflows](documentation/workflows.md)** - Mermaid workflow diagrams
- **[Architecture](documentation/designs.md)** - System design diagrams
- **[Tech Stack](documentation/tech-stack.md)** - Technology details
## Learning Outcomes
By completing this project, you'll learn:
- **AI Integration** - Google Gemini API, prompt engineering
- **Backend Development** - Flask, REST APIs, microservices
- **Frontend Development** - Streamlit, modern CSS, responsive design
- **System Monitoring** - Real-time statistics, performance tracking
- **Architecture Design** - Microservices, event-driven patterns
- **Security Practices** - API security, environment management
## Deployment
### Local Development
Follow the [Quick Start](#quick-start) guide above.
### Production Deployment
For production deployment, consider:
- **Docker** - Containerize each service
- **Kubernetes** - Orchestrate containers
- **HTTPS** - SSL/TLS certificates
- **Monitoring** - Prometheus, Grafana
- **Database** - PostgreSQL, Redis
## Contributing
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Support
- **Documentation** - Check the comprehensive docs in `/documentation/`
- **Issues** - Report bugs via GitHub Issues
- **Discussions** - Join GitHub Discussions for questions
- **Contact** - Reach out for additional support
## Acknowledgments
- **Google Gemini** - For providing excellent AI capabilities
- **Streamlit** - For the amazing dashboard framework
- **Flask** - For the robust web framework
- **Python Community** - For the incredible ecosystem
---
## Next Steps
1. **Run the Application** - Follow the Quick Start guide
2. **Read the Documentation** - Explore the comprehensive docs
3. **Customize Tools** - Add your own custom tools
4. **Enhance the UI** - Improve the dashboard design
5. **Add Features** - Implement new functionality
6. **Deploy** - Take it to production
**Ready to build the future of AI? Let's get started!**
---
_Built with ❤️ for the AI community. Star ⭐ this repo if you find it helpful!_