# MCP Agentic AI Server Project

A comprehensive Model Context Protocol (MCP) implementation featuring dual AI server architecture, real-time monitoring, and an interactive dashboard.

## Project Overview

This project demonstrates a production-ready MCP (Model Context Protocol) Agentic AI Server system with:

- **Custom MCP Server** - Task-based AI processing with tool integration
- **Public MCP Server** - Direct AI query processing
- **Interactive Dashboard** - Real-time monitoring and user interface
- **Live Statistics** - Performance metrics and analytics
- **Extensible Tools** - Modular tool framework for custom functionality
## Architecture

```
                 ┌───────────────────────────────────┐
                 │       Streamlit Dashboard         │
                 │           (Port 8501)             │
                 └─────────────────┬─────────────────┘
                                   │
                 ┌─────────────────┴─────────────────┐
                 ▼                                   ▼
    ┌─────────────────────┐             ┌─────────────────────┐
    │     Custom MCP      │             │     Public MCP      │
    │     (Port 8000)     │             │     (Port 8001)     │
    │                     │             │                     │
    │ • Task Creation     │             │ • Direct Queries    │
    │ • Tool Integration  │             │ • Simple AI Chat    │
    │ • Async Processing  │             │ • Real-time Stats   │
    └──────────┬──────────┘             └──────────┬──────────┘
               │                                   │
               └─────────────────┬─────────────────┘
                                 ▼
                      ┌─────────────────────┐
                      │       Google        │
                      │     Gemini API      │
                      └─────────────────────┘
```

## Quick Start
### Prerequisites

- **Python 3.12+** (Conda environment recommended)
- **Google Gemini API Key** (Get one here)
- **Git** for cloning the repository
### 1. Clone & Setup

```bash
# Clone the repository
git clone <repository-url>
cd mcp_server_project

# Create and activate virtual environment (recommended)
conda create -n mcp_env python=3.12
conda activate mcp_env

# Install dependencies
pip install -r requirements.txt
```

### 2. Environment Configuration

Create a `.env` file in the project root:

```
GEMINI_API_KEY=your_gemini_api_key_here
```

### 3. Run the Application
Open 4 terminals and run the following commands:

**Terminal 1: Custom MCP Server**

```bash
cd mcp-agentic-ai
python -m custom_mcp.server
```

Server will start on http://localhost:8000

**Terminal 2: Public MCP Server**

```bash
cd mcp-agentic-ai
python -m public_mcp.server_public
```

Server will start on http://localhost:8001

**Terminal 3: Streamlit Dashboard**

```bash
cd mcp-agentic-ai/streamlit_demo
streamlit run app.py
```

Dashboard will open at http://localhost:8501
**Terminal 4: Test the APIs**

```bash
# Test Custom MCP Server
curl -X POST http://localhost:8000/task \
  -H "Content-Type: application/json" \
  -d '{"input":"Hello World","tools":["sample_tool"]}'

# Test Public MCP Server
curl -X POST http://localhost:8001/ask \
  -H "Content-Type: application/json" \
  -d '{"query":"What is artificial intelligence?"}'
```

## Features
### Custom MCP Server Features

- **Asynchronous Task Processing** - Create tasks with unique IDs
- **Tool Integration Framework** - Extensible tool system
- **Performance Monitoring** - Real-time statistics tracking
- **Error Handling** - Robust error management and logging

### Public MCP Server Features

- **Direct AI Queries** - Instant responses from Gemini
- **Simple API** - Easy-to-use REST endpoints
- **Statistics Tracking** - Performance metrics and analytics
- **High Availability** - Designed for concurrent requests

### Dashboard Features

- **Modern UI Design** - Glassmorphism effects and animations
- **Real-time Updates** - Live statistics and performance metrics
- **Responsive Design** - Mobile-friendly interface
- **Interactive Forms** - Easy server selection and input handling
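The two workflows above (task-based and direct-query) can be exercised from Python as well as curl. The sketch below is a minimal standard-library client, assuming the ports and endpoint shapes documented in this README; the helper names (`build_request`, `create_and_run_task`, `ask`) are illustrative, not part of the project.

```python
import json
import urllib.request

CUSTOM_MCP = "http://localhost:8000"  # Custom MCP server (task workflow)
PUBLIC_MCP = "http://localhost:8001"  # Public MCP server (direct queries)

def build_request(url: str, payload: dict) -> urllib.request.Request:
    """Build a JSON POST request matching the curl examples above."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def post_json(url: str, payload: dict) -> dict:
    with urllib.request.urlopen(build_request(url, payload)) as resp:
        return json.loads(resp.read())

def create_and_run_task(text: str, tools=("sample_tool",)) -> dict:
    """Two-step custom workflow: create a task, then execute it."""
    task = post_json(f"{CUSTOM_MCP}/task", {"input": text, "tools": list(tools)})
    return post_json(f"{CUSTOM_MCP}/task/{task['task_id']}/run", {})

def ask(query: str) -> dict:
    """Single-step public workflow: direct question, direct answer."""
    return post_json(f"{PUBLIC_MCP}/ask", {"query": query})

if __name__ == "__main__":
    print(create_and_run_task("Summarize this week's feedback"))
    print(ask("What is artificial intelligence?"))
```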
## API Documentation

### Custom MCP Server (Port 8000)

**Create Task**

```http
POST /task
Content-Type: application/json

{
  "input": "Your task description",
  "tools": ["sample_tool"]
}
```

Response: `{"task_id": "uuid-string"}`

**Execute Task**

```http
POST /task/{task_id}/run
```

Response:

```json
{
  "task_id": "uuid-string",
  "output": "AI generated response"
}
```

**Get Statistics**

```http
GET /stats
```

Response:

```json
{
  "queries_processed": 42,
  "response_time": 1.23,
  "success_rate": 95.5,
  "uptime": 120.5
}
```

### Public MCP Server (Port 8001)

**Ask Question**

```http
POST /ask
Content-Type: application/json

{
  "query": "Your question here"
}
```

Response: `{"response": "AI generated answer"}`

**Get Statistics**

```http
GET /stats
```

Response:

```json
{
  "queries_processed": 15,
  "response_time": 0.89,
  "success_rate": 100.0,
  "todays_queries": 15
}
```

## Project Structure
```
mcp_server_project/
├── README.md                   # This file
├── requirements.txt            # Python dependencies
├── .env                        # Environment variables
│
├── mcp-agentic-ai/             # Main application
│   ├── custom_mcp/             # Custom MCP server
│   │   ├── server.py           # Flask server (Port 8000)
│   │   ├── mcp_controller.py   # Business logic
│   │   └── tools/              # Custom tools
│   │       └── sample_tool.py  # Example tool
│   │
│   ├── public_mcp/             # Public MCP server
│   │   ├── server_public.py    # Flask server (Port 8001)
│   │   └── agent_config.yaml   # AI configuration
│   │
│   └── streamlit_demo/         # Interactive dashboard
│       └── app.py              # Streamlit app (Port 8501)
│
└── documentation/              # Comprehensive docs
    ├── documentation.md        # Main documentation
    ├── workflows.md            # Mermaid workflows
    ├── designs.md              # Architecture diagrams
    └── tech-stack.md           # Technology details
```

## Development
### Adding Custom Tools

Create a new tool file in `mcp-agentic-ai/custom_mcp/tools/`:

```python
# my_custom_tool.py
import logging

def my_custom_tool(text: str) -> str:
    """
    Your custom tool implementation
    """
    logging.info(f"Processing: {text}")
    # Your logic here
    result = text.upper()  # Example transformation
    return result
```

Import and use it in `mcp_controller.py`:

```python
from custom_mcp.tools.my_custom_tool import my_custom_tool

# Add to the run method
if "my_custom_tool" in task["tools"]:
    text = my_custom_tool(text)
```

### Extending the Dashboard
The Streamlit dashboard can be customized by modifying `streamlit_demo/app.py`:

- Add new UI components
- Implement additional statistics
- Create new visualizations
- Add export functionality
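As a small illustration of adding statistics, the hypothetical helper below maps a `/stats` response onto label/value pairs ready for `st.metric`. The function name and field handling are assumptions for this sketch, not existing code in `app.py`.

```python
# Hypothetical helper for streamlit_demo/app.py (names are illustrative).
def format_stats(stats: dict) -> list[tuple[str, str]]:
    """Map a /stats response onto (label, value) pairs for st.metric()."""
    return [
        ("Queries processed", str(stats.get("queries_processed", 0))),
        ("Avg response time", f"{stats.get('response_time', 0.0):.2f}s"),
        ("Success rate", f"{stats.get('success_rate', 0.0):.1f}%"),
    ]

# Inside app.py (assumes `import streamlit as st` and a fetched `stats` dict):
# for label, value in format_stats(stats):
#     st.metric(label, value)
```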
## Documentation

Comprehensive documentation is available in the `documentation/` folder:

- **Main Documentation** - Complete project guide (1500+ lines)
- **Workflows** - Mermaid workflow diagrams
- **Architecture** - System design diagrams
- **Tech Stack** - Technology details
## Learning Outcomes

By completing this project, you'll learn:

- **AI Integration** - Google Gemini API, prompt engineering
- **Backend Development** - Flask, REST APIs, microservices
- **Frontend Development** - Streamlit, modern CSS, responsive design
- **System Monitoring** - Real-time statistics, performance tracking
- **Architecture Design** - Microservices, event-driven patterns
- **Security Practices** - API security, environment management
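As one example of the environment-management practice above, here is a minimal sketch of reading the key configured in the Environment Configuration step. The `load_gemini_key` helper is illustrative, not part of the project's code.

```python
import os

def load_gemini_key() -> str:
    """Read the API key from the environment; fail fast if it is missing."""
    key = os.environ.get("GEMINI_API_KEY", "")
    if not key:
        raise RuntimeError("GEMINI_API_KEY is not set; add it to your .env file")
    return key
```

Failing fast at startup keeps the key out of source control and surfaces a misconfigured `.env` before any request reaches Gemini.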
## Deployment

### Local Development

Follow the Quick Start guide above.

### Production Deployment

For production deployment, consider:

- **Docker** - Containerize each service
- **Kubernetes** - Orchestrate containers
- **HTTPS** - SSL/TLS certificates
- **Monitoring** - Prometheus, Grafana
- **Database** - PostgreSQL, Redis
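One possible `docker-compose` sketch for containerizing the three services: the service names, build context, and commands are assumptions based on the Quick Start above, not a configuration shipped with the project.

```yaml
# docker-compose.yml (hypothetical sketch; adjust paths and Dockerfiles to taste)
services:
  custom-mcp:
    build: ./mcp-agentic-ai
    command: python -m custom_mcp.server
    ports: ["8000:8000"]
    env_file: .env
  public-mcp:
    build: ./mcp-agentic-ai
    command: python -m public_mcp.server_public
    ports: ["8001:8001"]
    env_file: .env
  dashboard:
    build: ./mcp-agentic-ai
    command: streamlit run streamlit_demo/app.py --server.port 8501
    ports: ["8501:8501"]
    depends_on: [custom-mcp, public-mcp]
```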
## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Support

- **Documentation** - Check the comprehensive docs in `/documentation/`
- **Issues** - Report bugs via GitHub Issues
- **Discussions** - Join GitHub Discussions for questions
- **Contact** - Reach out for additional support
## Acknowledgments

- **Google Gemini** - For providing excellent AI capabilities
- **Streamlit** - For the amazing dashboard framework
- **Flask** - For the robust web framework
- **Python Community** - For the incredible ecosystem
## Next Steps

1. **Run the Application** - Follow the Quick Start guide
2. **Read Documentation** - Explore the comprehensive docs
3. **Customize Tools** - Add your own custom tools
4. **Enhance UI** - Improve the dashboard design
5. **Add Features** - Implement new functionality
6. **Deploy** - Take it to production

Ready to build the future of AI? Let's get started!

*Built with ❤️ for the AI community. Star this repo if you find it helpful!*