
MCP Agentic AI Server

by itsDurvank

πŸš€ MCP Agentic AI Server Project

Python Β· Flask Β· Streamlit Β· Google Gemini Β· MIT License

A comprehensive Model Context Protocol (MCP) implementation featuring dual AI server architecture, real-time monitoring, and an interactive dashboard.

🌟 Project Overview

This project demonstrates a production-ready MCP (Model Context Protocol) Agentic AI Server system with:

  • πŸ”§ Custom MCP Server - Task-based AI processing with tool integration

  • 🌐 Public MCP Server - Direct AI query processing

  • 🎨 Interactive Dashboard - Real-time monitoring and user interface

  • πŸ“Š Live Statistics - Performance metrics and analytics

  • πŸ› οΈ Extensible Tools - Modular tool framework for custom functionality

πŸ—οΈ Architecture

        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
        β”‚       🎨 Streamlit Dashboard        β”‚
        β”‚            (Port 8501)              β”‚
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                           β”‚
             β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
             β–Ό                             β–Ό
  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
  β”‚   πŸ”§ Custom MCP     β”‚    β”‚   🌐 Public MCP     β”‚
  β”‚    (Port 8000)      β”‚    β”‚    (Port 8001)      β”‚
  β”‚                     β”‚    β”‚                     β”‚
  β”‚ β€’ Task Creation     β”‚    β”‚ β€’ Direct Queries    β”‚
  β”‚ β€’ Tool Integration  β”‚    β”‚ β€’ Simple AI Chat    β”‚
  β”‚ β€’ Async Processing  β”‚    β”‚ β€’ Real-time Stats   β”‚
  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
             β”‚                            β”‚
             β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                           β–Ό
                 β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
                 β”‚   🧠 Google     β”‚
                 β”‚   Gemini API    β”‚
                 β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

πŸš€ Quick Start

Prerequisites

  • Python 3.12+ (Conda environment recommended)

  • Google Gemini API Key (Get one here)

  • Git for cloning the repository

1. Clone & Setup

# Clone the repository
git clone <repository-url>
cd mcp_server_project

# Create and activate virtual environment (recommended)
conda create -n mcp_env python=3.12
conda activate mcp_env

# Install dependencies
pip install -r requirements.txt

2. Environment Configuration

Create a .env file in the project root:

GEMINI_API_KEY=your_gemini_api_key_here
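
To confirm the key is visible before launching the servers, a quick check like the one below can help. This is a minimal sketch, assuming the project loads the key with python-dotenv (a common choice for Flask apps); adjust it if your setup reads the key differently.

# check_env.py -- hypothetical helper, not part of the repository
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the current directory

key = os.getenv("GEMINI_API_KEY")
if key:
    print(f"GEMINI_API_KEY loaded ({len(key)} characters)")
else:
    print("GEMINI_API_KEY is missing - check your .env file")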

3. Run the Application

Open 4 terminals and run the following commands:

Terminal 1: Custom MCP Server πŸ”§

cd mcp-agentic-ai
python -m custom_mcp.server

Server will start on http://localhost:8000

Terminal 2: Public MCP Server 🌐

cd mcp-agentic-ai
python -m public_mcp.server_public

Server will start on http://localhost:8001

Terminal 3: Streamlit Dashboard 🎨

cd mcp-agentic-ai/streamlit_demo
streamlit run app.py

Dashboard will open at http://localhost:8501

Terminal 4: Test the APIs πŸ§ͺ

# Test Custom MCP Server
curl -X POST http://localhost:8000/task \
  -H "Content-Type: application/json" \
  -d '{"input":"Hello World","tools":["sample_tool"]}'

# Test Public MCP Server
curl -X POST http://localhost:8001/ask \
  -H "Content-Type: application/json" \
  -d '{"query":"What is artificial intelligence?"}'

🎯 Features

πŸ”§ Custom MCP Server Features

  • Asynchronous Task Processing - Create tasks with unique IDs

  • Tool Integration Framework - Extensible tool system

  • Performance Monitoring - Real-time statistics tracking

  • Error Handling - Robust error management and logging

🌐 Public MCP Server Features

  • Direct AI Queries - Instant responses from Gemini

  • Simple API - Easy-to-use REST endpoints

  • Statistics Tracking - Performance metrics and analytics

  • High Availability - Designed for concurrent requests

🎨 Dashboard Features

  • Modern UI Design - Glassmorphism effects and animations

  • Real-time Updates - Live statistics and performance metrics

  • Responsive Design - Mobile-friendly interface

  • Interactive Forms - Easy server selection and input handling

πŸ“Š API Documentation

Custom MCP Server (Port 8000)

Create Task

POST /task
Content-Type: application/json

{
  "input": "Your task description",
  "tools": ["sample_tool"]
}

Response: {"task_id": "uuid-string"}

Execute Task

POST /task/{task_id}/run

Response:
{
  "task_id": "uuid-string",
  "output": "AI generated response"
}

Get Statistics

GET /stats

Response:
{
  "queries_processed": 42,
  "response_time": 1.23,
  "success_rate": 95.5,
  "uptime": 120.5
}
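
For scripted access, the same three endpoints can be driven from Python. The sketch below is illustrative (it is not shipped with the project) and uses the requests library against the routes documented above:

# custom_mcp_client.py -- hypothetical example client for the Custom MCP Server
import requests

BASE = "http://localhost:8000"

# 1. Create a task that uses the bundled sample_tool
created = requests.post(
    f"{BASE}/task",
    json={"input": "Summarize the benefits of MCP", "tools": ["sample_tool"]},
    timeout=30,
)
task_id = created.json()["task_id"]

# 2. Execute the task and print the AI-generated output
result = requests.post(f"{BASE}/task/{task_id}/run", timeout=60)
print(result.json()["output"])

# 3. Read the live statistics
print(requests.get(f"{BASE}/stats", timeout=10).json())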

Public MCP Server (Port 8001)

Ask Question

POST /ask
Content-Type: application/json

{
  "query": "Your question here"
}

Response: {"response": "AI generated answer"}

Get Statistics

GET /stats

Response:
{
  "queries_processed": 15,
  "response_time": 0.89,
  "success_rate": 100.0,
  "todays_queries": 15
}
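
Querying the Public MCP Server is a single call per question. An equally illustrative sketch using requests:

# public_mcp_client.py -- hypothetical example client for the Public MCP Server
import requests

answer = requests.post(
    "http://localhost:8001/ask",
    json={"query": "What is artificial intelligence?"},
    timeout=60,
)
print(answer.json()["response"])

# Check today's statistics
print(requests.get("http://localhost:8001/stats", timeout=10).json())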

πŸ› οΈ Project Structure

mcp_server_project/
β”œβ”€β”€ πŸ“„ README.md                 # This file
β”œβ”€β”€ πŸ“„ requirements.txt          # Python dependencies
β”œβ”€β”€ πŸ“„ .env                      # Environment variables
β”‚
β”œβ”€β”€ πŸ“ mcp-agentic-ai/           # Main application
β”‚   β”œβ”€β”€ πŸ“ custom_mcp/           # Custom MCP server
β”‚   β”‚   β”œβ”€β”€ πŸ“„ server.py         # Flask server (Port 8000)
β”‚   β”‚   β”œβ”€β”€ πŸ“„ mcp_controller.py # Business logic
β”‚   β”‚   └── πŸ“ tools/            # Custom tools
β”‚   β”‚       └── πŸ“„ sample_tool.py  # Example tool
β”‚   β”‚
β”‚   β”œβ”€β”€ πŸ“ public_mcp/           # Public MCP server
β”‚   β”‚   β”œβ”€β”€ πŸ“„ server_public.py  # Flask server (Port 8001)
β”‚   β”‚   └── πŸ“„ agent_config.yaml # AI configuration
β”‚   β”‚
β”‚   └── πŸ“ streamlit_demo/       # Interactive dashboard
β”‚       └── πŸ“„ app.py            # Streamlit app (Port 8501)
β”‚
└── πŸ“ documentation/            # Comprehensive docs
    β”œβ”€β”€ πŸ“„ documentation.md      # Main documentation
    β”œβ”€β”€ πŸ“„ workflows.md          # Mermaid workflows
    β”œβ”€β”€ πŸ“„ designs.md            # Architecture diagrams
    └── πŸ“„ tech-stack.md         # Technology details

πŸ”§ Development

Adding Custom Tools

  1. Create a new tool file in mcp-agentic-ai/custom_mcp/tools/:

# my_custom_tool.py
import logging

def my_custom_tool(text: str) -> str:
    """Your custom tool implementation."""
    logging.info(f"Processing: {text}")
    # Your logic here
    result = text.upper()  # Example transformation
    return result

  2. Import and use it in mcp_controller.py:

from custom_mcp.tools.my_custom_tool import my_custom_tool

# Add to the run method
if "my_custom_tool" in task["tools"]:
    text = my_custom_tool(text)
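
With the tool registered, you can exercise it by naming it in the tools array of a task request. The snippet below is a hedged example (it assumes the Custom MCP Server is running on port 8000 as described in the Quick Start):

# try_my_custom_tool.py -- illustrative request that routes a task through the new tool
import requests

created = requests.post(
    "http://localhost:8000/task",
    json={"input": "hello from my custom tool", "tools": ["my_custom_tool"]},
    timeout=30,
)
task_id = created.json()["task_id"]

result = requests.post(f"http://localhost:8000/task/{task_id}/run", timeout=60)
print(result.json()["output"])  # expected to reflect the tool's transformation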

Extending the Dashboard

The Streamlit dashboard can be customized by modifying streamlit_demo/app.py (see the sketch after this list):

  • Add new UI components

  • Implement additional statistics

  • Create new visualizations

  • Add export functionality

πŸ“š Documentation

Comprehensive documentation is available in the documentation/ folder:

  • πŸ“„ - Complete project guide (1500+ lines)

  • πŸ“„ - Mermaid workflow diagrams

  • πŸ“„ - System design diagrams

  • πŸ“„ - Technology details

πŸŽ“ Learning Outcomes

By completing this project, you'll learn:

  • πŸ€– AI Integration - Google Gemini API, prompt engineering

  • πŸ”§ Backend Development - Flask, REST APIs, microservices

  • 🎨 Frontend Development - Streamlit, modern CSS, responsive design

  • πŸ“Š System Monitoring - Real-time statistics, performance tracking

  • πŸ—οΈ Architecture Design - Microservices, event-driven patterns

  • πŸ” Security Practices - API security, environment management

πŸš€ Deployment

Local Development

Follow the Quick Start guide above.

Production Deployment

For production deployment, consider:

  • 🐳 Docker - Containerize each service

  • ☸️ Kubernetes - Orchestrate containers

  • πŸ”’ HTTPS - SSL/TLS certificates

  • πŸ“Š Monitoring - Prometheus, Grafana

  • πŸ—„οΈ Database - PostgreSQL, Redis

🀝 Contributing

  1. Fork the repository

  2. Create a feature branch (git checkout -b feature/amazing-feature)

  3. Commit your changes (git commit -m 'Add amazing feature')

  4. Push to the branch (git push origin feature/amazing-feature)

  5. Open a Pull Request

πŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ†˜ Support

  • πŸ“š Documentation - Check the comprehensive docs in /documentation/

  • πŸ› Issues - Report bugs via GitHub Issues

  • πŸ’¬ Discussions - Join GitHub Discussions for questions

  • πŸ“§ Contact - Reach out for additional support

🌟 Acknowledgments

  • Google Gemini - For providing excellent AI capabilities

  • Streamlit - For the amazing dashboard framework

  • Flask - For the robust web framework

  • Python Community - For the incredible ecosystem


🎯 Next Steps

  1. πŸš€ Run the Application - Follow the Quick Start guide

  2. πŸ“š Read Documentation - Explore the comprehensive docs

  3. πŸ”§ Customize Tools - Add your own custom tools

  4. 🎨 Enhance UI - Improve the dashboard design

  5. πŸ“Š Add Features - Implement new functionality

  6. πŸš€ Deploy - Take it to production

Ready to build the future of AI? Let's get started! πŸš€


Built with ❀️ for the AI community. Star ⭐ this repo if you find it helpful!
