# 🚀 MCP Agentic AI Server Project

[![Python](https://img.shields.io/badge/Python-3.12+-blue.svg)](https://www.python.org/downloads/) [![Flask](https://img.shields.io/badge/Flask-2.0+-green.svg)](https://flask.palletsprojects.com/) [![Streamlit](https://img.shields.io/badge/Streamlit-1.24+-red.svg)](https://streamlit.io/) [![Gemini](https://img.shields.io/badge/Google-Gemini_API-yellow.svg)](https://ai.google.dev/) [![License](https://img.shields.io/badge/License-MIT-purple.svg)](LICENSE)

> **A comprehensive Model Context Protocol (MCP) implementation featuring a dual AI server architecture, real-time monitoring, and an interactive dashboard.**

## 🌟 Project Overview

This project demonstrates a production-ready **MCP (Model Context Protocol) Agentic AI Server** system with:

- 🔧 **Custom MCP Server** - Task-based AI processing with tool integration
- 🌐 **Public MCP Server** - Direct AI query processing
- 🎨 **Interactive Dashboard** - Real-time monitoring and user interface
- 📊 **Live Statistics** - Performance metrics and analytics
- 🛠️ **Extensible Tools** - Modular tool framework for custom functionality

## 🏗️ Architecture

```
            ┌───────────────────────────────┐
            │    🎨 Streamlit Dashboard     │
            │         (Port 8501)           │
            └───────────────┬───────────────┘
                            │
              ┌─────────────┴─────────────┐
              ▼                           ▼
┌──────────────────────┐      ┌──────────────────────┐
│   🔧 Custom MCP      │      │   🌐 Public MCP      │
│    (Port 8000)       │      │    (Port 8001)       │
│                      │      │                      │
│ • Task Creation      │      │ • Direct Queries     │
│ • Tool Integration   │      │ • Simple AI Chat     │
│ • Async Processing   │      │ • Real-time Stats    │
└──────────┬───────────┘      └──────────┬───────────┘
           │                             │
           └──────────────┬──────────────┘
                          ▼
                ┌──────────────────┐
                │    🧠 Google     │
                │    Gemini API    │
                └──────────────────┘
```

## 🚀 Quick Start

### Prerequisites

- **Python 3.12+** (Conda environment recommended)
- **Google Gemini API Key** ([Get one here](https://ai.google.dev/))
- **Git** for cloning the repository

### 1. Clone & Setup

```bash
# Clone the repository
git clone <repository-url>
cd mcp_server_project

# Create and activate a virtual environment (recommended)
conda create -n mcp_env python=3.12
conda activate mcp_env

# Install dependencies
pip install -r requirements.txt
```

### 2. Environment Configuration

Create a `.env` file in the project root:

```env
GEMINI_API_KEY=your_gemini_api_key_here
```
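Both servers read `GEMINI_API_KEY` at startup. As a minimal sketch of how that lookup could work with only the standard library (`load_dotenv_file` is a hypothetical helper, not part of this repo, which may well use `python-dotenv` instead):

```python
import os

def load_dotenv_file(path: str = ".env") -> dict:
    """Parse simple KEY=value lines from a .env file into a dict."""
    values = {}
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue  # skip blanks and comments
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # fall back to the process environment only
    return values

# Prefer a real environment variable, then the .env file
api_key = os.environ.get("GEMINI_API_KEY") or load_dotenv_file().get("GEMINI_API_KEY")
```

Either way, keep `.env` out of version control so the key is never committed.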
### 3. Run the Application

Open **4 terminals** and run the following commands:

#### Terminal 1: Custom MCP Server 🔧

```bash
cd mcp-agentic-ai
python -m custom_mcp.server
```

_Server will start on http://localhost:8000_

#### Terminal 2: Public MCP Server 🌐

```bash
cd mcp-agentic-ai
python -m public_mcp.server_public
```

_Server will start on http://localhost:8001_

#### Terminal 3: Streamlit Dashboard 🎨

```bash
cd mcp-agentic-ai/streamlit_demo
streamlit run app.py
```

_Dashboard will open at http://localhost:8501_

#### Terminal 4: Test the APIs 🧪

```bash
# Test Custom MCP Server
curl -X POST http://localhost:8000/task \
  -H "Content-Type: application/json" \
  -d '{"input":"Hello World","tools":["sample_tool"]}'

# Test Public MCP Server
curl -X POST http://localhost:8001/ask \
  -H "Content-Type: application/json" \
  -d '{"query":"What is artificial intelligence?"}'
```

## 🎯 Features

### 🔧 Custom MCP Server Features

- **Asynchronous Task Processing** - Create tasks with unique IDs
- **Tool Integration Framework** - Extensible tool system
- **Performance Monitoring** - Real-time statistics tracking
- **Error Handling** - Robust error management and logging

### 🌐 Public MCP Server Features

- **Direct AI Queries** - Instant responses from Gemini
- **Simple API** - Easy-to-use REST endpoints
- **Statistics Tracking** - Performance metrics and analytics
- **High Availability** - Designed for concurrent requests

### 🎨 Dashboard Features

- **Modern UI Design** - Glassmorphism effects and animations
- **Real-time Updates** - Live statistics and performance metrics
- **Responsive Design** - Mobile-friendly interface
- **Interactive Forms** - Easy server selection and input handling

## 📊 API Documentation

### Custom MCP Server (Port 8000)

#### Create Task

```http
POST /task
Content-Type: application/json

{
  "input": "Your task description",
  "tools": ["sample_tool"]
}
```

Response:

```json
{"task_id": "uuid-string"}
```
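The Terminal 4 curl check against `/task` can also be made from Python with only the standard library. This is an illustrative sketch that assumes the Custom MCP server from Terminal 1 is running on port 8000; `build_task_payload` and `create_task` are hypothetical helper names, not part of this repo:

```python
import json
from urllib import request

CUSTOM_MCP_URL = "http://localhost:8000"  # server started in Terminal 1

def build_task_payload(text: str, tools: list[str]) -> bytes:
    """Encode the JSON body expected by POST /task."""
    return json.dumps({"input": text, "tools": tools}).encode("utf-8")

def create_task(text: str, tools: list[str]) -> str:
    """Create a task and return the server-assigned task id."""
    req = request.Request(
        f"{CUSTOM_MCP_URL}/task",
        data=build_task_payload(text, tools),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["task_id"]
```

With the server running, `create_task("Hello World", ["sample_tool"])` should return the same `task_id` UUID the curl example prints.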
#### Execute Task

```http
POST /task/{task_id}/run
```

Response:

```json
{
  "task_id": "uuid-string",
  "output": "AI generated response"
}
```

#### Get Statistics

```http
GET /stats
```

Response:

```json
{
  "queries_processed": 42,
  "response_time": 1.23,
  "success_rate": 95.5,
  "uptime": 120.5
}
```

### Public MCP Server (Port 8001)

#### Ask Question

```http
POST /ask
Content-Type: application/json

{
  "query": "Your question here"
}
```

Response:

```json
{"response": "AI generated answer"}
```

#### Get Statistics

```http
GET /stats
```

Response:

```json
{
  "queries_processed": 15,
  "response_time": 0.89,
  "success_rate": 100.0,
  "todays_queries": 15
}
```

## 🛠️ Project Structure

```
mcp_server_project/
├── 📄 README.md                  # This file
├── 📄 requirements.txt           # Python dependencies
├── 📄 .env                       # Environment variables
│
├── 📁 mcp-agentic-ai/            # Main application
│   ├── 📁 custom_mcp/            # Custom MCP server
│   │   ├── 📄 server.py          # Flask server (Port 8000)
│   │   ├── 📄 mcp_controller.py  # Business logic
│   │   └── 📁 tools/             # Custom tools
│   │       └── 📄 sample_tool.py # Example tool
│   │
│   ├── 📁 public_mcp/            # Public MCP server
│   │   ├── 📄 server_public.py   # Flask server (Port 8001)
│   │   └── 📄 agent_config.yaml  # AI configuration
│   │
│   └── 📁 streamlit_demo/        # Interactive dashboard
│       └── 📄 app.py             # Streamlit app (Port 8501)
│
└── 📁 documentation/             # Comprehensive docs
    ├── 📄 documentation.md       # Main documentation
    ├── 📄 workflows.md           # Mermaid workflows
    ├── 📄 designs.md             # Architecture diagrams
    └── 📄 tech-stack.md          # Technology details
```

## 🔧 Development

### Adding Custom Tools

1. Create a new tool file in `mcp-agentic-ai/custom_mcp/tools/`:

```python
# my_custom_tool.py
import logging

def my_custom_tool(text: str) -> str:
    """Your custom tool implementation."""
    logging.info(f"Processing: {text}")
    # Your logic here
    result = text.upper()  # Example transformation
    return result
```
2. Import and use in `mcp_controller.py`:

```python
from custom_mcp.tools.my_custom_tool import my_custom_tool

# Add to the run method
if "my_custom_tool" in task["tools"]:
    text = my_custom_tool(text)
```

### Extending the Dashboard

The Streamlit dashboard can be customized by modifying `streamlit_demo/app.py`:

- Add new UI components
- Implement additional statistics
- Create new visualizations
- Add export functionality

## 📚 Documentation

Comprehensive documentation is available in the `documentation/` folder:

- **📄 [Main Documentation](documentation/documentation.md)** - Complete project guide (1500+ lines)
- **📄 [Workflows](documentation/workflows.md)** - Mermaid workflow diagrams
- **📄 [Architecture](documentation/designs.md)** - System design diagrams
- **📄 [Tech Stack](documentation/tech-stack.md)** - Technology details

## 🎓 Learning Outcomes

By completing this project, you'll learn:

- **🤖 AI Integration** - Google Gemini API, prompt engineering
- **🔧 Backend Development** - Flask, REST APIs, microservices
- **🎨 Frontend Development** - Streamlit, modern CSS, responsive design
- **📊 System Monitoring** - Real-time statistics, performance tracking
- **🏗️ Architecture Design** - Microservices, event-driven patterns
- **🔐 Security Practices** - API security, environment management

## 🚀 Deployment

### Local Development

Follow the [Quick Start](#quick-start) guide above.

### Production Deployment

For production deployment, consider:

- **🐳 Docker** - Containerize each service
- **☸️ Kubernetes** - Orchestrate containers
- **🔒 HTTPS** - SSL/TLS certificates
- **📊 Monitoring** - Prometheus, Grafana
- **🗄️ Database** - PostgreSQL, Redis

## 🤝 Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

## 📝 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 🆘 Support

- **📚 Documentation** - Check the comprehensive docs in `/documentation/`
- **🐛 Issues** - Report bugs via GitHub Issues
- **💬 Discussions** - Join GitHub Discussions for questions
- **📧 Contact** - Reach out for additional support

## 🌟 Acknowledgments

- **Google Gemini** - For providing excellent AI capabilities
- **Streamlit** - For the amazing dashboard framework
- **Flask** - For the robust web framework
- **Python Community** - For the incredible ecosystem

---

## 🎯 Next Steps

1. **🚀 Run the Application** - Follow the Quick Start guide
2. **📚 Read Documentation** - Explore the comprehensive docs
3. **🔧 Customize Tools** - Add your own custom tools
4. **🎨 Enhance UI** - Improve the dashboard design
5. **📊 Add Features** - Implement new functionality
6. **🚀 Deploy** - Take it to production

**Ready to build the future of AI? Let's get started! 🚀**

---

_Built with ❤️ for the AI community. Star ⭐ this repo if you find it helpful!_
