
Microservice Control Panel (MCP)

by Chunkys0up7

Model Context Protocol (MCP)

Overview

MCP is a modular framework for managing, executing, and monitoring AI model contexts, including LLM prompts, Jupyter notebooks, and Python scripts. It provides a FastAPI backend and a modern React (MUI/ReactFlow) dashboard frontend (see frontend/).

Features

  • Register and manage different types of MCPs (LLM prompts, notebooks, scripts)
  • Execute MCPs and view results in a web UI
  • Monitor server health and statistics
  • Extensible for new MCP types
  • High-performance database operations with connection pooling
  • Query caching with Redis
  • PostgreSQL index optimization
  • System monitoring and metrics collection
  • AI Co-Pilot for workflow optimization
  • Dependency visualization and analysis

Setup

Prerequisites

  • Python 3.9+
  • PostgreSQL 12+
  • Redis 6+
  • (Recommended) Create and activate a virtual environment

Install dependencies

pip install -r requirements.txt

Environment Variables

  • Set MCP_API_KEY for API authentication (optional, defaults provided)
  • For LLMs, set ANTHROPIC_API_KEY if using Claude
  • Set DATABASE_URL for PostgreSQL connection
  • Set REDIS_URL for Redis connection
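The variables above can be read with `os.environ` lookups and fallbacks. A minimal sketch of a settings loader follows; the variable names match this README, but the default values and the `load_config` helper are illustrative placeholders, not the project's actual configuration code.

```python
import os

def load_config(env=os.environ):
    """Hypothetical settings loader; defaults shown here are placeholders."""
    return {
        "DATABASE_URL": env.get("DATABASE_URL", "postgresql://localhost:5432/mcp"),
        "REDIS_URL": env.get("REDIS_URL", "redis://localhost:6379/0"),
        "MCP_API_KEY": env.get("MCP_API_KEY"),              # optional; defaults provided
        "ANTHROPIC_API_KEY": env.get("ANTHROPIC_API_KEY"),  # only needed for Claude
    }

cfg = load_config({"DATABASE_URL": "postgresql://db:5432/mcp"})
```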

Start the backend

uvicorn mcp.api.main:app --reload

Start the frontend

cd frontend
npm install
npm run dev

Usage

  • Access the dashboard at http://localhost:5173 (default React dev server)
  • Create, manage, and test MCPs from the UI
  • Monitor health and stats from the sidebar
  • Use the AI Co-Pilot for workflow optimization
  • Visualize component dependencies
  • Monitor system performance

Components

Database Management

  • Connection pooling for efficient database access
  • Query caching with Redis for improved performance
  • PostgreSQL index optimization for faster queries
  • Database monitoring and statistics
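To illustrate the query-caching idea, here is a tiny in-process TTL cache. In MCP, Redis plays this role for query results; the `QueryCache` class below is a stand-in sketch, not the project's implementation.

```python
import time

class QueryCache:
    """Tiny TTL cache sketch; Redis fills this role in the real system."""

    def __init__(self, ttl_seconds: float = 30.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # expired; evict and report a miss
            return None
        return value

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, value)

cache = QueryCache(ttl_seconds=60)
cache.set("SELECT 1", [(1,)])
```

A cache hit skips the database round-trip entirely; the TTL bounds how stale a cached result can get.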

System Monitoring

  • Real-time system health monitoring
  • Performance metrics collection
  • Alerting system with severity levels
  • Prometheus metrics integration
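The alerting idea can be sketched as a threshold check that maps a metric reading to a severity level. The thresholds and level names below are illustrative assumptions; the actual values live in `mcp/monitoring/`.

```python
def classify_cpu_alert(cpu_percent: float) -> str:
    """Map a CPU reading to a severity level (thresholds are illustrative)."""
    if cpu_percent >= 95:
        return "critical"
    if cpu_percent >= 80:
        return "warning"
    return "ok"

print(classify_cpu_alert(97))
```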

AI Co-Pilot

  • Workflow optimization suggestions
  • Error resolution assistance
  • Best practice recommendations
  • Performance improvements

Dependency Visualizer

  • Component relationship visualization
  • Dependency conflict detection
  • Version compatibility checking
  • Visual dependency mapping
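Dependency conflict detection boils down to finding cycles in a directed graph of components. A minimal depth-first-search sketch is shown below; the real visualizer builds on networkx/graphviz, and the `find_cycle` helper here is only an illustration of the idea.

```python
def find_cycle(graph: dict) -> bool:
    """Return True if the directed dependency graph contains a cycle (DFS sketch)."""
    visiting, done = set(), set()

    def dfs(node):
        if node in done:
            return False
        if node in visiting:  # back edge: we looped onto the current DFS path
            return True
        visiting.add(node)
        for dep in graph.get(node, []):
            if dfs(dep):
                return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(dfs(n) for n in graph)

deps = {"api": ["core"], "core": ["db"], "db": []}
```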

Adding New MCPs

  • Implement a new MCP class in mcp/core/
  • Register it in the backend
  • Add UI support in the React frontend (frontend/)
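The steps above can be sketched as follows. The class shape, the `execute` signature, and the `EchoMCP` name are assumptions for illustration; check the existing types in `mcp/core/` for the actual base class and registration hook.

```python
from dataclasses import dataclass

@dataclass
class EchoMCP:
    """Hypothetical minimal MCP type: echoes its input context back as output."""
    name: str = "echo"

    def execute(self, context: dict) -> dict:
        # A real MCP would run a prompt, notebook, or script here.
        return {"status": "success", "output": context}

mcp = EchoMCP()
result = mcp.execute({"prompt": "hello"})
```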

Running Tests

pytest

Project Structure

  • mcp/api/ - FastAPI backend
  • frontend/ - React (MUI/ReactFlow) frontend (primary UI)
  • mcp/core/ - Core MCP types and logic
  • mcp/db/ - Database management and optimization
  • mcp/monitoring/ - System monitoring and metrics
  • mcp/components/ - AI Co-Pilot and Dependency Visualizer
  • tests/ - Test suite

License

MIT

API Documentation

Once the server is running, you can access:

  • API documentation: http://localhost:8000/docs
  • Prometheus metrics: http://localhost:8000/metrics
  • Health check: http://localhost:8000/health
  • Statistics: http://localhost:8000/stats

Security

  • API key authentication is required for all endpoints
  • Rate limiting is enabled by default
  • CORS is configured to allow only specific origins
  • All sensitive configuration is managed through environment variables
  • Database connection pooling with health checks
  • Redis connection security

Monitoring

The server includes:

  • Prometheus metrics for request counts, latencies, and server executions
  • Structured JSON logging
  • Health check endpoint
  • Server statistics endpoint
  • System resource monitoring
  • Database performance metrics
  • Cache statistics
  • Alert management

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

Additional Dependencies

This project requires the following additional Python packages:

Core Dependencies

  • pandas
  • numpy
  • matplotlib
  • papermill
  • nbformat
  • jupyter
  • anthropic

Database Dependencies

  • sqlalchemy
  • psycopg2-binary
  • redis
  • alembic

Monitoring Dependencies

  • prometheus-client
  • psutil
  • networkx
  • graphviz

Install all dependencies with:

pip install -r requirements.txt

Using the Notebook MCP to Call an LLM (Claude)

The example notebook (mcp/notebooks/example.ipynb) demonstrates:

  • Data analysis and plotting
  • Calling the Claude LLM via the anthropic Python package

To use the LLM cell, ensure you have set your ANTHROPIC_API_KEY in your environment or .env file.

The notebook cell for LLM looks like this:

import os
import anthropic

api_key = os.getenv('ANTHROPIC_API_KEY')
if not api_key:
    raise ValueError('ANTHROPIC_API_KEY not set in environment!')

client = anthropic.Anthropic(api_key=api_key)
response = client.messages.create(
    model='claude-3-sonnet-20240229',
    max_tokens=256,
    temperature=0.7,
    messages=[
        {'role': 'user', 'content': 'Tell me a joke about data science.'}
    ]
)
print('Claude says:', response.content[0].text)

API Key Management and Authentication

Creating and Managing API Keys

  • Use the /api/apikeys/ endpoint to create a new API key (admin or self-service).
  • List your API keys with GET /api/apikeys/.
  • Revoke an API key with POST /api/apikeys/revoke.

Authenticating with API Keys

  • Pass your API key in the X-API-KEY header for any authenticated endpoint.
  • Alternatively, you can use a Bearer JWT in the Authorization header.
  • All endpoints that require authentication now support both methods.

Example:

GET /api/apikeys/
X-API-KEY: <your-api-key>
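The same request can be built from Python with the standard library. The base URL and the `demo-key` value below are placeholders; substitute your server address and a real API key.

```python
import urllib.request

def apikeys_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request for the API-key listing endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/apikeys/",
        headers={"X-API-KEY": api_key},
        method="GET",
    )

req = apikeys_request("http://localhost:8000", "demo-key")
# Send with urllib.request.urlopen(req) once the backend is running.
```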

Execution Monitor Enhancements (UI)

The Execution Monitor now includes the following panels (with mock data, ready for backend integration):

  • Resource Usage Panel: View CPU and memory usage per workflow step.
  • Time-Travel Debugging Panel: Select a step to view its state at execution time.
  • Performance Suggestions Panel: See optimization and bottleneck suggestions.
  • Real-Time Metrics Dashboard: Monitor live metrics (CPU, memory, throughput, etc.).

These panels are integrated into the workflow Execution Monitor and will display real data once backend support is available.

UI Notice

The React frontend in frontend/ is the only supported UI. The previous Streamlit UI (mcp/ui/) has been removed. Please use the React app for all development and usage.
