
Microservice Control Panel (MCP)

by Chunkys0up7

Model Context Protocol (MCP)

Overview

MCP is a modular framework for managing, executing, and monitoring AI model contexts, including LLM prompts, Jupyter notebooks, and Python scripts. It provides a FastAPI backend and a Streamlit dashboard frontend.

Features

  • Register and manage different types of MCPs (LLM prompts, notebooks, scripts)
  • Execute MCPs and view results in a web UI
  • Monitor server health and statistics
  • Extensible for new MCP types

Setup

Prerequisites

  • Python 3.9+
  • (Recommended) Create and activate a virtual environment

Install dependencies

pip install -r requirements.txt

Environment Variables

  • Set MCP_API_KEY for API authentication (optional, defaults provided)
  • For LLMs, set ANTHROPIC_API_KEY if using Claude
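For local development, these can be exported in the shell (or placed in a .env file). The values below are placeholders, not real keys:

```shell
# Placeholder values -- substitute your own keys.
export MCP_API_KEY="your-mcp-api-key"            # used for API authentication
export ANTHROPIC_API_KEY="your-anthropic-key"    # only needed when calling Claude
```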

Start the backend

uvicorn mcp.api.main:app --reload

Start the frontend

streamlit run mcp/ui/app.py

Usage

  • Access the dashboard at http://localhost:8501
  • Create, manage, and test MCPs from the UI
  • Monitor health and stats from the sidebar

Adding New MCPs

  • Implement a new MCP class in mcp/core/
  • Register it in the backend
  • Add UI support in mcp/ui/app.py

Running Tests

pytest

Project Structure

  • mcp/api/ - FastAPI backend
  • mcp/ui/ - Streamlit frontend
  • mcp/core/ - Core MCP types and logic
  • tests/ - Test suite

License

MIT

API Documentation

Once the server is running, you can access FastAPI's auto-generated interactive docs, by default at http://localhost:8000/docs (Swagger UI) and http://localhost:8000/redoc (ReDoc).

Security

  • API key authentication is required for all endpoints
  • Rate limiting is enabled by default
  • CORS is configured to allow only specific origins
  • All sensitive configuration is managed through environment variables
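The key check and rate limit can be sketched with the standard library alone. The names below (check_api_key, RateLimiter) are illustrative assumptions, not the project's actual implementation in mcp/api/:

```python
import hmac
import os
import time
from collections import defaultdict

def check_api_key(provided: str) -> bool:
    """Compare the caller's key against MCP_API_KEY from the environment."""
    expected = os.getenv("MCP_API_KEY", "")
    # Constant-time comparison avoids leaking key contents via timing.
    return bool(expected) and hmac.compare_digest(provided, expected)

class RateLimiter:
    """Fixed-window limiter: at most `limit` calls per `window` seconds per client."""
    def __init__(self, limit: int = 60, window: float = 60.0):
        self.limit, self.window = limit, window
        self.calls = defaultdict(list)  # client id -> recent call timestamps

    def allow(self, client: str) -> bool:
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        recent = [t for t in self.calls[client] if now - t < self.window]
        self.calls[client] = recent
        if len(recent) >= self.limit:
            return False
        recent.append(now)
        return True
```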

Monitoring

The server includes:

  • Prometheus metrics for request counts, latencies, and server executions
  • Structured JSON logging
  • Health check endpoint
  • Server statistics endpoint
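As an illustration of the structured-logging piece, a minimal JSON formatter using only the standard library (a sketch; the server's real log schema may carry more fields):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object per line."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("mcp")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("server started")  # emits: {"level": "INFO", "logger": "mcp", "message": "server started"}
```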

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

Additional Dependencies for Notebook and LLM Integration

This project now requires the following additional Python packages:

  • pandas
  • numpy
  • matplotlib
  • papermill
  • nbformat
  • jupyter
  • anthropic

Install all dependencies with:

pip install -r requirements.txt

Using the Notebook MCP to Call an LLM (Claude)

The example notebook (mcp/notebooks/example.ipynb) demonstrates:

  • Data analysis and plotting
  • Calling the Claude LLM via the anthropic Python package

To use the LLM cell, ensure you have set your ANTHROPIC_API_KEY in your environment or .env file.

The LLM cell in the notebook looks like this:

import os

import anthropic

api_key = os.getenv('ANTHROPIC_API_KEY')
if not api_key:
    raise ValueError('ANTHROPIC_API_KEY not set in environment!')

client = anthropic.Anthropic(api_key=api_key)
response = client.messages.create(
    model='claude-3-sonnet-20240229',
    max_tokens=256,
    temperature=0.7,
    messages=[
        {'role': 'user', 'content': 'Tell me a joke about data science.'}
    ]
)
print('Claude says:', response.content[0].text)



