# Model Context Protocol (MCP)

## Overview
MCP is a modular framework for managing, executing, and monitoring AI model contexts, including LLM prompts, Jupyter notebooks, and Python scripts. It provides a FastAPI backend and a Streamlit dashboard frontend.
## Features
- Register and manage different types of MCPs (LLM prompts, notebooks, scripts)
- Execute MCPs and view results in a web UI
- Monitor server health and statistics
- Extensible for new MCP types
## Setup

### Prerequisites
- Python 3.9+
- (Recommended) Create and activate a virtual environment
### Install dependencies
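Assuming dependencies are listed in a `requirements.txt` at the project root (the standard convention; the exact file name is an assumption):

```shell
# Install all Python dependencies (file name assumed)
pip install -r requirements.txt
```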
### Environment Variables

- Set `MCP_API_KEY` for API authentication (optional, defaults provided)
- For LLMs, set `ANTHROPIC_API_KEY` if using Claude
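For example, the variables can be exported in the shell before starting the server (the values below are placeholders):

```shell
# Placeholder values -- replace with your own keys
export MCP_API_KEY="your-api-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
```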
### Start the backend
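A typical invocation, assuming the FastAPI app object lives at `mcp/api/main.py` (the module path is an assumption; adjust it to wherever the app is defined):

```shell
# Serve the FastAPI backend on port 8000 (matches the URLs under API Documentation)
uvicorn mcp.api.main:app --host 0.0.0.0 --port 8000
```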
### Start the frontend
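The Streamlit dashboard entry point is `mcp/ui/app.py` (referenced under Adding New MCPs); Streamlit serves on port 8501 by default:

```shell
# Launch the Streamlit dashboard at http://localhost:8501
streamlit run mcp/ui/app.py
```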
## Usage
- Access the dashboard at http://localhost:8501
- Create, manage, and test MCPs from the UI
- Monitor health and stats from the sidebar
## Adding New MCPs
- Implement a new MCP class in `mcp/core/`
- Register it in the backend
- Add UI support in `mcp/ui/app.py`
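As a sketch of the first step, a new MCP type might look like the following; the base-class name `BaseMCP` and the `execute` method are illustrative assumptions, not the project's actual interface:

```python
from dataclasses import dataclass


@dataclass
class BaseMCP:
    """Stand-in for the real base class in mcp/core/ (name is hypothetical)."""
    name: str

    def execute(self, **kwargs) -> dict:
        raise NotImplementedError


class EchoMCP(BaseMCP):
    """Minimal MCP type that returns its input unchanged."""

    def execute(self, text: str = "") -> dict:
        return {"status": "success", "output": text}
```

The real base class will likely define more structure (input validation, result schemas), but the pattern of subclassing a core type and overriding an execution hook is the usual shape for this kind of registry.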
## Running Tests
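Assuming the suite in `tests/` uses pytest (the standard choice for FastAPI projects; the test runner is an assumption):

```shell
# Run the full test suite from the project root
pytest tests/
```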
## Project Structure

- `mcp/api/` - FastAPI backend
- `mcp/ui/` - Streamlit frontend
- `mcp/core/` - Core MCP types and logic
- `tests/` - Test suite
## License
MIT
## API Documentation
Once the server is running, you can access:
- API documentation: http://localhost:8000/docs
- Prometheus metrics: http://localhost:8000/metrics
- Health check: http://localhost:8000/health
- Statistics: http://localhost:8000/stats
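For example, the health check can be queried with curl; since the Security section says endpoints require an API key, the key is passed in a header here (the header name `X-API-Key` is an assumption):

```shell
# Query the health endpoint (header name assumed)
curl -H "X-API-Key: $MCP_API_KEY" http://localhost:8000/health
```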
## Security
- API key authentication is required for all endpoints
- Rate limiting is enabled by default
- CORS is configured to allow only specific origins
- All sensitive configuration is managed through environment variables
## Monitoring
The server includes:
- Prometheus metrics for request counts, latencies, and server executions
- Structured JSON logging
- Health check endpoint
- Server statistics endpoint
## Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
## Additional Dependencies for Notebook and LLM Integration
This project now requires the following additional Python packages:
- pandas
- numpy
- matplotlib
- papermill
- nbformat
- jupyter
- anthropic
Install all dependencies with:
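The packages listed above can be installed directly (or added to the project's requirements file):

```shell
pip install pandas numpy matplotlib papermill nbformat jupyter anthropic
```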
## Using the Notebook MCP to Call an LLM (Claude)
The example notebook (`mcp/notebooks/example.ipynb`) demonstrates:

- Data analysis and plotting
- Calling the Claude LLM via the `anthropic` Python package

To use the LLM cell, ensure you have set your `ANTHROPIC_API_KEY` in your environment or `.env` file.
The notebook cell for LLM looks like this:
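The original cell contents are in the notebook itself; a minimal sketch of such a cell, using the `anthropic` package's Messages API (the model name and prompt below are placeholders), might look like:

```python
import os

prompt = "Summarize the key trends in the plotted data."

# Only call the API when a key is available, so the cell degrades gracefully.
if os.environ.get("ANTHROPIC_API_KEY"):
    from anthropic import Anthropic

    client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    message = client.messages.create(
        model="claude-3-5-sonnet-20240620",  # placeholder model name
        max_tokens=512,
        messages=[{"role": "user", "content": prompt}],
    )
    print(message.content[0].text)
else:
    print("ANTHROPIC_API_KEY not set; skipping LLM call")
```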