MCP Server with OpenAI, Git, Filesystem, and Prometheus Integration
This repository contains a Model Control Plane (MCP) server implementation that supports OpenAI services, Git repository analysis, local filesystem operations, and Prometheus integration.
Features
- Uses FastAPI as the web framework for the MCP server API endpoints
- Integrates with the OpenAI API to provide text completion and chat functionality via dedicated endpoints
- Provides Git repository analysis, search, and diff functionality through dedicated endpoints
- Enables querying and monitoring of metrics through Prometheus, including instant queries, range queries, alerts, and targets
- Integrates with Docker for running Prometheus, Node Exporter, cAdvisor, and memory stress testing
- Integrates with Langflow through the MCPAIComponent for use in Langflow pipelines
Project Structure
Requirements
Python 3.8+
FastAPI
Uvicorn
OpenAI SDK
GitPython
Requests
Docker and Docker Compose (for Prometheus features)
Installation
Clone this repository
Install the dependencies:
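Assuming the dependencies are listed in a `requirements.txt` at the repository root (a common convention; adjust the path if this repository differs):

```bash
pip install -r requirements.txt
```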
Environment Variables
Set the following environment variables:
For Azure OpenAI:
For Standard OpenAI:
For Prometheus:
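The exact variable names depend on the server implementation; the names below are common conventions and should be verified against the code:

```bash
# For Azure OpenAI (variable names are assumptions -- check the server code)
export AZURE_OPENAI_API_KEY="your-azure-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"

# For Standard OpenAI
export OPENAI_API_KEY="your-openai-api-key"

# For Prometheus
export PROMETHEUS_URL="http://localhost:9090"
```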
Running the Server
Start the MCP server:
Or for more options:
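A typical invocation for a FastAPI app looks like the following; the script and module names are assumptions, so match them to the repository's actual entry point:

```bash
# Run the server directly (if the repository provides a main script)
python mcp_server.py

# Or run via Uvicorn with extra options, e.g. auto-reload for development
uvicorn app:app --host 0.0.0.0 --port 8000 --reload
```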
The server will be available at http://localhost:8000.
Unified Testing Tool
We provide a unified testing script that gives you a user-friendly interface to all testing functionality:
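The script name below is illustrative; check the repository for the actual entry point:

```bash
./run_tests.sh
```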
This interactive script provides:
Filesystem tests
Git integration tests
Memory analysis tools
Prometheus tests & memory stress
MCP server management
Environment setup
Individual Tests
You can also run individual tests directly:
Test the OpenAI integration:
Test the Git integration (provide a Git repository URL):
Test the Git diff functionality (analyze requirements compatibility):
Test the filesystem functionality:
Test the Langflow integration with MCP:
Test the Prometheus integration:
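The test script names and arguments below are illustrative (the repository's actual file names may differ):

```bash
python test_openai.py
python test_git_integration.py https://github.com/username/repository.git
python test_git_diff.py https://github.com/username/repository.git
python test_filesystem.py
python test_langflow_integration.py
python test_prometheus.py
```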
Advanced Git Analysis
For more advanced Git repository analysis with AI recommendations:
You can also search for specific patterns in the repository:
Or analyze the last commit diff with AI insights:
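The commands below sketch how such an analyzer is typically invoked; the script name and flags are assumptions, not verified against this repository:

```bash
# Full AI-assisted repository analysis (script name illustrative)
python git_analyzer.py https://github.com/username/repository.git

# Search for specific patterns in the repository
python git_analyzer.py https://github.com/username/repository.git --search "pattern"

# Analyze the last commit diff with AI insights
python git_analyzer.py https://github.com/username/repository.git --diff
```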
Memory Analysis Tools
MCP includes several tools for memory monitoring and analysis:
You can also simulate memory pressure for testing:
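As a sketch (script names and flags are assumptions; check the repository's memory tools for the real interface):

```bash
# Report current memory usage (illustrative)
python memory_monitor.py

# Simulate memory pressure for testing (illustrative)
python memory_stress.py --target-mb 512
```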
Prometheus Integration
Setup
Start the Prometheus stack using Docker Compose:
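```bash
docker compose up -d
```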
This will start:
Prometheus server (accessible at http://localhost:9090)
Node Exporter (for host metrics)
cAdvisor (for container metrics)
For stress testing, you can start the memory stress container:
Or use the container test script:
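The service and script names below are assumptions; check `docker-compose.yml` and the repository root for the real names:

```bash
# Start the memory stress container (service name illustrative)
docker compose up -d memory-stress

# Or run the container test script (script name illustrative)
./container-test.sh
```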
Docker Configuration and Reset Scripts
This project includes multiple Docker configurations and reset scripts for reliable operation across different environments:
Docker Configurations
- Standard Configuration (docker-compose.yml): uses custom Dockerfiles for Prometheus and Langflow to ensure consistent permissions across systems.
- Bridge Network Configuration (docker-compose.bridge.yml): an alternative configuration that uses bridge networking for environments where host networking is problematic.
Custom Dockerfiles for Solving Permission Issues
The project uses custom Dockerfiles for both Prometheus and Langflow to solve common permission issues:
- Dockerfile.prometheus: sets up the Prometheus configuration with proper permissions for the nobody user.
- Dockerfile.langflow: copies the components directory into the container without changing file ownership, allowing Langflow to access the components without permission errors.
This approach eliminates the need for volume mounts that can lead to permission conflicts across different machines and user configurations.
Reset Scripts
- All Services Reset (reset-all.sh): reset all containers with a single command.

```bash
# Basic reset (rebuilds containers with existing volumes)
./reset-all.sh

# Full reset (removes volumes and rebuilds containers)
./reset-all.sh --clean
```

- Individual Service Reset:

```bash
# Reset only Prometheus
./reset-prometheus.sh

# Reset only Langflow
./reset-langflow.sh
```
These scripts ensure that the containers are properly configured with correct permissions and the latest code changes.
Troubleshooting
If you encounter permission issues:
Use the reset scripts to rebuild the containers
Check the logs with docker compose logs <service_name>
Make sure any components added to Langflow are included in the Dockerfile.langflow
Cross-Machine Deployment
When deploying to a new machine:
Clone the repository
Make the reset scripts executable: chmod +x *.sh
Run the reset script: ./reset-all.sh
The custom Dockerfiles automatically handle all permission issues that might occur across different systems.
Using Prometheus Client
The MCPAIComponent class includes Prometheus capabilities:
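A minimal usage sketch follows; the import path and method name are assumptions based on the endpoint list, so verify them against the component's source:

```python
# Hypothetical usage sketch -- import path and method names are assumptions.
from mcp_component import MCPAIComponent

mcp = MCPAIComponent(mcp_server_url="http://localhost:8000")

# Instant query: current memory usage on the host
result = mcp.prometheus_query(
    query="node_memory_MemTotal_bytes - node_memory_MemAvailable_bytes"
)
print(result)
```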
Useful PromQL Queries
- CPU Usage: rate(node_cpu_seconds_total{mode!="idle"}[1m])
- Memory Usage: node_memory_MemTotal_bytes - node_memory_MemAvailable_bytes
- Disk Usage: node_filesystem_avail_bytes{mountpoint="/"} / node_filesystem_size_bytes{mountpoint="/"}
- Container CPU Usage: rate(container_cpu_usage_seconds_total[1m])
- Container Memory Usage: container_memory_usage_bytes
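These expressions can also be issued programmatically against Prometheus's HTTP API. A small self-contained sketch that builds an instant-query URL (`/api/v1/query` is Prometheus's documented instant-query endpoint):

```python
from urllib.parse import urlencode

def instant_query_url(base_url: str, promql: str) -> str:
    """Build a Prometheus HTTP API instant-query URL for a PromQL expression."""
    return f"{base_url}/api/v1/query?{urlencode({'query': promql})}"

url = instant_query_url(
    "http://localhost:9090",
    'rate(node_cpu_seconds_total{mode!="idle"}[1m])',
)
print(url)
```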
API Endpoints
OpenAI Endpoints
- GET /v1/models - List all available models
- GET /v1/models/{model_id} - Get information about a specific model
- POST /v1/models/azure-gpt-4/completion - Generate a text completion using Azure OpenAI
- POST /v1/models/azure-gpt-4/chat - Generate a chat response using Azure OpenAI
- POST /v1/models/openai-gpt-chat/chat - Generate a chat response using the OpenAI chat model
- POST /v1/models/openai-gpt-completion/completion - Generate a text completion using the OpenAI completion model
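As a sketch, a chat request might look like the following; the request body schema is an assumption modeled on the OpenAI chat format, so verify it against the server code:

```python
import requests

# Hypothetical request sketch; requires the MCP server running on localhost:8000.
response = requests.post(
    "http://localhost:8000/v1/models/azure-gpt-4/chat",
    json={"messages": [{"role": "user", "content": "Hello, what can you do?"}]},
)
print(response.json())
```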
Git Integration Endpoints
- POST /v1/models/git-analyzer/analyze - Analyze a Git repository
- POST /v1/models/git-analyzer/search - Search a Git repository for files matching a pattern
- POST /v1/models/git-analyzer/diff - Get the diff of the last commit in a repository
Filesystem Endpoints
- POST /v1/models/filesystem/list - List the contents of a directory
- POST /v1/models/filesystem/read - Read a file's contents
- POST /v1/models/filesystem/read-multiple - Read multiple files at once
- POST /v1/models/filesystem/write - Write content to a file
- POST /v1/models/filesystem/edit - Edit a file with multiple replacements
- POST /v1/models/filesystem/mkdir - Create a directory
- POST /v1/models/filesystem/move - Move a file or directory
- POST /v1/models/filesystem/search - Search for files matching a pattern
- POST /v1/models/filesystem/info - Get information about a file or directory
Prometheus Endpoints
- POST /v1/models/prometheus/query - Execute an instant query
- POST /v1/models/prometheus/query_range - Execute a range query
- POST /v1/models/prometheus/series - Get series data
- GET /v1/models/prometheus/labels - Get all available labels
- POST /v1/models/prometheus/label_values - Get values for a specific label
- GET /v1/models/prometheus/targets - Get all targets
- GET /v1/models/prometheus/rules - Get all rules
- GET /v1/models/prometheus/alerts - Get all alerts
Client Usage
You can use the MCPAIComponent in your Langflow pipelines by providing the MCP server URL:
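A minimal sketch follows; the import path, constructor argument, and method name are assumptions, so check the component's source for the real API:

```python
# Hypothetical usage sketch -- names below are assumptions, not verified API.
from mcp_component import MCPAIComponent

mcp = MCPAIComponent(mcp_server_url="http://localhost:8000")

# List the models exposed by the MCP server
models = mcp.list_models()
print(models)
```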
Using the GitCodeAnalyzer Class
For more structured Git analysis, you can use the GitCodeAnalyzer class:
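A minimal sketch, assuming the class takes the MCP server URL and exposes an analysis method (the import path and method name are assumptions):

```python
# Hypothetical usage sketch -- class location and method names are assumptions.
from git_analyzer import GitCodeAnalyzer

analyzer = GitCodeAnalyzer(mcp_server_url="http://localhost:8000")

# Analyze a repository and get structured results
analysis = analyzer.analyze_repository("https://github.com/username/repository.git")
print(analysis)
```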
Troubleshooting
Prometheus Issues
Verify Prometheus is running: docker ps | grep prometheus
Check you can access the Prometheus UI: http://localhost:9090
Verify the MCP server is running and accessible
Check the MCP server logs for errors
Try simple queries first to verify connectivity (e.g., the up query)
OpenAI Issues
Verify your API keys are set correctly
Check for rate limiting or quota issues
Verify you're using supported models for your API key
Git Issues
Ensure the Git repository URL is accessible
Check for authentication issues if using private repositories
Ensure GitPython is installed correctly