
PM Counter Monitoring System

A system for collecting telecom performance monitoring (PM) counters from remote SFTP locations, storing them in a time-series database, and exposing them through REST API endpoints, an MCP server, and a Streamlit chat interface.

Architecture

Remote SFTP Location
        ↓ (periodic fetch)
Job Server
        ↓
PostgreSQL Database
        ↓
MCP Server ← API Endpoints ← Streamlit Frontend

Components

  1. SFTP Client (sftp_client.py) - Handles file downloads from remote SFTP server

  2. Job Server (job_server.py) - Periodically fetches and processes XML files

  3. XML Parser (xml_parser.py) - Parses PM counter XML files

  4. Database (database.py) - PostgreSQL schema and models

  5. Data Storage (data_storage.py) - Saves parsed data to database

  6. API Server (api_server.py) - FastAPI REST endpoints

  7. MCP Server (mcp_server.py) - Model Context Protocol server

  8. Streamlit Frontend (streamlit_app.py) - Chatbot interface

Quick Start (Docker)

The easiest way to run the entire system is with Docker Compose:

# Build and start all services
make build
make up

# Or using docker-compose directly
docker-compose up -d

# Initialize database schema
make init-db

# View logs
make logs

# Access the application:
# - Streamlit:  http://localhost:8501
# - API:        http://localhost:8000
# - MCP Server: http://localhost:8001

The Docker setup includes:

  • PostgreSQL database

  • SFTP server (for testing, with example XML files)

  • Job server (fetches files every hour)

  • API server

  • MCP server

  • Streamlit frontend

All services are automatically configured to work together.

Manual Setup (Without Docker)

1. Install Dependencies

pip install -r requirements.txt

2. Configure Environment

Copy .env.example to .env and update with your settings:

cp .env.example .env

Edit .env with your database and SFTP credentials.

3. Setup Remote Location (SFTP Server)

The remote location is where your XML files are stored. You have several options:

Option A: Use Local Files for Testing (Easiest)

# Process local XML files directly (no SFTP needed)
python test_local_files.py

Option B: Set Up Local SFTP Server See SETUP_REMOTE.md for detailed instructions on setting up a local SFTP server.

Option C: Use Existing Remote SFTP Server Update .env with your remote SFTP server credentials:

SFTP_HOST=your-sftp-server.com
SFTP_USERNAME=your_username
SFTP_PASSWORD=your_password
SFTP_REMOTE_PATH=/path/to/xml/files

For more details, see SETUP_REMOTE.md.

4. Setup PostgreSQL Database

# Create database
createdb pm_counters_db

# Or using psql
psql -U postgres -c "CREATE DATABASE pm_counters_db;"

5. Initialize Database Schema

from database import init_db
init_db()

Or run:

python -c "from database import init_db; init_db()"

Running the System

With Docker

# Start all services
make up

# Or
docker-compose up -d

# View logs
make logs

# Stop all services
make down

Without Docker

1. Start Job Server

The job server fetches files from the SFTP server at the configured interval:

python job_server.py
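The job server's fetch loop can be sketched roughly as below. This is a minimal illustration, not the actual job_server.py logic; `run_periodic` and `fetch_and_process` are hypothetical names.

```python
import time

def run_periodic(task, interval_hours, max_iterations=None):
    """Run `task` every `interval_hours` hours.

    `max_iterations` bounds the loop (useful for testing); the real
    job server would run indefinitely.
    """
    count = 0
    while max_iterations is None or count < max_iterations:
        task()
        count += 1
        if max_iterations is None or count < max_iterations:
            time.sleep(interval_hours * 3600)
    return count

def fetch_and_process():
    # Placeholder: download new XML files over SFTP, parse them,
    # and write the counters to PostgreSQL.
    pass
```

In production the loop would be started with something like `run_periodic(fetch_and_process, interval_hours=1.0)`.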

2. Start API Server

python api_server.py

Or using uvicorn:

uvicorn api_server:app --host 0.0.0.0 --port 8000

3. Start MCP Server

python mcp_server.py

Or using uvicorn:

uvicorn mcp_server:app --host 0.0.0.0 --port 8001

4. Start Streamlit Frontend

streamlit run streamlit_app.py

Docker Commands

Use the Makefile for convenient commands:

make build            # Build Docker images
make up               # Start all services
make down             # Stop all services
make restart          # Restart all services
make logs             # View logs from all services
make logs-job         # View logs from job server only
make logs-api         # View logs from API server only
make logs-streamlit   # View logs from Streamlit only
make clean            # Stop and remove everything (including volumes)
make init-db          # Initialize database schema
make ps               # Show running containers
make shell-api        # Open shell in API server container
make shell-job        # Open shell in job server container

Or use docker-compose directly:

docker-compose up -d                  # Start services
docker-compose down                   # Stop services
docker-compose logs -f                # View logs
docker-compose exec api_server bash   # Open shell

Configuration

Changing Fetch Interval

The fetch interval can be configured in two ways:

  1. Environment variable: set FETCH_INTERVAL_HOURS in the .env file or the process environment

  2. Docker Compose: set FETCH_INTERVAL_HOURS in docker-compose.yml (or the .env file it reads)

For Docker, update the environment variable and restart the job server:

# Edit .env file
FETCH_INTERVAL_HOURS=2.0

# Restart job server
docker-compose restart job_server

Without Docker, update Config.FETCH_INTERVAL_HOURS in config.py or set the environment variable before starting the job server.
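Reading the interval from the environment might look like the sketch below. The default of 1.0 matches the Docker setup's hourly fetch; the helper name is illustrative, not taken from config.py.

```python
import os

def get_fetch_interval_hours(default=1.0):
    """Read FETCH_INTERVAL_HOURS from the environment, falling back
    to `default` (the Docker setup fetches every hour)."""
    raw = os.environ.get("FETCH_INTERVAL_HOURS")
    if raw is None:
        return default
    try:
        return float(raw)
    except ValueError:
        # Malformed values fall back to the default rather than crash.
        return default
```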

API Endpoints

Main API (Port 8000)

  • GET / - API information

  • GET /network-elements - List all network elements

  • GET /interfaces/{interface_name}/counters - Get interface counters

  • GET /system/counters - Get system counters

  • GET /cpu/utilization - Get CPU utilization

  • GET /memory/utilization - Get memory utilization

  • GET /bgp/peers - List BGP peers

  • GET /bgp/peers/{peer_address}/counters - Get BGP peer counters

  • GET /files/processed - List processed files

  • GET /stats/summary - Get summary statistics
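A minimal client for the endpoints above, using only the standard library. The base URL matches the ports listed in this README; the shape of each JSON response is not documented here, so the decoded body is returned as-is.

```python
import json
import urllib.request

API_BASE = "http://localhost:8000"  # default API port from this README

def endpoint(path, base_url=API_BASE):
    """Build a full URL for one of the REST endpoints."""
    return f"{base_url}{path}"

def get_json(path, base_url=API_BASE):
    """GET an endpoint and return the decoded JSON body."""
    with urllib.request.urlopen(endpoint(path, base_url)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# With the API server running on port 8000:
#     summary = get_json("/stats/summary")
#     elements = get_json("/network-elements")
```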

MCP Server (Port 8001)

  • POST /mcp - MCP protocol endpoint

  • GET /mcp/methods - List available MCP methods

MCP Methods:

  • get_interface_counters - Get interface counters

  • get_system_counters - Get system counters

  • get_cpu_utilization - Get CPU utilization

  • get_memory_utilization - Get memory utilization

  • get_latest_metrics - Get latest metrics summary
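Calling one of these methods means POSTing a request to /mcp. The exact wire format of this server is not documented above, so the JSON-RPC 2.0 framing and the `hours` parameter below are assumptions for illustration.

```python
import json

def build_mcp_request(method, params=None, request_id=1):
    """Build a JSON-RPC 2.0 style payload for POST /mcp.

    JSON-RPC 2.0 framing is assumed here; check /mcp/methods and the
    server implementation for the actual schema.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params or {},
    })

payload = build_mcp_request("get_cpu_utilization", {"hours": 12})
```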

Streamlit Chat Interface

The Streamlit frontend provides a chat bot interface where you can ask questions like:

  • "What is the current CPU utilization?"

  • "Show me memory usage for the last 12 hours"

  • "Get interface counters for GigabitEthernet1/0/1"

  • "What are the latest metrics?"

  • "Show me system statistics"
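One way such questions can be mapped to the MCP methods above is simple keyword routing, sketched below. This is a naive first-match illustration; streamlit_app.py may use something richer.

```python
# Keyword -> MCP method, checked in order; first match wins.
ROUTES = [
    ("cpu", "get_cpu_utilization"),
    ("memory", "get_memory_utilization"),
    ("interface", "get_interface_counters"),
    ("system", "get_system_counters"),
    ("latest", "get_latest_metrics"),
]

def route_question(question):
    """Return the MCP method matching the question, or None."""
    q = question.lower()
    for keyword, method in ROUTES:
        if keyword in q:
            return method
    return None
```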

Database Schema

The system stores data in the following tables:

  • file_records - Track downloaded XML files

  • network_elements - Network element information

  • measurement_intervals - Time intervals for measurements

  • interface_counters - Interface performance counters

  • ip_counters - IP layer counters

  • tcp_counters - TCP layer counters

  • system_counters - System performance counters

  • bgp_counters - BGP peer counters

  • threshold_alerts - Threshold alerts from XML files
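The rough shape of one of these tables can be illustrated with SQLite (the real schema lives in database.py and targets PostgreSQL; the column names below are assumptions, not the actual schema).

```python
import sqlite3

# In-memory SQLite stand-in for the interface_counters table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE interface_counters (
        id INTEGER PRIMARY KEY,
        interface_name TEXT NOT NULL,
        in_octets INTEGER,
        out_octets INTEGER,
        measured_at TEXT
    )
""")
conn.execute(
    "INSERT INTO interface_counters "
    "(interface_name, in_octets, out_octets, measured_at) VALUES (?, ?, ?, ?)",
    ("GigabitEthernet1/0/1", 123456, 654321, "2024-01-01T00:00:00Z"),
)
row = conn.execute(
    "SELECT interface_name, in_octets FROM interface_counters"
).fetchone()
```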

Testing with Local Files

For testing without a real SFTP server, you can:

  1. Use the existing example_1.xml and example_2.xml files

  2. Modify the job server to process local files directly

  3. Use a local SFTP server like openssh-server for testing

Troubleshooting

  1. Database Connection Issues: Ensure PostgreSQL is running and credentials are correct

  2. SFTP Connection Issues: Verify SFTP server is accessible and credentials are correct

  3. API Not Responding: Check if services are running on correct ports

  4. No Data: Ensure job server has processed files and data is in the database
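For the "services on correct ports" check, a quick standard-library probe of the ports this stack uses can help:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Ports used by this stack: API 8000, MCP 8001, Streamlit 8501, PostgreSQL 5432.
status = {p: port_open("127.0.0.1", p) for p in (8000, 8001, 8501, 5432)}
```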

License

MIT License
