
Kubernetes MCP Server

by kishanrao92


An interactive Kubernetes monitoring system built with Flask and the Model Context Protocol (MCP). This project provides an agentic interface for diagnosing cluster issues using natural language queries.

Features

  • Flask MCP Server: Exposes Kubernetes cluster data via JSON-RPC endpoints

  • Interactive Client: Ask questions like "What is the status of the checkout service?"

  • OpenAI Integration: Uses GPT models to intelligently investigate cluster problems

  • Kubernetes Integration: Real-time pod monitoring, events, and logs

  • Colorized Output: Beautiful terminal interface with ANSI colors
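The pod-monitoring feature comes down to flagging pods whose phase, restart count, or waiting reason looks unhealthy. A minimal sketch of such a heuristic, assuming a simple threshold-based check (the actual logic in `server.py` is not shown here):

```python
def is_problem_pod(phase, restarts, waiting_reason=""):
    """Heuristic: a pod is 'problematic' if a container is stuck in a known
    bad waiting state, the pod is not Running/Succeeded, or it is restarting
    repeatedly. Thresholds here are illustrative assumptions."""
    bad_reasons = {"CrashLoopBackOff", "ImagePullBackOff", "ErrImagePull"}
    if waiting_reason in bad_reasons:
        return True
    if phase not in ("Running", "Succeeded"):
        return True
    return restarts > 5

# With a live cluster (requires `pip install kubernetes` and a kubeconfig):
#   from kubernetes import client, config
#   config.load_kube_config()
#   v1 = client.CoreV1Api()
#   for pod in v1.list_namespaced_pod("default").items:
#       restarts = sum(cs.restart_count for cs in (pod.status.container_statuses or []))
#       if is_problem_pod(pod.status.phase, restarts):
#           print(pod.metadata.name, pod.status.phase, restarts)
```

Note that a pod in `CrashLoopBackOff` can still report phase `Running` (as in the example output below), which is why the waiting reason and restart count matter more than the phase alone.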

Architecture

```
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│   Interactive   │───▶│    Flask MCP     │───▶│   Kubernetes    │
│     Client      │     │      Server      │     │     Cluster     │
│   (client.py)   │     │    (server.py)   │     │    (KIND/etc)   │
└─────────────────┘     └──────────────────┘     └─────────────────┘
         │                       │
         ▼                       ▼
┌─────────────────┐     ┌──────────────────┐
│   OpenAI GPT    │     │ Static Fixtures  │
│   (optional)    │     │  (metrics, etc)  │
└─────────────────┘     └──────────────────┘
```

Setup

Prerequisites

  • Python 3.9+

  • Kubernetes cluster (KIND recommended for local development)

  • OpenAI API key (optional, fallback mode available)

Installation

  1. Clone the repository:

```bash
git clone https://github.com/YOUR_USERNAME/YOUR_REPO_NAME.git
cd YOUR_REPO_NAME
```

  2. Install dependencies:

```bash
pip install flask kubernetes openai requests
```

  3. Set up environment variables:

```bash
export OPENAI_API_KEY="your-api-key-here"   # Optional
export KUBECONFIG="path/to/your/kubeconfig" # If not using the default
```

Running the Server

```bash
cd mcp
python3 server.py
```

The server will start on http://localhost:5050

Running the Interactive Client

```bash
cd mcp
python3 client.py
```

Usage

Interactive Mode

Start the client and ask natural language questions:

```
> what is the status of my checkout service?
> show failing pods in namespace staging
> summarize errors for service payments in the last 45 minutes
```

One-shot Mode

```bash
python3 client.py --ask "what pods are failing in default namespace?"
```

Available Tools

  • k8s.listProblemPods - Find problematic pods

  • k8s.getPodDetails - Get detailed pod information

  • deployments.listRecentChanges - Recent deployment history

  • metrics.getErrors - Error rate analysis

  • traces.sampleErrors - Sample failing traces

  • config.getDiff - Configuration changes
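Since the server exposes these tools over JSON-RPC, every tool call is just a POST of a standard request envelope to the `/rpc` endpoint. A sketch of building such a request for `k8s.listProblemPods` (the parameter names here are assumptions for illustration; the real schemas live in `tools_catalog.json`):

```python
import json

def build_rpc_request(method, params, req_id=1):
    """Build a JSON-RPC 2.0 request envelope for an MCP tool call."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Hypothetical params for illustration.
payload = build_rpc_request("k8s.listProblemPods", {"namespace": "default"})
print(json.dumps(payload))

# To actually send it (requires the server to be running on port 5050):
#   import requests
#   resp = requests.post("http://127.0.0.1:5050/rpc", json=payload)
#   print(resp.json())
```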

Example Output

```
=== 🧩 FINAL ANSWER ===
📋 Summary: The pod 'demo-fail-5df44cbf79-tqg6l' is experiencing CrashLoopBackOff

🔍 Evidence:
  • Pod: demo-fail-5df44cbf79-tqg6l
    Status: Running
    Restarts: 115
    Reason: CrashLoopBackOff

⚠️ Probable Cause: Application failing to start successfully due to exit code 1

🛠️ Safe Next Step: Investigate application logs and configuration

✅ Confidence: High
```

Configuration

Environment variables:

  • RPC_URL - MCP server URL (default: http://127.0.0.1:5050/rpc)

  • OPENAI_API_KEY - OpenAI API key for LLM features

  • OPENAI_MODEL - Model to use (default: gpt-4o-mini)

  • SERVICE - Default service name (default: checkout)

  • NAMESPACE - Default K8s namespace (default: default)

  • SINCE_MINS - Time window for queries (default: 120)
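These settings can be resolved with `os.environ.get`, falling back to the defaults listed above. A minimal sketch (the structure of the dictionary is an illustration, not the actual code in `client.py`):

```python
import os

def load_config():
    """Resolve client settings from the environment, with documented defaults."""
    return {
        "rpc_url": os.environ.get("RPC_URL", "http://127.0.0.1:5050/rpc"),
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),  # None => fallback mode
        "openai_model": os.environ.get("OPENAI_MODEL", "gpt-4o-mini"),
        "service": os.environ.get("SERVICE", "checkout"),
        "namespace": os.environ.get("NAMESPACE", "default"),
        "since_mins": int(os.environ.get("SINCE_MINS", "120")),
    }
```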

Development

Project Structure

```
mcp-demo/
├── mcp/
│   ├── server.py            # Flask MCP server
│   ├── client.py            # Interactive client
│   ├── tools_catalog.json   # Tool definitions
│   └── fixtures/            # Static test data
├── k8s/
│   └── deployment.yaml      # Sample K8s resources
└── README.md
```

Adding New Tools

  1. Add the tool definition to `tools_catalog.json`

  2. Implement the handler in `server.py`

  3. Test it with the client
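The steps above follow a common method-to-handler dispatch pattern for JSON-RPC servers. A sketch of that pattern, assuming a registry decorator and a fixture-backed handler (this illustrates the idea; it is not the actual dispatch code in `server.py`):

```python
# Registry mapping JSON-RPC method names to handler functions.
HANDLERS = {}

def tool(name):
    """Decorator: register a handler under a tool name from tools_catalog.json."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@tool("k8s.listProblemPods")
def list_problem_pods(params):
    # A real handler would query the Kubernetes API; fixture data shown here.
    return [{"name": "demo-fail-5df44cbf79-tqg6l", "reason": "CrashLoopBackOff"}]

def handle_rpc(request):
    """Dispatch one JSON-RPC 2.0 request to its registered handler."""
    handler = HANDLERS.get(request.get("method"))
    if handler is None:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"),
            "result": handler(request.get("params", {}))}
```

With this shape, adding a new tool is just a new `@tool("...")` function plus its catalog entry; the dispatch loop never changes.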

Demo

https://github.com/user-attachments/assets/e30a7a69-ff7a-46f1-a2ff-e75eff79334b

License

MIT License
