Kubectl MCP Tool
A Model Context Protocol (MCP) server for Kubernetes that enables AI assistants like Claude, Cursor, and others to interact with Kubernetes clusters through natural language.
⚠️ Known Issues
The server currently has JSON parsing issues, which cause failures when running MCP in:
- Claude
- Cursor
- Windsurf
I am actively working on resolving these issues. Since I'm troubleshooting independently, resolution may take some time, as each client needs to be tested individually. If you can help debug these issues, feel free to submit a Pull Request.
Your patience and continued support during this period are greatly appreciated. 🙏
Thank you for understanding!
Features
Core Kubernetes Operations
- Connect to a Kubernetes cluster
- List and manage pods, services, deployments, and nodes
- Create, delete, and describe pods and other resources
- Get pod logs and Kubernetes events
- Support for Helm v3 operations (installation, upgrades, uninstallation)
- kubectl explain and api-resources support
- Set a namespace for subsequent commands (memory persistence)
- Port forward to pods
- Scale deployments and statefulsets
- Execute commands in containers
- Manage ConfigMaps and Secrets
- Rollback deployments to previous versions
- Ingress and NetworkPolicy management
- Context switching between clusters
Natural Language Processing
- Process natural language queries for kubectl operations
- Context-aware commands with memory of previous operations
- Human-friendly explanations of Kubernetes concepts
- Intelligent command construction from intent
- Fallback to kubectl when specialized tools aren't available
- Mock data support for offline/testing scenarios
- Namespace-aware query handling
Monitoring
- Cluster health monitoring
- Resource utilization tracking
- Pod status and health checks
- Event monitoring and alerting
- Node capacity and allocation analysis
- Historical performance tracking
- Resource usage statistics via kubectl top
- Container readiness and liveness tracking
Security
- RBAC validation, verification, and permissions auditing
- Security context auditing and validation
- Secure connections to the Kubernetes API
- Credentials management
- Network policy assessment
- Container security scanning
- Security best practices enforcement
- Role and ClusterRole management
- ServiceAccount creation and binding
- PodSecurityPolicy analysis
Diagnostics
- Cluster diagnostics and troubleshooting
- Configuration validation
- Error analysis and recovery suggestions
- Connection status monitoring
- Log analysis and pattern detection
- Resource constraint identification
- Pod health check diagnostics
- Common error pattern identification
- Resource validation for misconfigurations
- Detailed liveness and readiness probe validation
Advanced Features
- Multiple transport protocols support (stdio, SSE)
- Integration with multiple AI assistants
- Extensible tool framework
- Custom resource definition support
- Cross-namespace operations
- Batch operations on multiple resources
- Intelligent resource relationship mapping
- Error explanation with recovery suggestions
- Volume management and identification
Architecture
Model Context Protocol (MCP) Integration
The Kubectl MCP Tool implements the Model Context Protocol (MCP), enabling AI assistants to interact with Kubernetes clusters through a standardized interface. The architecture consists of the following components (a minimal tool-registration sketch follows the list):
- MCP Server: A compliant server that handles requests from MCP clients (AI assistants)
- Tools Registry: Registers Kubernetes operations as MCP tools with schemas
- Transport Layer: Supports stdio, SSE, and HTTP transport methods
- Core Operations: Translates tool calls to Kubernetes API operations
- Response Formatter: Converts Kubernetes responses to MCP-compliant responses
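To make this concrete, here is a minimal sketch of how a single Kubernetes operation could be registered as an MCP tool using the reference MCP Python SDK. The server name, tool name, and use of the official kubernetes client are illustrative assumptions, not the project's actual implementation.

```python
# Sketch only: exposes one pod-listing operation as an MCP tool over stdio.
from kubernetes import client, config
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("kubectl-mcp-sketch")  # server name is illustrative

@mcp.tool()
def list_pods(namespace: str = "default") -> list[str]:
    """Return the names of pods in the given namespace."""
    config.load_kube_config()        # reads ~/.kube/config (or KUBECONFIG)
    v1 = client.CoreV1Api()
    pods = v1.list_namespaced_pod(namespace)
    return [p.metadata.name for p in pods.items]

if __name__ == "__main__":
    mcp.run(transport="stdio")       # SSE is selected the same way: transport="sse"
```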
Request Flow
Dual Mode Operation
The tool operates in two modes:
- CLI Mode: Direct command-line interface for executing Kubernetes operations
- Server Mode: Running as an MCP server to handle requests from AI assistants
Installation
For detailed installation instructions, please see the Installation Guide.
You can install kubectl-mcp-tool directly from PyPI:
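```bash
# Standard pip install; the package name matches the PyPI link below
pip install kubectl-mcp-tool
```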
For a specific version:
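```bash
# Pin the version shown on PyPI below
pip install kubectl-mcp-tool==1.1.0
```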
The package is available on PyPI: https://pypi.org/project/kubectl-mcp-tool/1.1.0/
Prerequisites
- Python 3.9+
- kubectl CLI installed and configured
- Access to a Kubernetes cluster
- pip (Python package manager)
Global Installation
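A minimal sketch, assuming pip is available on your PATH:

```bash
# Install into the active Python environment (add --user to keep it out of system site-packages)
pip install kubectl-mcp-tool
```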
Local Development Installation
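A typical editable install; the repository URL below is a placeholder, not the project's actual address:

```bash
# Clone the repository (replace the placeholder URL; the directory name may differ)
git clone <repository-url>
cd kubectl-mcp-tool

# Editable install so local changes take effect immediately
pip install -e .
```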
Verifying Installation
After installation, verify the tool is working correctly:
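A quick sanity check (pip show is standard pip; the serve entry point is described in the note below):

```bash
# Confirm the package is installed and show its version
pip show kubectl-mcp-tool

# Start the MCP server (press Ctrl+C to stop)
kubectl-mcp serve
```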
Note: This tool is designed to work as an MCP server that AI assistants connect to, not as a direct kubectl replacement. The primary command available is kubectl-mcp serve, which starts the MCP server.
Usage with AI Assistants
Claude Desktop
Add the following to your Claude Desktop configuration at ~/.config/claude/mcp.json (Windows: %APPDATA%\Claude\mcp.json):
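A typical entry uses the standard mcpServers schema shared by these MCP clients; the server key ("kubernetes") and the KUBECONFIG environment variable are illustrative assumptions, and the command reflects the kubectl-mcp serve entry point noted under Installation:

```json
{
  "mcpServers": {
    "kubernetes": {
      "command": "kubectl-mcp",
      "args": ["serve"],
      "env": {
        "KUBECONFIG": "/path/to/your/.kube/config"
      }
    }
  }
}
```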
Cursor AI
Add a new global MCP server to your Cursor AI settings under MCP, using the same mcpServers entry shown above for Claude Desktop. Save this configuration to ~/.cursor/mcp.json for global settings.
Note: Replace /path/to/your/.kube/config with the actual path to your kubeconfig file. On most systems, this is ~/.kube/config.
Windsurf
Add the equivalent mcpServers entry to your Windsurf configuration at ~/.config/windsurf/mcp.json (Windows: %APPDATA%\WindSurf\mcp.json).
Automatic Configuration
For automatic configuration of all supported AI assistants, run the provided installation script:
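The script name below is a placeholder, not confirmed by this README; check the repository root for the actual installer:

```bash
# Illustrative invocation only; replace with the installer shipped in the repository.
./install.sh
```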
This script will:
- Install the required dependencies
- Create configuration files for Claude, Cursor, and WindSurf
- Set up the correct paths and environment variables
- Test your Kubernetes connection
Prerequisites
- kubectl installed and in your PATH
- A valid kubeconfig file
- Access to a Kubernetes cluster
- Helm v3 (optional, for Helm operations)
Examples
Typical natural-language operations include:
- Listing pods
- Deploying an application
- Checking pod logs
- Port forwarding
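The exact phrasing is flexible; as an illustration (the pod and deployment names here are hypothetical), requests along these lines map to the operations above:

```text
"List all pods in the default namespace"
"Deploy an nginx deployment with 3 replicas"
"Show me the logs of the payment-service pod"
"Forward port 8080 of the web pod to my local machine"
```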
Development
Project Structure
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.