Why this server?
Provides multi-cluster Kubernetes management and operations over MCP, featuring a management interface, logging, and nearly 50 built-in tools covering common DevOps and development scenarios.
Why this server?
Provides multi-cluster Kubernetes management and operations over MCP. It can be integrated as an SDK into your own project and includes nearly 50 built-in tools covering common DevOps and development scenarios.
Why this server?
Enables your LLM to list, create, update, and delete Azure resources, diagnose errors by inspecting logs, and fix security issues through Azure CLI commands.
Why this server?
Provides a convenient API for interacting with Azure DevOps services, enabling AI assistants to manage work items, code repositories, boards, sprints, and more.
Why this server?
A Model Context Protocol server that enables AI assistants to interact with Kubernetes clusters through natural language, supporting core Kubernetes operations, monitoring, security, and diagnostics.
Why this server?
Provides integration with Atlassian products through the Model Context Protocol, allowing users to interact with JIRA tickets and Confluence pages.
Why this server?
An integration that enables AI assistants to interact with network data through a standardized protocol, providing AI-ready tools and interfaces for network automation and management.
Why this server?
A Model Context Protocol server that enables LLMs to interact with Plane.so, allowing them to manage projects and issues through Plane's API for streamlined project management workflows.
Why this server?
Provides a Model Context Protocol server implementation that allows AI agents and other MCP clients to programmatically interact with DefectDojo, a vulnerability management tool, for managing findings, products, and engagements.
Why this server?
A server that supercharges AI assistants with powerful tools for software development, enabling research, planning, code generation, and project scaffolding through natural language interaction.
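All of the servers above speak the same Model Context Protocol, so a client connects to each of them in the same way. The sketch below uses the TypeScript MCP SDK (`@modelcontextprotocol/sdk`) to show the generic pattern: launch a server over stdio, connect, and list its tools. The `command` and `args` values are hypothetical placeholders, not the launch command of any specific server listed here; substitute the install/run instructions each server publishes.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Hypothetical launch command -- replace with the command documented
// by whichever server you pick from the list above.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "example-mcp-server"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} },
);

await client.connect(transport);

// Every MCP server advertises its tools the same way, whether they wrap
// Kubernetes, Azure, Jira/Confluence, or a project-management API.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();
```

From there, invoking a tool is a single `client.callTool({ name, arguments })` call; the available tool names and argument schemas come from the `listTools()` response of the server you connected to.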