Browse 508 MCP Connectors from the official MCP Registry. Connect to these servers directly without local installation.
Matching connector tools:
The Google Compute Engine MCP server is a fully managed Model Context Protocol server that provides tools to manage Google Compute Engine resources through AI agents. Its capabilities include instance management (creating, starting, stopping, resetting, and listing instances), disk management, managing instance templates and instance group managers, viewing machine and accelerator types, managing images, and accessing reservation and commitment information. The server operates as a zero-deployment, enterprise-grade endpoint at https://compute.googleapis.com/mcp with built-in IAM-based security.
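Because this is a remote, IAM-protected endpoint, a client reaches it over MCP's streamable HTTP transport rather than by launching a local process. The sketch below uses the official Python MCP SDK to connect and list the advertised tools; the use of a gcloud-issued OAuth access token as the bearer credential is an assumption on my part, so check Google's documentation for the exact auth requirements.

```python
import asyncio
import subprocess

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

ENDPOINT = "https://compute.googleapis.com/mcp"


async def list_compute_tools() -> None:
    # Assumption: the endpoint accepts a standard OAuth 2.0 bearer token,
    # e.g. one minted locally with `gcloud auth print-access-token`.
    token = subprocess.run(
        ["gcloud", "auth", "print-access-token"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

    async with streamablehttp_client(
        ENDPOINT, headers={"Authorization": f"Bearer {token}"}
    ) as (read_stream, write_stream, _get_session_id):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(list_compute_tools())
```

The tool names returned here are what an agent would then pass to session.call_tool(...) to perform the instance, disk, and image operations listed above.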
The Google GKE MCP server is a managed Model Context Protocol server that provides AI applications with tools to manage Google Kubernetes Engine (GKE) clusters and Kubernetes resources. It exposes a structured, discoverable interface through which AI agents can interact with the GKE and Kubernetes APIs: they can inspect cluster configurations, retrieve Kubernetes resource YAMLs, monitor operations such as cluster upgrades, diagnose issues, and optimize costs, all without parsing free-form text output or constructing complex kubectl commands.
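With a session established as in the previous sketch, an agent drives this server through ordinary MCP tool calls. The tool name and argument keys below are hypothetical placeholders, not confirmed identifiers from the server; they only illustrate the call-and-read-result pattern.

```python
from mcp import ClientSession


async def describe_cluster(session: ClientSession) -> str:
    # Hypothetical tool name and argument keys, for illustration only;
    # discover the real schema from session.list_tools().
    result = await session.call_tool(
        "get_cluster",
        {"project_id": "my-project", "location": "us-central1", "name": "prod"},
    )
    # Tool output arrives as typed content blocks; text blocks carry the
    # structured description (for example, a resource YAML) as a string.
    return "\n".join(
        block.text for block in result.content if block.type == "text"
    )
```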
The AWS Knowledge MCP server is a fully managed remote Model Context Protocol server that provides real-time access to official AWS content in an LLM-compatible format. It offers structured access to AWS documentation, code samples, blog posts, What's New announcements, Well-Architected best practices, and regional availability information for AWS APIs and CloudFormation resources. Key capabilities include searching and reading documentation in markdown format, getting content recommendations, listing AWS regions, and checking regional availability for services and features.
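The interaction pattern is the same here: one tool call to search, another to read a page back as markdown. In the sketch below, the tool names search_documentation and read_documentation and their argument keys are assumptions inferred from the capability list above, and the docs URL is a stand-in for a value taken from a real search hit.

```python
from mcp import ClientSession


async def lookup_aws_docs(session: ClientSession, query: str) -> str:
    # Hypothetical tool names and argument keys; confirm the real ones
    # from session.list_tools() before relying on this shape.
    hits = await session.call_tool("search_documentation", {"query": query})
    print("search results:", [b.text for b in hits.content if b.type == "text"])

    # A search hit would normally carry a docs URL to fetch back as markdown;
    # the placeholder URL below stands in for one of those hits.
    page = await session.call_tool(
        "read_documentation",
        {"url": "https://docs.aws.amazon.com/..."},
    )
    return "\n".join(b.text for b in page.content if b.type == "text")
```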
An MCP server for deep research or task groups.
Semilattice lets agents make predictions about audiences, for use in content testing and decision-making.
The Remote MCP server acts as a standardized bridge between LLM applications (like Claude, ChatGPT, and Cursor) and external services, enabling AI agents to access external tools and resources. Its primary capability is providing a centralized search tool to discover other MCP servers and their respective tools. Unlike local implementations, it runs remotely with OAuth authentication and permission controls for security.