Why this server?
This server enables seamless management of containers and Compose stacks through Claude AI.
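For illustration only, a container-management tool of this kind could wrap the container CLI behind an MCP tool. The sketch below assumes the Docker CLI and the official MCP Python SDK's FastMCP helper; the tool name and parameters are hypothetical, not taken from this server.

```python
# Illustrative sketch: expose `docker ps` as an MCP tool.
# Assumes the Docker CLI is installed; names here are hypothetical.
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docker-manager")

@mcp.tool()
def list_containers(show_all: bool = False) -> str:
    """Return `docker ps` output; include stopped containers when show_all is True."""
    cmd = ["docker", "ps"] + (["-a"] if show_all else [])
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```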
Why this server?
A production-ready template for building Model Context Protocol servers that can integrate with AI systems, providing a simple BMI calculator tool as an example implementation.
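For context, a template tool like the BMI example can be expressed in a few lines with the MCP Python SDK's FastMCP helper; the exact name and signature below are illustrative rather than copied from the template.

```python
# Minimal sketch of a BMI calculator tool on an MCP server.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("bmi-calculator")

@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / (height_m ** 2)

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```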
Why this server?
Enables Claude to perform web searches, extract webpage content, and capture screenshots directly from conversations.
Why this server?
A Cloudflare Workers-based implementation of a Model Context Protocol server with OAuth login, allowing Claude and other MCP clients to connect to remote tools.
Why this server?
Provides access to various AI tools through the Model Context Protocol, allowing Claude Desktop users to integrate and use Superface capabilities via its API.
Why this server?
A Cloudflare Workers-based implementation of a Model Context Protocol (MCP) server that enables AI assistants like Claude to access external tools and capabilities through a standardized interface with OAuth authentication.
Why this server?
A Python server implementing the Model Context Protocol to provide customizable prompt templates, resources, and tools that enhance LLM interactions in the continue.dev environment.
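As a rough sketch of what such a server exposes, the MCP Python SDK lets one module register a prompt template, a resource, and a tool side by side; the names below are illustrative assumptions, not this server's actual API.

```python
# Hedged sketch: one MCP server offering a prompt template, a resource, and a tool.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("prompt-server")

@mcp.prompt()
def review_code(code: str) -> str:
    """Prompt template that asks the model to review a code snippet."""
    return f"Please review the following code and point out issues:\n\n{code}"

@mcp.resource("docs://style-guide")
def style_guide() -> str:
    """Static resource the client can pull into context."""
    return "Prefer small functions; document public APIs."

@mcp.tool()
def count_lines(text: str) -> int:
    """Simple tool the LLM can call during a session."""
    return len(text.splitlines())

if __name__ == "__main__":
    mcp.run()
```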
Why this server?
The Fibery MCP server provides integration between Fibery and any LLM provider that supports MCP (e.g., Claude for Desktop), allowing you to explore the Fibery Workspace schema, query databases, and create and update entities using natural language.
Why this server?
A Python application, built with FastMCP, that allows creating and editing Microsoft Word documents through an API.
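A minimal sketch of how FastMCP tools might wrap Word editing, assuming the python-docx library; the tool names and parameters are hypothetical and the real server's API may differ.

```python
# Hypothetical sketch of Word-editing MCP tools using python-docx.
from docx import Document
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("word-editor")

@mcp.tool()
def create_document(path: str, title: str, body: str) -> str:
    """Create a new .docx file with a heading and one paragraph."""
    doc = Document()
    doc.add_heading(title, level=1)
    doc.add_paragraph(body)
    doc.save(path)
    return f"Created {path}"

@mcp.tool()
def append_paragraph(path: str, text: str) -> str:
    """Open an existing .docx file and append a paragraph."""
    doc = Document(path)
    doc.add_paragraph(text)
    doc.save(path)
    return f"Appended paragraph to {path}"

if __name__ == "__main__":
    mcp.run()
```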
Why this server?
Low-code MCP middleware built on a microservices architecture, providing scalable and automated MCP server functionality.