Why this server?
This server exposes the HTTP operations defined in an OpenAPI specification as tools, so clients can interact with the API through the Model Context Protocol; in effect it wraps an existing HTTP interface.
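As a rough illustration of that wrapping pattern (not this server's actual code), the sketch below registers one MCP tool per OpenAPI operation and forwards calls to the underlying HTTP API. The server name, base URL, and spec location are assumed for the example, which uses the MCP Python SDK's FastMCP helper together with httpx.

```python
# Illustrative sketch only: one MCP tool per OpenAPI operation,
# each forwarding its call to the wrapped HTTP API.
import httpx
from mcp.server.fastmcp import FastMCP

BASE_URL = "https://api.example.com"   # hypothetical wrapped API
mcp = FastMCP("openapi-wrapper")       # hypothetical server name


def make_tool(method: str, path: str):
    """Build a tool function that proxies a single OpenAPI operation."""

    def call(params: dict | None = None) -> str:
        # Forward query parameters to the wrapped endpoint and return the body.
        resp = httpx.request(method.upper(), BASE_URL + path, params=params or {})
        resp.raise_for_status()
        return resp.text

    return call


def register_operations(spec: dict) -> None:
    """Register every operation in the OpenAPI spec as an MCP tool."""
    for path, methods in spec.get("paths", {}).items():
        for method, operation in methods.items():
            name = operation.get("operationId", f"{method}_{path}").replace("/", "_")
            mcp.add_tool(make_tool(method, path), name=name,
                         description=operation.get("summary", ""))


if __name__ == "__main__":
    # Assumes the wrapped API publishes its spec at a conventional location.
    register_operations(httpx.get(BASE_URL + "/openapi.json").json())
    mcp.run()
```

A factory function is used so each generated tool captures its own method and path rather than sharing the loop variables.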
Why this server?
Integrates Atlassian products with the Model Context Protocol, letting users interact with JIRA tickets and Confluence pages; it is an example of a higher-level tool that wraps HTTP APIs.
Why this server?
MCP Server offers a simpler API for working with the Model Context Protocol, letting users define custom tools and services to streamline workflows and processes; in effect it acts as a wrapper layer.
Why this server?
Every GenAIScript can be exposed as an MCP server automatically, providing a tool for automated HTTP API creation.
Why this server?
FastMCP is a comprehensive framework for building MCP servers that expose data and functionality to LLM applications in a secure, standardized way, making it a natural foundation for building such wrappers.
Why this server?
A foundation for building Model Context Protocol servers in Python, which could be used to create tools wrapping HTTP APIs.
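To give a sense of how small such a foundation lets a server be, here is a minimal, hypothetical example built on the Python SDK's FastMCP helper; the server name and tool are invented for illustration.

```python
# Minimal MCP server sketch; the server name and tool are made up.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("http-wrapper-demo")   # hypothetical server name


@mcp.tool()
def get_status(url: str) -> str:
    """Fetch a URL and report its HTTP status code."""
    resp = httpx.get(url, follow_redirects=True, timeout=10.0)
    return f"{resp.status_code} {resp.reason_phrase}"


if __name__ == "__main__":
    mcp.run()   # serves the tool over stdio, the SDK's default transport
```

The decorator derives the tool's input schema from the function's type hints and docstring, so a wrapper around an HTTP call needs little beyond the call itself.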
Why this server?
An open-standard server implementation that enables AI agents to communicate with APIs and services through the Model Context Protocol, useful for creating HTTP wrappers.
Why this server?
A TypeScript template for creating Model Context Protocol servers that let AI models use external tools, well suited to automatically building API wrappers.
Why this server?
A Model Context Protocol server that runs on Cloudflare Workers with OAuth login, allowing AI assistants like Claude to execute tools remotely through HTTP connections.
Why this server?
A demonstration server showing an MCP implementation in Python with resource handling, useful for building simple HTTP API wrappers.
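Resource handling in such a demonstration server typically looks like the sketch below; the `notes://` URI scheme and the data behind it are invented for illustration, and the example again assumes the Python SDK's FastMCP helper.

```python
# Sketch of MCP resource handling (names and data are illustrative).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("resource-demo")   # hypothetical server name

# Stand-in data store; a real wrapper would fetch this from an HTTP API.
NOTES = {"welcome": "Hello from an MCP resource."}


@mcp.resource("notes://{name}")
def read_note(name: str) -> str:
    """Return the note matching the requested resource URI."""
    return NOTES.get(name, "No such note.")


if __name__ == "__main__":
    mcp.run()
```

Resources differ from tools in that clients read them by URI rather than invoking them with arguments; here the `{name}` template parameter is filled from the requested URI.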