Why this server?
FastMCP is a comprehensive MCP server that exposes data and functionality to LLM applications in a secure, standardized way, offering resources, tools, and prompt management for efficient LLM interactions.
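As a rough illustration only (not part of this listing), a minimal FastMCP server exposing a tool, a resource, and a prompt might look like the following sketch, assuming the `fastmcp` Python package; the names `add`, `version`, and `summarize` are placeholders.

```python
from fastmcp import FastMCP

mcp = FastMCP("Demo Server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Expose a simple tool the LLM can call."""
    return a + b

@mcp.resource("config://version")
def version() -> str:
    """Expose read-only data as a resource."""
    return "1.0.0"

@mcp.prompt()
def summarize(text: str) -> str:
    """Provide a reusable prompt template."""
    return f"Please summarize the following text:\n\n{text}"

if __name__ == "__main__":
    mcp.run()  # defaults to stdio transport
```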
Why this server?
A Model Context Protocol (MCP) server that lets you seamlessly use OpenAI's models right from Claude.
Why this server?
Integrates Claude with any OpenAI SDK-compatible chat completion API, including OpenAI, Perplexity, Groq, xAI, PyroPrompts, and more.
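For context, "OpenAI SDK-compatible" generally means any endpoint that accepts the same chat completion request shape, so the official `openai` Python client can be pointed at a different `base_url`. A minimal sketch, with the base URL, model name, and key shown here as placeholder examples rather than values from this server:

```python
from openai import OpenAI

# Swap base_url and api_key for whichever OpenAI-compatible vendor you use.
client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # example: Groq's compatible endpoint
    api_key="YOUR_VENDOR_API_KEY",
)

response = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # example model name
    messages=[{"role": "user", "content": "Say hello from an OpenAI-compatible API."}],
)
print(response.choices[0].message.content)
```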
Why this server?
Enables LLMs to interact with Dify AI's chat completion API, including conversation context support and a restaurant recommendation tool.
Why this server?
Query OpenAI models directly from Claude using MCP protocol.
Why this server?
Easily find MCP servers using our MCP registry. Search with natural language.
Why this server?
Sends requests to OpenAI, MistralAI, Anthropic, xAI, or Google AI over the MCP protocol, via a tool or predefined prompts. A vendor API key is required.
Why this server?
MCP server for seamless document format conversion using Pandoc, supporting Markdown, HTML, PDF, DOCX, CSV, and more.
Why this server?
MCP Server simplifies the implementation of the Model Context Protocol by providing a user-friendly API to create custom tools and manage server workflows efficiently.
Why this server?
A Model Context Protocol (MCP) server for web research. Bring real-time info into Claude and easily research any topic.