Why this server?
Provides image analysis capabilities powered by OpenRouter vision models, allowing your agent to analyze images through the OpenRouter API.
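The server's own tool names and parameters aren't listed here, but as a rough sketch of the kind of request it wraps, the snippet below sends an image-analysis prompt to OpenRouter's OpenAI-compatible chat completions endpoint; the model name and image URL are placeholders, not values taken from this server.

```python
# Sketch of the kind of request such a server wraps: an image-analysis call
# against OpenRouter's OpenAI-compatible chat completions endpoint.
# The model name and image URL below are placeholders.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",       # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",            # placeholder vision-capable model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what is in this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```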
Why this server?
Enables interaction with Google Cloud Platform resources, which can allow your agent to use Vertex AI if properly configured within your GCP environment.
Why this server?
Provides a unified API to multiple AI providers, including Anthropic and OpenAI, enabling access to a variety of models through a single endpoint. Useful if you want to use OpenRouter.
Why this server?
Allows sending requests to OpenAI, MistralAI, Anthropic, xAI, or Google AI over the MCP protocol via tools or predefined prompts. Supports both STDIO and SSE transport mechanisms.
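The server's launch command and tool names aren't given here; the sketch below only illustrates connecting to an MCP server over its STDIO transport with the official MCP Python SDK, with a hypothetical command and tool call standing in for the real ones.

```python
# Minimal sketch of talking to such a server over its STDIO transport using
# the official MCP Python SDK. The launch command and tool name are hypothetical;
# check the server's own documentation for the real values.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Hypothetical launch command for the server process.
    params = StdioServerParameters(command="uvx", args=["some-llm-mcp-server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()          # discover the tools the server exposes
            print([tool.name for tool in tools.tools])
            # Hypothetical tool call; real tool names and arguments depend on the server.
            result = await session.call_tool("ask_llm", {"prompt": "Hello"})
            print(result)

asyncio.run(main())
```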
Why this server?
Provides access to various AI tools through the Model Context Protocol, allowing Claude Desktop users to integrate and use Superface capabilities via the API.
Why this server?
Facilitates interaction with AWS Bedrock-enabled tools, allowing tool integration and communication. Bedrock offers access to various models, including some that are also available through OpenRouter.
Why this server?
Provides Google's Gemini language model capabilities via MCP, offering a way to use the Google Vertex AI API through the protocol.
Why this server?
An MCP server that enables LLMs to interact with Dust AI agents, allowing integration with development environments and Claude Desktop.
Why this server?
A middleware API that connects AI assistants like ChatGPT to Captain Data tools for extracting information from LinkedIn company and profile pages.
Why this server?
Provides integration between the Merge API and LLM providers that support the MCP protocol, allowing natural language interaction with Merge data across HRIS, ATS, and other categories.