This MCP server performs multi-topic searches in business, news, finance, and politics using the Tavily API, providing high-quality sources and intelligent summaries.
Integrates Tavily's search API with LLMs to provide advanced web search capabilities, including intelligent result summaries, domain filtering for quality control, and configurable search parameters.
This server enables AI systems to integrate with Tavily's search and data extraction tools, providing real-time web information access and domain-specific searches.
Provides AI-optimized web search capabilities and direct answers using the Tavily API for MCP-compatible assistants. It enables configurable searches with granular control over search depth, result counts, and domain filtering.
Tavily MCP Server implementation that uses FastMCP and supports both SSE and stdio transports. It also supports Tavily's more recent features.
Provides AI-powered web search capabilities using Tavily's search API, enabling LLMs to perform sophisticated web searches, get direct answers to questions, and search recent news articles.
Provides AI assistants with real-time web search capabilities, intelligent data extraction from web pages, website mapping, and web crawling through Tavily's API.
Provides AI assistants with real-time web search, intelligent data extraction from web pages, website mapping, and web crawling capabilities through Tavily's API. Enables comprehensive web research and content analysis through natural language interactions.
A multi-API key load balancing MCP server for Tavily that automatically rotates between multiple API keys to provide high availability and increased request limits.
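The key-rotation behavior described above can be sketched in plain Python. This is a hypothetical illustration (the `KeyRotator` name and structure are assumptions, not the server's actual implementation): round-robin over the configured keys, skipping any key that has hit its rate limit.

```python
from itertools import cycle

class KeyRotator:
    """Round-robin over multiple API keys, skipping keys marked as exhausted."""

    def __init__(self, keys):
        self.keys = list(keys)
        self.exhausted = set()
        self._cycle = cycle(self.keys)

    def next_key(self):
        # Try each key at most once per call; skip exhausted ones.
        for _ in range(len(self.keys)):
            key = next(self._cycle)
            if key not in self.exhausted:
                return key
        raise RuntimeError("all API keys exhausted")

    def mark_exhausted(self, key):
        # Call this when a key hits its rate limit (e.g. HTTP 429).
        self.exhausted.add(key)

rotator = KeyRotator(["key-a", "key-b", "key-c"])
print(rotator.next_key())  # key-a
rotator.mark_exhausted("key-b")
print(rotator.next_key())  # key-c (key-b is skipped)
```

A real server would combine this with retry logic, so a request that fails on one key is transparently re-sent with the next.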
A Model Context Protocol server that enables web search capabilities using the Tavily API, allowing AI models to retrieve current information from the internet through natural language commands.
Enables web search capabilities through the Tavily API. Allows users to search the web for information using natural language queries through the MCP protocol.
Enables web search capabilities through the Tavily API. Allows users to search the web for information using natural language queries via the MCP protocol.
Enables web search capabilities through the Tavily API, allowing users to search the internet for information using natural language queries. Demonstrates MCP server implementation with external API integration.
Enables web search capabilities through the Tavily API, allowing users to search the web for information using natural language queries. Demonstrates MCP (Model Context Protocol) server implementation with stdio transport mode.
Enables web search capabilities through the Tavily API, allowing users to search the web for information using natural language queries through the Model Context Protocol.
Enables web search capabilities through the Tavily API, allowing users to search the internet for information using natural language queries. Built as a demonstration MCP server running in stdio transport mode.
Enables web search capabilities through the Tavily API, allowing users to search the web for information using natural language queries. Demonstrates MCP server implementation with stdio transport mode for integration with LLM applications.
Enables web search capabilities through the Tavily API, allowing users to search the internet for information using natural language queries. Serves as a demonstration and educational project for building MCP servers with external API integrations.
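Under the hood, the Tavily-backed searches that the entries above describe reduce to a single POST against Tavily's REST endpoint. A minimal sketch of the request payload follows; the field names match Tavily's public `/search` API as documented, but the helper function itself is illustrative (real servers may use the official `tavily-python` client instead):

```python
import json

TAVILY_SEARCH_URL = "https://api.tavily.com/search"

def build_search_payload(query, api_key, search_depth="basic",
                         max_results=5, include_answer=True,
                         include_domains=None):
    """Build the JSON body for Tavily's /search endpoint.

    An actual server would send this with something like
    requests.post(TAVILY_SEARCH_URL, json=payload).
    """
    payload = {
        "api_key": api_key,
        "query": query,
        "search_depth": search_depth,      # "basic" or "advanced"
        "max_results": max_results,
        "include_answer": include_answer,  # ask Tavily for a direct answer
    }
    if include_domains:
        payload["include_domains"] = include_domains  # domain filtering
    return payload

payload = build_search_payload("latest MCP spec changes", "tvly-XXXX",
                               include_domains=["modelcontextprotocol.io"])
print(json.dumps(payload, indent=2))
```

The `search_depth`, `max_results`, and domain-filter fields correspond directly to the "configurable search parameters" and "domain filtering" that several of the servers above expose as tool arguments.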
Provides AI assistants with a standardized interface to interact with the Todo for AI task management system. It enables users to retrieve project tasks, create new entries, and submit completion feedback through natural language.
Provides access to Douyin (TikTok China) API for searching videos, retrieving user profiles, posts, comments, music, challenges, live streams, and hot trends through the Douyin platform.
Enables web search capabilities through the Tavily API, allowing users to perform web searches and retrieve information from the internet through natural language queries.
A Cloudflare Worker that transforms Cloudflare AI Search (AutoRAG) instances into an MCP server for querying documentation. It enables AI models to search and retrieve relevant information from custom document sets stored in R2 buckets.
Enables web search capabilities through the Tavily API, allowing users to perform web searches and retrieve information from the internet. Includes additional Polygon API integration for enhanced functionality.
Enables web search capabilities through the Tavily API via the Model Context Protocol. Allows users to perform web searches and retrieve information from the internet through natural language queries.
A server that enables document searching using Vertex AI with Gemini grounding, improving search results by grounding responses in private data stored in Vertex AI Datastore.
A standalone proxy that transforms any OpenAPI or Swagger-described REST API into an MCP server by mapping API operations to executable MCP tools. It enables AI clients to interact with existing web services through automated HTTP requests based on their official documentation.
Automatically converts Swagger/OpenAPI specifications into dynamic MCP tools, enabling interaction with any REST API through natural language by loading specs from local files or URLs.
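The spec-to-tools mapping that the two entries above describe can be sketched as a walk over the spec's `paths` object, emitting one tool descriptor per HTTP operation. This is a simplified illustration; real converters also translate parameter schemas, resolve `$ref`s, and wire each tool to an actual HTTP call.

```python
def spec_to_tools(spec):
    """Map each OpenAPI operation to a minimal MCP-style tool descriptor."""
    tools = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            if method not in ("get", "post", "put", "patch", "delete"):
                continue  # skip non-operation keys like "parameters"
            tools.append({
                # operationId becomes the tool name; fall back to method_path
                "name": op.get("operationId") or f"{method}_{path.strip('/')}",
                "description": op.get("summary", ""),
                "method": method.upper(),
                "path": path,
            })
    return tools

spec = {
    "paths": {
        "/pets": {
            "get": {"operationId": "listPets", "summary": "List all pets"},
            "post": {"operationId": "createPet", "summary": "Create a pet"},
        }
    }
}
for tool in spec_to_tools(spec):
    print(tool["name"], tool["method"], tool["path"])
```

Invoking a generated tool then amounts to building the HTTP request from `method`, `path`, and the caller-supplied arguments.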
Converts AI Skills (following Claude Skills format) into MCP server resources, enabling LLM applications to discover, access, and utilize self-contained skill directories through the Model Context Protocol. Provides tools to list available skills, retrieve skill details and content, and read supporting files with security protections.
Enables LLMs to perform sophisticated web searches through proxy servers using Tavily's API, supporting comprehensive web searches, direct question answering, and recent news article retrieval with AI-extracted content.
An MCP server that automates converting diverse content sources like WeChat articles, YouTube videos, and various document formats into AI-generated outputs such as podcasts and slide decks via Google NotebookLM. It integrates specialized tools for web scraping, OCR, and file transformation to facilitate seamless content generation through natural language.
Enables web search capabilities within MCP-compatible LLM clients using the Parallel Search API. Designed for everyday, smaller web-search tasks.

A local-first MCP server that enables semantic search over PDF and DOCX documents using structure-aware parsing and vector storage. It allows users to query their local knowledge base through Claude Code without cloud dependencies or GPU requirements.
Enables semantic search over markdown files to find related notes by meaning rather than keywords, and automatically detect duplicate content before creating new notes.
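Matching notes by meaning rather than keywords, as the entry above describes, typically reduces to cosine similarity between embedding vectors. A pure-Python sketch with toy vectors (a real server would obtain the vectors from an embedding model):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Toy embeddings: in practice these come from an embedding model.
notes = {
    "meeting-notes.md": [0.9, 0.1, 0.0],
    "recipe-ideas.md": [0.1, 0.8, 0.3],
}
query_vec = [0.85, 0.15, 0.05]

best = max(notes, key=lambda name: cosine_similarity(query_vec, notes[name]))
print(best)  # meeting-notes.md
```

Duplicate detection works the same way: a new note whose vector scores above a similarity threshold against an existing note is flagged before it is created.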
Enables users to control the cursor in Figma through verbal commands via an agentic AI, streamlining the design process with a new interaction method.