Jina AI Remote MCP Server
A remote Model Context Protocol (MCP) server that provides access to Jina Reader, Embeddings and Reranker APIs with a suite of URL-to-markdown, web search, image search, and embeddings/reranker tools:
| Tool | Description | Jina API Key Required? |
|------|-------------|------------------------|
| | Get current contextual information for localized, time-aware responses | No |
| | Extract clean, structured content from web pages as markdown | Optional* |
| | Capture high-quality screenshots of web pages | Optional* |
| | Analyze web pages for last update/publish datetime with confidence scores | No |
| | Search the entire web for current information and news | Yes |
| | Search academic papers and preprints on the arXiv repository | Yes |
| | Search for images across the web (similar to Google Images) | Yes |
| | Expand and rewrite search queries using the query expansion model | Yes |
| | Read multiple web pages in parallel for efficient content extraction | Optional* |
| | Run multiple web searches in parallel for comprehensive topic coverage and diverse perspectives | Yes |
| | Run multiple arXiv searches in parallel for comprehensive research coverage and diverse academic angles | Yes |
| | Rerank documents by relevance to a query | Yes |
| | Get top-k semantically unique strings | Yes |
| | Get top-k semantically unique images | Yes |
\* Tools marked "Optional" work without an API key but are subject to rate limits. For higher rate limits and better performance, use a Jina API key. You can get a free Jina API key from https://jina.ai
Usage
For clients that support remote MCP servers:
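As a sketch, a client with native remote MCP support is usually configured with just the server URL. The endpoint path and header below are assumptions (check the server's published endpoint); the `Authorization` header is only needed when you supply a Jina API key:

```json
{
  "mcpServers": {
    "jina-mcp-server": {
      "url": "https://mcp.jina.ai/sse",
      "headers": {
        "Authorization": "Bearer ${JINA_API_KEY}"
      }
    }
  }
}
```

Exact key names (`url`, `headers`) vary by client, so consult your client's MCP configuration docs.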
For clients that do not support remote MCP servers yet, you need mcp-remote, a local proxy, to connect to the remote MCP server.
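A minimal sketch of such a proxy configuration, assuming the same `https://mcp.jina.ai/sse` endpoint as above (mcp-remote is the npm package of that name; the header argument is optional):

```json
{
  "mcpServers": {
    "jina-mcp-server": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote",
        "https://mcp.jina.ai/sse",
        "--header", "Authorization: Bearer ${JINA_API_KEY}"
      ]
    }
  }
}
```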
For Claude Code:
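One plausible one-liner using Claude Code's `claude mcp add` command (the endpoint URL is an assumption; adjust the server name as you like):

```shell
claude mcp add --transport sse jina https://mcp.jina.ai/sse \
  --header "Authorization: Bearer ${JINA_API_KEY}"
```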
For OpenAI Codex: find ~/.codex/config.toml and add the following:
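A sketch of the TOML entry, assuming Codex's `[mcp_servers.<name>]` convention and the mcp-remote proxy (Codex reads stdio servers, so the remote endpoint is bridged through the proxy; the endpoint URL is an assumption):

```toml
[mcp_servers.jina]
command = "npx"
args = ["-y", "mcp-remote", "https://mcp.jina.ai/sse"]
```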
Troubleshooting
I got stuck in a tool calling loop - what happened?
This is a common issue with LMStudio when the default context window is 4096 tokens and you're using a thinking model like gpt-oss-120b or qwen3-4b-thinking. As thinking and tool calls accumulate, once you hit the context window limit the AI starts losing track of the beginning of the task, trapping it in a rolling context window.
The solution is to load the model with enough context length to contain the full tool calling chain and thought process.

I can't see all tools.
Some MCP clients have local caching and do not actively update tool definitions. If you're not seeing all the available tools or if tools seem outdated, you may need to remove and re-add the jina-mcp-server to your MCP client configuration. This will force the client to refresh its cached tool definitions. In LMStudio, you can click the refresh button to load new tools.

Claude Desktop says "Server disconnected" on Windows
Cursor and Claude Desktop on Windows have a bug where spaces inside args aren't escaped when they invoke npx, which ends up mangling these values. You can work around it using:
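One common workaround (documented by the mcp-remote project) is to move the space-containing value into an environment variable so that no arg contains a space. The sketch below assumes the `https://mcp.jina.ai/sse` endpoint; note there is deliberately no space after `Authorization:` in the args:

```json
{
  "mcpServers": {
    "jina-mcp-server": {
      "command": "cmd",
      "args": [
        "/c", "npx", "-y", "mcp-remote",
        "https://mcp.jina.ai/sse",
        "--header", "Authorization:${AUTH_HEADER}"
      ],
      "env": {
        "AUTH_HEADER": "Bearer <your-jina-api-key>"
      }
    }
  }
}
```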
Cursor shows a red dot on this MCP status
Likely a UI bug in Cursor; the MCP works correctly despite the red dot. You can toggle it off and on to "restart" the MCP if you find the red dot annoying (in fact, since you are using this as a remote MCP, it's not a real "server restart" but mostly a local proxy restart).

My LLM never uses some tools
Assuming all tools are enabled in your MCP client but the LLM still never uses some tools, or favors some over others: this is pretty common when an LLM is trained with a specific set of tools. For example, we rarely see parallel_* tools being used organically by LLMs unless they are explicitly instructed to do so. Some research suggests LLMs must be trained to use such parallel tools. Models like Qwen3-Next natively prefer to call the singleton version with multiple queries in an array to achieve parallelism (which our MCP also supports now). Either way, in Cursor, you can add the following rule to your .mdc file:
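A hypothetical rule file as a starting point (the frontmatter follows Cursor's .mdc convention; the tool names are illustrative, so match them to the ones shown in your client):

```
---
description: Prefer parallel Jina MCP tools for research tasks
alwaysApply: true
---

When a task requires researching several subtopics or sources, call the
parallel_* tools (e.g. parallel_search_web, parallel_read_url) instead of
issuing single searches one at a time.
```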
Developer Guide
Local Development
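A typical local loop for a Cloudflare Workers project, assuming standard Wrangler tooling (your package scripts may differ):

```shell
npm install
npx wrangler dev   # serve the Worker locally, by default on http://localhost:8787
```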
Deploy to Cloudflare Workers
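Deployment is typically a single Wrangler command, assuming your `wrangler.toml` (or `wrangler.jsonc`) is already configured for your account:

```shell
npx wrangler deploy
```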
This will deploy your MCP server to a URL like `jina-mcp-server.<your-account>.workers.dev/sse`.