MCP Research Server (arXiv)
Provides tools for searching arXiv papers by topic and extracting paper metadata, including titles, authors, abstracts, PDF links, and publish dates.
This project is a hands-on implementation of the Model Context Protocol (MCP) to demonstrate how an AI client can dynamically discover and call tools exposed by a server.
The server exposes research-related tools backed by arXiv, and the client uses an LLM (Mistral via Ollama) to decide when and how to call those tools.
This project is intentionally simple and educational, built to understand MCP fundamentals end-to-end.
What is MCP?
Model Context Protocol (MCP) is a standard that allows:
A server to expose tools (functions + schemas)
A client (LLM-powered) to discover available tools dynamically
The LLM to decide when to call a tool
Tool results to be fed back into the model for final reasoning
Think of MCP as a clean contract between:
Tools
Schemas
LLM reasoning
Execution
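Concretely, the "contract" for a tool is just a name, a description, and a JSON Schema for its inputs. Below is an illustrative sketch of what the server might advertise for search_papers; the field names follow the MCP tool-listing shape, but the exact schema text here is an assumption, not the server's literal output:

```python
# Illustrative sketch: the schema an MCP server advertises for one tool.
# The LLM sees only this contract, never the implementation behind it.
search_papers_schema = {
    "name": "search_papers",
    "description": "Search arXiv for papers on a topic and store their metadata locally.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "topic": {"type": "string", "description": "Topic to search for"},
            "max_results": {"type": "integer", "description": "Maximum papers to return"},
        },
        "required": ["topic"],
    },
}

def validate_call(schema: dict, args: dict) -> bool:
    """Check a proposed tool call against the advertised contract."""
    props = schema["inputSchema"]["properties"]
    required = schema["inputSchema"].get("required", [])
    return all(k in args for k in required) and all(k in props for k in args)

print(validate_call(search_papers_schema, {"topic": "quantum computing", "max_results": 5}))  # True
print(validate_call(search_papers_schema, {"query": "typo"}))  # False (unknown arg, missing topic)
```

Because the client can validate arguments against the schema before executing anything, the LLM's reasoning and the tool's execution stay cleanly decoupled.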
Project Architecture
MCP Server (research_server.py)
The MCP server exposes two tools:
1. search_papers
Searches arXiv for papers by topic and stores metadata locally.
Inputs
topic (string)
max_results (int)
Returns
List of arXiv paper IDs
2. extract_info
Fetches stored metadata for a given paper ID.
Inputs
paper_id (string)
Returns
Title, authors, abstract, PDF link, publish date
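The two tools share a simple local store: search_papers writes metadata to disk, and extract_info reads it back by ID. The sketch below shows one way that store could work; the directory layout (papers/<topic>/papers_info.json) and field names are assumptions for illustration, not the server's exact implementation:

```python
import json
from pathlib import Path

# Illustrative sketch of the local metadata store shared by the two tools.
# Layout assumed here: papers/<topic>/papers_info.json, keyed by arXiv ID.
PAPER_DIR = Path("papers")

def save_paper_info(topic: str, papers: dict) -> None:
    """Merge paper metadata (keyed by arXiv ID) into the topic's JSON file."""
    topic_dir = PAPER_DIR / topic.lower().replace(" ", "_")
    topic_dir.mkdir(parents=True, exist_ok=True)
    path = topic_dir / "papers_info.json"
    existing = json.loads(path.read_text()) if path.exists() else {}
    existing.update(papers)
    path.write_text(json.dumps(existing, indent=2))

def extract_info(paper_id: str) -> str:
    """Scan every topic file for the given arXiv ID and return its metadata."""
    for info_file in PAPER_DIR.glob("*/papers_info.json"):
        papers = json.loads(info_file.read_text())
        if paper_id in papers:
            return json.dumps(papers[paper_id], indent=2)
    return f"No saved information for paper {paper_id}."

# Example usage with a made-up paper record:
save_paper_info("mcp", {"2401.00001": {"title": "Example Paper", "authors": ["A. Author"]}})
print(extract_info("2401.00001"))
```

Keeping the store on disk means extract_info works across runs: once search_papers has fetched a topic, its papers can be looked up without hitting arXiv again.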
The server runs over stdio transport, which is ideal for local development and for debugging with the MCP Inspector.
MCP Client (mcp_chatbot.py)
The client:
Connects to the MCP server over stdio
Discovers available tools dynamically
Uses Mistral (via Ollama) to decide:
Whether a tool is needed
Which tool to call
What arguments to pass
Calls the tool through MCP
Feeds the tool result back to the LLM for a final response
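The loop above can be sketched in a few lines. To keep the example self-contained, fake_llm below stands in for Mistral-via-Ollama and the TOOLS registry stands in for MCP's session.call_tool; both are stubs for illustration, not the client's real API:

```python
import json

def fake_llm(messages):
    """Stand-in for the model: request a tool on the first turn, then answer."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "search_papers",
                              "arguments": {"topic": "mcp", "max_results": 2}}}
    return {"content": "Found 2 papers on MCP."}

def search_papers(topic: str, max_results: int):
    """Stubbed tool result; the real tool would query arXiv via MCP."""
    return [f"{topic}-paper-{i}" for i in range(max_results)]

TOOLS = {"search_papers": search_papers}  # built from the discovered tool list

def chat(query: str) -> str:
    messages = [{"role": "user", "content": query}]
    while True:
        reply = fake_llm(messages)
        if "tool_call" not in reply:
            return reply["content"]  # final answer: stop looping
        call = reply["tool_call"]
        result = TOOLS[call["name"]](**call["arguments"])  # execute the tool
        messages.append({"role": "tool", "content": json.dumps(result)})

print(chat("Find papers on MCP"))  # -> Found 2 papers on MCP.
```

The important design point is that the loop is generic: it dispatches whatever tool name and arguments the model produces, so adding a new tool to the server requires no client changes.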
No tool logic is hardcoded in the client; the LLM reasons entirely from the tool schemas it discovers at runtime.
LLM Used
Model: Mistral
Runtime: Ollama (local)
Reason: Fast, runs locally, and great for experimentation with no API costs
Make sure Ollama is running: