# MCP Research Assistant
A comprehensive Model Context Protocol (MCP) setup that provides powerful tools for research, file management, and web content fetching. This project integrates multiple MCP servers to extend your AI assistant's capabilities.
## Features
- **Research Tool**: Search and manage academic papers from arXiv
- **Filesystem Tool**: Browse, read, and manage project files
- **Fetch Tool**: Retrieve content from websites and APIs
- **Multi-LLM Support**: Works with Claude, Gemini, and other AI models
- **Local Storage**: Automatically saves research data organized by topic
## Prerequisites
- Python 3.13 or higher
- `uv` package manager (recommended) or `pip`
- API keys for your chosen LLM providers
- Claude Desktop (for MCP integration)
## Quick Start
### 1. Clone and Setup
```bash
git clone <your-repo-url>
cd mcp_project
```
### 2. Install Dependencies
```bash
# Install uv if you haven't already
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create virtual environment and install dependencies
uv sync
```
### 3. Configure Environment Variables
Create a `.env` file in your project root:
```env
ANTHROPIC_API_KEY=your_anthropic_api_key_here
```
### 4. Configure Claude Desktop
Create or update your Claude Desktop configuration file:
**Location**: `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS)
```json
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-filesystem",
"."
],
"cwd": "/path/to/your/mcp_project"
},
"research": {
"command": "/path/to/your/mcp_project/.venv/bin/python",
"args": [
"/path/to/your/mcp_project/research_server.py"
],
"cwd": "/path/to/your/mcp_project"
},
"fetch": {
"command": "/path/to/your/.local/bin/uvx",
"args": ["mcp-server-fetch"],
"cwd": "/path/to/your/mcp_project"
}
}
}
```
**Important**: Replace `/path/to/your/mcp_project` with your actual project path.
### 5. Restart Claude Desktop
Restart Claude Desktop completely to load the new configuration.
## How to Use
### Research Tool
**Search for Papers:**
```
Search for 5 papers about machine learning
```
**Get Paper Details:**
```
Show me information about paper ID 1234.5678
```
**Browse Saved Papers:**
```
What papers do I have saved on physics?
```
### Filesystem Tool
**Browse Files:**
```
List all files in my project directory
```
**Read Files:**
```
Show me the contents of research_server.py
```
**Create Files:**
```
Create a new Python script for data analysis
```
### Fetch Tool
**Get Web Content:**
```
Fetch the latest Python documentation
```
**API Calls:**
```
Get current weather data from an API
```
## Available Tools
### Research Server Tools
| Tool | Description | Parameters |
|------|-------------|------------|
| `search_papers` | Search arXiv for papers | `topic`, `max_results` |
| `extract_info` | Get paper details | `paper_id` |
| `get_available_folders` | List saved topics | None |
### Filesystem Server Tools
| Tool | Description |
|------|-------------|
| `read_file` | Read file contents |
| `write_file` | Write to files |
| `list_dir` | List directory contents |
| `delete_file` | Delete files |
### Fetch Server Tools
| Tool | Description |
|------|-------------|
| `fetch` | Fetch content from URLs |
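Under the hood, a fetch-style tool retrieves a page and reduces it to readable text (the real `mcp-server-fetch` converts HTML to markdown). A rough standard-library-only sketch of that extraction step, skipping `script` and `style` content:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from HTML, ignoring script/style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # >0 while inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

def html_to_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.parts)
```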
## Project Structure
```
mcp_project/
├── research_server.py       # Main research MCP server
├── mcp_chatbot_L7.py        # Chatbot with LLM integration
├── pyproject.toml           # Project configuration
├── requirements.txt         # Python dependencies
├── uv.lock                  # Dependency lock file
├── papers/                  # Research data storage
│   └── [topic_name]/        # Organized by topic
│       └── papers_info.json # Paper metadata
├── .env                     # Environment variables
└── README.md                # This file
```
## Configuration Details
### Research Server Configuration
The research server automatically:
- Creates topic-based directories in `papers/`
- Saves paper metadata as JSON files
- Provides search and retrieval functions
- Integrates with arXiv API
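The storage layout described above could be implemented along these lines; this is a sketch, and the function name and slug rule are illustrative (only the `papers/` directory and `papers_info.json` filename come from the project layout):

```python
import json
import os
import re

def save_paper_info(topic: str, paper_id: str, info: dict,
                    base_dir: str = "papers") -> str:
    """Merge one paper's metadata into papers/<topic>/papers_info.json,
    keyed by its arXiv ID (e.g. "1234.5678")."""
    # Normalize the topic into a filesystem-safe folder name
    topic_dir = os.path.join(base_dir, re.sub(r"\W+", "_", topic.strip().lower()))
    os.makedirs(topic_dir, exist_ok=True)
    path = os.path.join(topic_dir, "papers_info.json")

    papers = {}
    if os.path.exists(path):
        with open(path) as f:
            papers = json.load(f)
    papers[paper_id] = info

    with open(path, "w") as f:
        json.dump(papers, f, indent=2)
    return path
```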
### Filesystem Server Configuration
The filesystem server:
- Operates within your project directory
- Provides full file management capabilities
- Uses relative paths for portability
### Fetch Server Configuration
The fetch server:
- Handles web requests and API calls
- Supports custom user agents
- Can ignore robots.txt restrictions

*Screenshot showing the MCP Research Assistant successfully running with all tools working*
## Development
### Adding New Tools
1. Edit `research_server.py` to add new functions
2. Use the `@mcp.tool()` decorator
3. Test with MCP Inspector
4. Update documentation
### Customizing LLM Behavior
1. Edit `mcp_chatbot_L7.py`
2. Modify tool descriptions and parameters
3. Add custom prompts and resources