GPT Researcher MCP Server
Why GPT Researcher MCP?
While LLM apps can access web search tools with MCP, GPT Researcher MCP delivers deep research results. Standard search tools return raw results requiring manual filtering, often containing irrelevant sources and wasting context window space.
GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted and up-to-date information. Though slightly slower than standard search (~30 seconds wait), it delivers:
- Higher quality information
- Optimized context usage
- Comprehensive results
- Better reasoning for LLMs
Related MCP server: Docs Fetch MCP Server
Claude Desktop Demo
https://github.com/user-attachments/assets/ef97eea5-a409-42b9-8f6d-b82ab16c52a8
Quick Start with Claude Desktop
Want to use this with Claude Desktop right away? Here's the fastest path:
Install dependencies:

```shell
git clone https://github.com/assafelovic/gptr-mcp.git
cd gptr-mcp
pip install -r requirements.txt
```

Set up your Claude Desktop config at ~/Library/Application Support/Claude/claude_desktop_config.json:

```json
{
  "mcpServers": {
    "gptr-mcp": {
      "command": "python",
      "args": ["/absolute/path/to/gpt-researcher/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key-here",
        "TAVILY_API_KEY": "your-tavily-key-here"
      }
    }
  }
}
```

Restart Claude Desktop and start researching!
For detailed setup instructions, see the full Claude Desktop Integration section below.
Resources
research_resource: Get web resources related to a given task via research.
Primary Tools
- deep_research: Performs deep web research on a topic, finding the most reliable and relevant information
- quick_search: Performs a fast web search optimized for speed over quality, returning search results with snippets. Supports any GPTR-supported web retriever such as Tavily, Bing, Google, etc. Learn more here
- write_report: Generate a report based on research results
- get_research_sources: Get the sources used in the research
- get_research_context: Get the full context of the research
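These tools are invoked over standard MCP JSON-RPC. As a sketch (the argument name `query` is an assumption, modeled on the quick_search example in the n8n section of this document), a `tools/call` request for deep_research might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deep_research",
    "arguments": { "query": "latest advances in quantum computing" }
  }
}
```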
Prompts
research_query: Create a research query prompt
Prerequisites
Before running the MCP server, make sure you have:
Python 3.11 or higher installed
Important: GPT Researcher >=0.12.16 requires Python 3.11+
API keys for the services you plan to use:
You can also connect any other web search engines or MCP using GPTR supported retrievers. Check out the docs here
Installation
Clone the GPT Researcher repository:
Install the gptr-mcp dependencies:
Set up your environment variables:
Copy the .env.example file to create a new file named .env:

```shell
cp .env.example .env
```

Edit the .env file, add your API keys, and configure other settings:

```
OPENAI_API_KEY=your_openai_api_key
TAVILY_API_KEY=your_tavily_api_key
```

You can also add any other environment variable for your GPT Researcher configuration.
Running the MCP Server
You can run the MCP server in several ways:
Method 1: Directly using Python
Method 2: Using the MCP CLI (if installed)
Method 3: Using Docker (recommended for production)
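A sketch of the first two methods (the `mcp run` invocation is an assumption based on typical MCP CLI usage, not confirmed by this document):

```shell
# Method 1: run the server directly with Python (from the gptr-mcp directory)
python server.py

# Method 2: via the MCP CLI, if installed
mcp run server.py
```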
Quick Start
The simplest way to run with Docker:
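A minimal sketch, assuming the repository's Dockerfile builds an image exposing port 8000 (image and container names here are illustrative):

```shell
docker build -t gptr-mcp .
docker run -d --name gptr-mcp --env-file .env -p 8000:8000 gptr-mcp
```

The `--env-file .env` flag passes your API keys into the container without baking them into the image.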
For n8n Integration
If you need to connect to an existing n8n network:
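For example, assuming your n8n instance runs on a Docker network named n8n-network (the network name is illustrative):

```shell
# Attach the running gptr-mcp container to n8n's network
docker network connect n8n-network gptr-mcp
# n8n can then reach the server at http://gptr-mcp:8000/sse
```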
Note: The Docker image uses Python 3.11 to meet the requirements of gpt-researcher >=0.12.16. If you encounter errors during the build, ensure you're using the latest Dockerfile from this repository.
Once the server is running, you'll see output indicating that the server is ready to accept connections. You can verify it's working by:
SSE Endpoint: Access the Server-Sent Events endpoint at http://localhost:8000/sse to get a session ID
MCP Communication: Use the session ID to send MCP messages to http://localhost:8000/messages/?session_id=YOUR_SESSION_ID
Testing: Run the test script with python test_mcp_server.py
Important for Docker/n8n Integration:

- The server binds to 0.0.0.0:8000 to work with Docker containers
- Uses SSE transport for web-based MCP communication
- Session management requires getting a session ID from the /sse endpoint first
- Each client connection needs a unique session ID for proper communication
Transport Modes & Best Practices
The GPT Researcher MCP server supports multiple transport protocols and automatically chooses the best one for your environment:
Transport Types
| Transport | Use Case | When to Use |
| --- | --- | --- |
| STDIO | Claude Desktop, local MCP clients | Default for local development |
| SSE | Docker, web clients, n8n integration | Auto-enabled in Docker |
| Streamable HTTP | Modern web deployments | Advanced web deployments |
Automatic Detection
The server automatically detects your environment and selects the appropriate transport.
Environment Variables
| Variable | Description | Default | Example |
| --- | --- | --- | --- |
|  | Force specific transport |  |  |
|  | Force Docker mode | Auto-detected |  |
Configuration Examples
For Claude Desktop (Local)
For Docker/Web Deployment
For n8n MCP Integration
Transport Endpoints
When using SSE or HTTP transports:
- Health Check: GET /health
- SSE Endpoint: GET /sse (get session ID)
- MCP Messages: POST /messages/?session_id=YOUR_SESSION_ID
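These endpoints can be exercised from the command line; a sketch, assuming the default 0.0.0.0:8000 binding:

```shell
# Health check
curl http://localhost:8000/health

# SSE endpoint streams events; Ctrl-C after reading the session ID
curl http://localhost:8000/sse

# Send an MCP message using the session ID obtained above
curl -X POST "http://localhost:8000/messages/?session_id=YOUR_SESSION_ID" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "ping"}'
```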
Best Practices
Local Development: Use default STDIO for Claude Desktop
Production: Use Docker with automatic SSE detection
Testing: Use health endpoints to verify connectivity
n8n Integration: Always use container networking with Docker
Web Deployment: Consider Streamable HTTP for modern clients
Integrating with Claude
You can integrate your MCP server with Claude using:
Claude Desktop Integration - for use with the Claude Desktop application on macOS
For detailed instructions, follow the link above.
Claude Desktop Integration
To integrate your locally running MCP server with Claude for Mac, you'll need to:
Make sure the MCP server is installed and running
Configure Claude Desktop:
Locate or create the configuration file at ~/Library/Application Support/Claude/claude_desktop_config.json
Add your local GPT Researcher MCP server to the configuration with environment variables
Restart Claude to apply the configuration
Important: Environment Variables Required
Claude Desktop launches your MCP server as a separate subprocess, so you must explicitly pass your API keys in the configuration. The server cannot access your shell's environment variables or .env file automatically.
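To see why: a child process only inherits the environment its parent explicitly gives it. This minimal, self-contained Python sketch (not part of the server) mirrors what the env block in the config does:

```python
import os
import subprocess
import sys

# Build the environment for the child process, adding the key explicitly --
# this is what Claude Desktop does with the "env" block in its config.
# The key value here is a placeholder.
child_env = {**os.environ, "OPENAI_API_KEY": "your-openai-key-here"}

# The child can only see the variable because we passed it via env=
result = subprocess.run(
    [sys.executable, "-c",
     "import os; print(os.environ.get('OPENAI_API_KEY', 'MISSING'))"],
    env=child_env,
    capture_output=True,
    text=True,
)
print(result.stdout.strip())
```

Without the explicit `env=child_env`, a key that only lives in your shell profile or `.env` file would print as MISSING.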
Configuration Example
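Mirroring the Quick Start config shown earlier (paths and key values are placeholders):

```json
{
  "mcpServers": {
    "gptr-mcp": {
      "command": "python",
      "args": ["/absolute/path/to/gpt-researcher/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key-here",
        "TAVILY_API_KEY": "your-tavily-key-here"
      }
    }
  }
}
```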
Security Note
Your Claude Desktop config contains sensitive API keys. Protect it:
Never commit this file to version control.
Alternative: Environment Variable Script
For better security, create a wrapper script:
run_gptr_mcp.sh:
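A minimal sketch of such a wrapper (the path and key values are placeholders):

```shell
#!/bin/bash
# Export keys here instead of hard-coding them in Claude's config file
export OPENAI_API_KEY="your-openai-key-here"
export TAVILY_API_KEY="your-tavily-key-here"
exec python /absolute/path/to/gptr-mcp/server.py
```

Remember to make it executable with `chmod +x run_gptr_mcp.sh`.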
Then use it in Claude Desktop:
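For example (the script path is a placeholder):

```json
{
  "mcpServers": {
    "gptr-mcp": {
      "command": "/absolute/path/to/run_gptr_mcp.sh"
    }
  }
}
```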
For complete step-by-step instructions, see the Claude Desktop Integration guide.
Example Usage with Claude
Troubleshooting
If you encounter issues while running the MCP server:
General Issues
- API Keys: Make sure your API keys are correctly set in the .env file
- Python Version: Check that you're using Python 3.11 or higher (required by gpt-researcher >=0.14.0)
- Dependencies: Ensure all dependencies are installed correctly: pip install -r requirements.txt
- Server Logs: Check the server logs for error messages
Docker Issues
Container not accessible:

- Verify the container is running: docker ps | grep gptr-mcp
- Check container logs: docker logs gptr-mcp
- Confirm the server is binding to 0.0.0.0:8000 (logs should show this)
n8n Integration Issues:

- Ensure both containers are on the same Docker network
- Use the container name gptr-mcp as the hostname in n8n
- Set the MCP server URL to: http://gptr-mcp:8000/sse
Session ID Issues:

- The server uses SSE transport, which requires session management
- First, get a session ID by connecting to the /sse endpoint
- Use the session ID in subsequent MCP requests: /messages/?session_id=YOUR_ID
- Each client needs its own session ID
n8n MCP Integration Steps
Get Session ID:

```shell
curl http://gptr-mcp:8000/sse
# Look for: data: /messages/?session_id=XXXXX
```

Initialize MCP:

```shell
curl -X POST http://gptr-mcp:8000/messages/?session_id=YOUR_SESSION_ID \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {"roots": {"listChanged": true}}, "clientInfo": {"name": "n8n-client", "version": "1.0.0"}}}'
```

Call Tools:

```shell
curl -X POST http://gptr-mcp:8000/messages/?session_id=YOUR_SESSION_ID \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "quick_search", "arguments": {"query": "test"}}}'
```
Testing the Server
Run the included test script to verify functionality: python test_mcp_server.py
This will test:
SSE connection and session ID retrieval
MCP initialization
Tool discovery and execution
Claude Desktop Issues
If your MCP server isn't working with Claude Desktop:
Server not appearing in Claude:

- Check that your claude_desktop_config.json syntax is valid JSON
- Ensure you're using absolute paths (not relative)
- Verify that the path to server.py is correct
- Restart Claude Desktop completely
"OPENAI_API_KEY not found" error:
Make sure you added API keys to the
envsection in your configDon't forget both
OPENAI_API_KEYandTAVILY_API_KEYAPI keys should be the actual keys, not placeholders
Tools not showing up:

- Look for the tools icon in Claude Desktop
- Check that the Claude Desktop config file is in the right location:
  - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  - Windows: %APPDATA%\Claude\claude_desktop_config.json
Python/Permission issues:

- Make sure Python is accessible from the command line: python --version
- Try using the full Python path: "command": "/usr/bin/python3" or "command": "python3"
- Check file permissions on your server.py file
Still not working?

- Test the server manually: python server.py (should show the STDIO transport message)
- Check Claude Desktop logs (if available)
- Try the alternative script method from the integration section above
Next Steps
Explore the MCP protocol documentation to better understand how to integrate with Claude
Learn about GPT Researcher's core features to enhance your research capabilities
Check out the Advanced Usage guide for more configuration options
License
This project is licensed under the MIT License - see the LICENSE file for details.