🔍 GPT Researcher MCP Server
Why GPT Researcher MCP?
While LLM apps can access web search tools with MCP, GPT Researcher MCP delivers deep research results. Standard search tools return raw results requiring manual filtering, often containing irrelevant sources and wasting context window space.
GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted, and up-to-date information. Though slightly slower than standard search (expect a wait of roughly 30 seconds), it delivers:
- ✨ Higher quality information
- 📊 Optimized context usage
- 🔎 Comprehensive results
- 🧠 Better reasoning for LLMs
💻 Claude Desktop Demo
https://github.com/user-attachments/assets/ef97eea5-a409-42b9-8f6d-b82ab16c52a8
Resources
- research_resource: Get web resources related to a given task via research.
Primary Tools
- deep_research: Performs deep web research on a topic, finding the most reliable and relevant information
- quick_search: Performs a fast web search optimized for speed over quality, returning search results with snippets. Supports any GPTR-supported web retriever such as Tavily, Bing, Google, etc.
- write_report: Generate a report based on research results
- get_research_sources: Get the sources used in the research
- get_research_context: Get the full context of the research
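To make the tool list concrete, here is a minimal sketch of calling quick_search from a client built on the official MCP Python SDK. The server entry point (server.py) and the query are assumptions for illustration, not part of this project's documented interface:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    # Assumption: the server is started as "python server.py"
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the quick_search tool with an example query
            result = await session.call_tool(
                "quick_search", {"query": "latest MCP developments"}
            )
            print(result)


asyncio.run(main())
```

The same pattern applies to deep_research and the other tools; only the tool name and argument dictionary change.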
Prompts
- research_query: Create a research query prompt
Prerequisites
Before running the MCP server, make sure you have:
- Python 3.10 or higher installed
- API keys for the services you plan to use (e.g. OPENAI_API_KEY, plus the key for your chosen web retriever, such as TAVILY_API_KEY)
⚙️ Installation
- Clone the GPT Researcher repository:
- Install the gptr-mcp dependencies:
- Set up your environment variables:
  - Copy the .env.example file to create a new file named .env
  - Edit the .env file and add your API keys and configure other settings
You can also add any other env variable for your GPT Researcher configuration.
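The steps above can be sketched as follows; the repository URL, file names, and key values are assumptions based on the gptr-mcp project layout, so adjust them for your setup:

```shell
# 1. Clone the repository (URL assumed)
git clone https://github.com/assafelovic/gptr-mcp.git
cd gptr-mcp

# 2. Install the dependencies
pip install -r requirements.txt

# 3. Create your .env from the template and add your keys
cp .env.example .env
echo "OPENAI_API_KEY=sk-your-openai-key" >> .env
echo "TAVILY_API_KEY=tvly-your-tavily-key" >> .env
```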
🚀 Running the MCP Server
You can start the MCP server in two ways:
Method 1: Directly using Python
Method 2: Using the MCP CLI (if installed)
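As a sketch of the two methods (assuming the server entry point is server.py, as in the gptr-mcp repository):

```shell
# Method 1: run the server directly with Python
python server.py

# Method 2: run it through the MCP CLI (if installed)
mcp run server.py
```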
Once the server is running, you'll see output indicating that the server is ready to accept connections.
Integrating with Claude
You can integrate your MCP server with Claude using:
Claude Desktop Integration - For use with the Claude desktop application on macOS
For detailed instructions, follow the link above.
💻 Claude Desktop Integration
To integrate your locally running MCP server with Claude for Mac, you'll need to:
- Make sure the MCP server is installed and running
- Configure Claude Desktop:
  - Locate or create the configuration file at ~/Library/Application Support/Claude/claude_desktop_config.json
  - Add your local GPT Researcher MCP server to the configuration
  - Restart Claude to apply the configuration
For complete step-by-step instructions, see the Claude Desktop Integration guide.
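For illustration, a claude_desktop_config.json entry might look like the following; the server name, path, and key values are placeholders you must replace:

```json
{
  "mcpServers": {
    "gpt-researcher": {
      "command": "python",
      "args": ["/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key",
        "TAVILY_API_KEY": "tvly-your-tavily-key"
      }
    }
  }
}
```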
📝 Example Usage with Claude
🔧 Troubleshooting
If you encounter issues while running the MCP server:
- Make sure your API keys are correctly set in the .env file
- Check that you're using Python 3.10 or higher
- Ensure all dependencies are installed correctly
- Check the server logs for error messages
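A few quick sanity checks for the points above (run from the repository root; file names are assumptions):

```shell
# Confirm the Python version is 3.10 or higher
python --version

# Confirm your keys made it into the .env file
grep API_KEY .env

# Reinstall dependencies if anything is missing
pip install -r requirements.txt
```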
👣 Next Steps
- Explore the MCP protocol documentation to better understand how to integrate with Claude
- Learn about GPT Researcher's core features to enhance your research capabilities
- Check out the Advanced Usage guide for more configuration options
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
📞 Support / Contact