The OpenAI WebSearch MCP Server enables AI assistants to perform intelligent web searches with advanced reasoning capabilities and real-time information access.
Core Capabilities:
Intelligent web search using OpenAI's reasoning models for up-to-date information retrieval
Multi-model support including GPT-4o, GPT-4o-mini, GPT-5, GPT-5-mini, GPT-5-nano, o3, and o4-mini
Configurable reasoning effort with low, medium, high, or minimal levels for optimal performance
Multiple search modes - fast iterations with lightweight models or comprehensive deep research
Localized search results based on user location (city, timezone, region, country)
Adjustable search context with low, medium, or high settings
Multiple API versions including web_search_preview and web_search_preview_2025_03_11
Integration & Configuration:
Seamless integration with AI assistants like Claude Desktop, Cursor, and Claude Code via MCP protocol
Easy configuration through environment variables for API keys and default models
Flexible parameters for customizing each search query including input type and model selection
Provides access to OpenAI's web search tool for querying current information from the web
OpenAI WebSearch MCP Server 🔍
An advanced MCP server that provides intelligent web search capabilities using OpenAI's reasoning models. Perfect for AI assistants that need up-to-date information with smart reasoning capabilities.
✨ Features
🧠 Reasoning Model Support: Full compatibility with OpenAI's latest reasoning models (gpt-5, gpt-5-mini, gpt-5-nano, o3, o4-mini)
⚡ Smart Effort Control: Intelligent reasoning_effort defaults based on use case
🔄 Multi-Mode Search: Fast iterations with gpt-5-mini or deep research with gpt-5
🌍 Localized Results: Support for location-based search customization
📝 Rich Descriptions: Complete parameter documentation for easy integration
🔧 Flexible Configuration: Environment variable support for easy deployment
🚀 Quick Start
One-Click Installation for Claude Desktop
Replace sk-xxxx with your OpenAI API key from the OpenAI Platform.
⚙️ Configuration
Claude Desktop
Add to your claude_desktop_config.json:
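A minimal sketch of the entry, assuming the server is launched via uvx and the command is named openai-websearch-mcp (inferred from the module name; adjust to your installation):

```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "sk-xxxx"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so the new server is picked up.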
Cursor
Add to your MCP settings in Cursor:
Open Cursor Settings (Cmd/Ctrl + ,)
Search for "MCP" or go to Extensions → MCP
Add server configuration:
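A sketch of the server entry, using the same assumed uvx command as in the Claude Desktop example:

```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "sk-xxxx"
      }
    }
  }
}
```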
Claude Code
Claude Code automatically detects MCP servers configured for Claude Desktop. Use the same configuration as above for Claude Desktop.
Local Development
For local testing, use the absolute path to your virtual environment:
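For example, assuming the project lives at /path/to/openai-websearch-mcp (hypothetical path) with a .venv created inside it:

```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "/path/to/openai-websearch-mcp/.venv/bin/openai-websearch-mcp",
      "env": {
        "OPENAI_API_KEY": "sk-xxxx"
      }
    }
  }
}
```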
🛠️ Available Tools
openai_web_search
Intelligent web search with reasoning model support.
Parameters
| Parameter | Type | Description | Default |
|---|---|---|---|
| `input` | string | The search query or question to search for | Required |
| `model` | string | AI model to use. Supports gpt-4o, gpt-4o-mini, gpt-5, gpt-5-mini, gpt-5-nano, o3, o4-mini | Environment default |
| `reasoning_effort` | string | Reasoning effort level: low, medium, high, minimal | Smart default |
| `type` | string | Web search API version (web_search_preview or web_search_preview_2025_03_11) | — |
| `search_context_size` | string | Context amount: low, medium, high | — |
| `user_location` | object | Optional location (city, region, country, timezone) for localized results | — |
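For reference, a hypothetical arguments payload for the tool, using the parameter names from the table above (values are illustrative):

```json
{
  "input": "Latest developments in AI reasoning models",
  "model": "gpt-5-mini",
  "reasoning_effort": "low",
  "search_context_size": "medium",
  "user_location": {
    "city": "San Francisco",
    "region": "California",
    "country": "US",
    "timezone": "America/Los_Angeles"
  }
}
```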
💬 Usage Examples
Once configured, simply ask your AI assistant to search for information using natural language:
Quick Search
"Search for the latest developments in AI reasoning models using openai_web_search"
Deep Research
"Use openai_web_search with gpt-5 and high reasoning effort to provide a comprehensive analysis of quantum computing breakthroughs"
Localized Search
"Search for local tech meetups in San Francisco this week using openai_web_search"
The AI assistant will automatically use the openai_web_search tool with appropriate parameters based on your request.
🤖 Model Selection Guide
Quick Multi-Round Searches 🚀
Recommended: gpt-5-mini with reasoning_effort: "low"
Use Case: Fast iterations, real-time information, multiple quick queries
Benefits: Lower latency, cost-effective for frequent searches
Deep Research 🔬
Recommended: gpt-5 with reasoning_effort: "medium" or "high"
Use Case: Comprehensive analysis, complex topics, detailed investigation
Benefits: Multi-round reasoned results, no need for agent iterations
Model Comparison
| Model | Reasoning | Default Effort | Best For |
|---|---|---|---|
| gpt-4o | ❌ | N/A | Standard search |
| gpt-4o-mini | ❌ | N/A | Basic queries |
| gpt-5-mini | ✅ | low | Fast iterations |
| gpt-5 | ✅ | medium | Deep research |
| gpt-5-nano | ✅ | Smart default | Balanced approach |
| o3 | ✅ | Smart default | Advanced reasoning |
| o4-mini | ✅ | Smart default | Efficient reasoning |
📦 Installation
Using uvx (Recommended)
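A sketch, assuming the distribution is published as openai-websearch-mcp (inferred from the module name):

```bash
# uvx downloads and runs the server in an isolated environment; no separate install step needed
OPENAI_API_KEY=sk-xxxx uvx openai-websearch-mcp
```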
Using pip
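Equivalently with pip, under the same assumed distribution name:

```bash
# Install from PyPI, then run the installed entry point
pip install openai-websearch-mcp
OPENAI_API_KEY=sk-xxxx openai-websearch-mcp
```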
From Source
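A typical from-source flow (substitute the real repository URL):

```bash
# Clone the repository and install in editable mode
git clone <repository-url>
cd openai-websearch-mcp
pip install -e .
```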
👩‍💻 Development
Setup Development Environment
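One possible setup using uv (a plain venv plus pip install -e . works just as well):

```bash
# Create a virtual environment and install the project in editable mode
uv venv
uv pip install -e .
```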
Environment Variables
| Variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | Your OpenAI API key | Required |
| `OPENAI_DEFAULT_MODEL` | Default model to use | — |
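For example, in the shell before starting your MCP client (OPENAI_API_KEY is the standard OpenAI variable; the default-model variable name shown above is an assumption, so verify it against the package):

```bash
# Required: your OpenAI API key
export OPENAI_API_KEY=sk-xxxx
# Optional: default model (variable name assumed)
export OPENAI_DEFAULT_MODEL=gpt-5-mini
```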
🐛 Debugging
Using MCP Inspector
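The MCP Inspector can wrap the server command directly; for example, with the assumed uvx command from the installation section:

```bash
# Launch the inspector and connect it to the server process
npx @modelcontextprotocol/inspector uvx openai-websearch-mcp
```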
Common Issues
Issue: "Unsupported parameter: 'reasoning.effort'" Solution: This occurs when using non-reasoning models (gpt-4o, gpt-4o-mini) with reasoning_effort parameter. The server automatically handles this by only applying reasoning parameters to compatible models.
Issue: "No module named 'openai_websearch_mcp'" Solution: Ensure you've installed the package correctly and your Python path includes the package location.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
🤖 Generated with Claude Code
🔥 Powered by OpenAI's Web Search API
🛠️ Built on the Model Context Protocol