openai_web_search
Perform web searches using AI reasoning models to answer queries, supporting quick iterations or deep research with configurable reasoning effort and context size.
Instructions
OpenAI Web Search with reasoning models.
For quick multi-round searches: Use 'gpt-5-mini' with reasoning_effort='low' for fast iterations.
For deep research: Use 'gpt-5' with reasoning_effort='medium' or 'high'. The result is already multi-round reasoned, so agents don't need continuous iterations.
Supports: gpt-4o (no reasoning), gpt-5/gpt-5-mini/gpt-5-nano, o3/o4-mini (with reasoning).
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| input | Yes | The search query or question to search for | |
| model | No | AI model to use. Defaults to OPENAI_DEFAULT_MODEL env var or gpt-5-mini | |
| reasoning_effort | No | Reasoning effort level for supported models (gpt-5, o3, o4-mini). Default: low for gpt-5-mini, medium for others | |
| type | No | Web search API version to use | web_search_preview |
| search_context_size | No | Amount of context to include in search results | medium |
| user_location | No | Optional user location for localized search results | |
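Putting the schema together, a call might pass arguments like the following sketch (all values are illustrative, not required defaults):

```python
# Illustrative arguments for an openai_web_search call; values are examples only.
request = {
    "input": "best coffee shops near the Mission District",  # required
    "model": "gpt-5-mini",             # quick-iteration default
    "reasoning_effort": "low",
    "type": "web_search_preview",      # schema default
    "search_context_size": "medium",   # schema default
    "user_location": {                 # optional; localizes results
        "type": "approximate",
        "city": "San Francisco",
        "country": "US",
        "region": "California",
        "timezone": "America/Los_Angeles",
    },
}
print(sorted(request))
```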
Implementation Reference
- src/openai_websearch_mcp/server.py:33-79 (handler)

  Main handler function that implements the `openai_web_search` tool. It reads environment variables, determines the reasoning effort based on the model, constructs the API request with the web search tool and optional user location, and calls OpenAI's `responses.create` to get the search result.

  ```python
  def openai_web_search(
      input: Annotated[str, Field(description="The search query or question to search for")],
      model: Annotated[
          Optional[Literal["gpt-4o", "gpt-4o-mini", "gpt-5", "gpt-5-mini", "gpt-5-nano", "o3", "o4-mini"]],
          Field(description="AI model to use. Defaults to OPENAI_DEFAULT_MODEL env var or gpt-5-mini"),
      ] = None,
      reasoning_effort: Annotated[
          Optional[Literal["low", "medium", "high", "minimal"]],
          Field(description="Reasoning effort level for supported models (gpt-5, o3, o4-mini). Default: low for gpt-5-mini, medium for others"),
      ] = None,
      type: Annotated[
          Literal["web_search_preview", "web_search_preview_2025_03_11"],
          Field(description="Web search API version to use"),
      ] = "web_search_preview",
      search_context_size: Annotated[
          Literal["low", "medium", "high"],
          Field(description="Amount of context to include in search results"),
      ] = "medium",
      user_location: Annotated[
          Optional[UserLocation],
          Field(description="Optional user location for localized search results"),
      ] = None,
  ) -> str:
      # Read the default model from the environment; fall back to gpt-5-mini
      if model is None:
          model = os.getenv("OPENAI_DEFAULT_MODEL", "gpt-5-mini")

      client = OpenAI()

      # Models that support the reasoning parameter
      reasoning_models = ["gpt-5", "gpt-5-mini", "gpt-5-nano", "o3", "o4-mini"]

      # Build the request parameters
      request_params = {
          "model": model,
          "tools": [
              {
                  "type": type,
                  "search_context_size": search_context_size,
                  "user_location": user_location.model_dump() if user_location else None,
              }
          ],
          "input": input,
      }

      # Apply smart defaults for reasoning models
      if model in reasoning_models:
          if reasoning_effort is None:
              # gpt-5-mini defaults to low (quick search);
              # other reasoning models default to medium (deep research)
              if model == "gpt-5-mini":
                  reasoning_effort = "low"
              else:
                  reasoning_effort = "medium"
          request_params["reasoning"] = {"effort": reasoning_effort}

      response = client.responses.create(**request_params)
      return response.output_text
  ```
- src/openai_websearch_mcp/server.py:22-32 (registration)

  Tool registration via the `@mcp.tool` decorator with name `openai_web_search` and a detailed description; the input schema is defined through the `Annotated` parameters (`model`, `reasoning_effort`, `type`, `search_context_size`, and `user_location`).

  ```python
  @mcp.tool(
      name="openai_web_search",
      description="""OpenAI Web Search with reasoning models.
  For quick multi-round searches: Use 'gpt-5-mini' with reasoning_effort='low' for fast iterations.
  For deep research: Use 'gpt-5' with reasoning_effort='medium' or 'high'. The result is already multi-round reasoned, so agents don't need continuous iterations.
  Supports: gpt-4o (no reasoning), gpt-5/gpt-5-mini/gpt-5-nano, o3/o4-mini (with reasoning).""",
  )
  ```
- Pydantic model `UserLocation` used for the optional `user_location` parameter to provide localized search results. `city` is required; `country` and `region` are optional (annotated as `Optional[str]` here so the defaults type-check).

  ```python
  class UserLocation(BaseModel):
      type: Literal["approximate"] = "approximate"
      city: str
      country: Optional[str] = None
      region: Optional[str] = None
      timezone: TimeZoneName
  ```