
o3-search-mcp

by yoshiko-pg

An MCP server that brings the power of OpenAI's o3 model to your AI agents, enabling them to perform intelligent web searches with natural language queries.

Usage Examples

Once installed, your AI agent can use the o3-search tool to perform web searches. For instance, try giving instructions like this:

🐛 When debugging gets tough

Use "ask o3" to find solutions from GitHub issues and Stack Overflow:

  • "I'm getting a 'Module not found' error in Next.js 14. Ask o3 for recent solutions"
  • "Debug this WebSocket connection issue. Try asking o3 for help"

🧩 When tackling complex tasks

Add "consult o3 if you get stuck" to your requests:

  • "Implement a distributed caching system with Redis. If you encounter difficulties, consult o3"
  • "Create a real-time collaborative editor. Ask o3 for help if you get stuck"

📚 For latest library info and migration guides

Stay up-to-date with "ask o3":

  • "How do I migrate from React Router v5 to v6? Ask o3 for the latest migration guide"
  • "What's the current best practice for state management in React? Ask o3 for recent recommendations"

When you make a request to your coding agent, it can autonomously consult o3 by using the MCP interface to exchange natural language queries and responses. Your agent and o3 work together in real time to help you solve problems.
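
Under the hood this is an ordinary MCP tool call over stdio. As a rough sketch of what a client does, the following TypeScript uses the @modelcontextprotocol/sdk client to launch the server via npx (as in the installation below) and invoke its search tool. The tool name ("o3-search") and the argument key ("input") are assumptions here, so check the server's listed tools for the exact schema.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server the same way the npx-based config does.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["o3-search-mcp"],
    env: { OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "" },
  });

  const client = new Client({ name: "example-client", version: "0.1.0" });
  await client.connect(transport);

  // Discover the tool's actual name and input schema.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Assumed tool name and argument key; adjust to match the listed schema.
  const result = await client.callTool({
    name: "o3-search",
    arguments: { input: "Recent fixes for 'Module not found' errors in Next.js 14?" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);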

Installation

Using npx (Recommended)

Claude Code:

$ claude mcp add o3 -s user \
    -e OPENAI_API_KEY=your-api-key \
    -e SEARCH_CONTEXT_SIZE=medium \
    -e REASONING_EFFORT=medium \
    -e OPENAI_API_TIMEOUT=60000 \
    -e OPENAI_MAX_RETRIES=3 \
    -- npx o3-search-mcp

json:

{
  "mcpServers": {
    "o3-search": {
      "command": "npx",
      "args": ["o3-search-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        // Optional: low, medium, high (default: medium)
        "SEARCH_CONTEXT_SIZE": "medium",
        "REASONING_EFFORT": "medium",
        // Optional: API timeout in milliseconds (default: 60000)
        "OPENAI_API_TIMEOUT": "60000",
        // Optional: Maximum retry attempts (default: 3)
        "OPENAI_MAX_RETRIES": "3"
      }
    }
  }
}

Local Development Setup

If you want to download and run the code locally:

# setup
git clone git@github.com:yoshiko-pg/o3-search-mcp.git
cd o3-search-mcp
pnpm install
pnpm build

Claude Code:

$ claude mcp add o3 -s user \
    -e OPENAI_API_KEY=your-api-key \
    -e SEARCH_CONTEXT_SIZE=medium \
    -e REASONING_EFFORT=medium \
    -e OPENAI_API_TIMEOUT=60000 \
    -e OPENAI_MAX_RETRIES=3 \
    -- node /path/to/o3-search-mcp/build/index.js

json:

{
  "mcpServers": {
    "o3-search": {
      "command": "node",
      "args": ["/path/to/o3-search-mcp/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        // Optional: low, medium, high (default: medium)
        "SEARCH_CONTEXT_SIZE": "medium",
        "REASONING_EFFORT": "medium",
        // Optional: API timeout in milliseconds (default: 60000)
        "OPENAI_API_TIMEOUT": "60000",
        // Optional: Maximum retry attempts (default: 3)
        "OPENAI_MAX_RETRIES": "3"
      }
    }
  }
}
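
Before wiring the local build into an agent, it can be useful to confirm that it starts and registers its tool. Below is a minimal smoke test, assuming the @modelcontextprotocol/sdk client package is available; the build path is the same placeholder used in the config above.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the locally built server exactly as the config above does.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/o3-search-mcp/build/index.js"],
    env: { OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "" },
  });

  const client = new Client({ name: "smoke-test", version: "0.1.0" });
  await client.connect(transport);

  // If the build is healthy, the search tool should show up here.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => `${t.name}: ${t.description ?? ""}`));

  await client.close();
}

main().catch(console.error);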

Configuration

Environment Variables

  • OPENAI_API_KEY (required): Your OpenAI API key
  • SEARCH_CONTEXT_SIZE (optional): Controls the search context size
    • Values: low, medium, high
    • Default: medium
  • REASONING_EFFORT (optional): Controls the reasoning effort level
    • Values: low, medium, high
    • Default: medium
  • OPENAI_API_TIMEOUT (optional): API request timeout in milliseconds
    • Default: 60000 (60 seconds)
    • Example: 120000 for 2 minutes
  • OPENAI_MAX_RETRIES (optional): Maximum number of retry attempts for failed requests
    • Default: 3
    • The SDK automatically retries on rate limits (429), server errors (5xx), and connection errors
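
To make these knobs concrete, here is an illustrative sketch (not the server's actual code) of how the variables would typically map onto the OpenAI Node SDK: the timeout and retry settings are client options, while reasoning effort and web search context size are per-request parameters of the Responses API.

import OpenAI from "openai";

async function main() {
  // Client-level options: OPENAI_API_TIMEOUT and OPENAI_MAX_RETRIES.
  const openai = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY,
    timeout: Number(process.env.OPENAI_API_TIMEOUT ?? 60000),
    maxRetries: Number(process.env.OPENAI_MAX_RETRIES ?? 3),
  });

  // Request-level options: REASONING_EFFORT and SEARCH_CONTEXT_SIZE.
  const response = await openai.responses.create({
    model: "o3",
    input: "What's the current best practice for state management in React?",
    reasoning: {
      effort: (process.env.REASONING_EFFORT ?? "medium") as "low" | "medium" | "high",
    },
    tools: [
      {
        type: "web_search_preview",
        search_context_size:
          (process.env.SEARCH_CONTEXT_SIZE ?? "medium") as "low" | "medium" | "high",
      },
    ],
  });

  console.log(response.output_text);
}

main().catch(console.error);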

Remote-capable server

The server can be hosted and run remotely because it primarily relies on remote services or has no dependency on the local environment.

Related MCP Servers

  • An MCP (Model Context Protocol) server that provides Google search capabilities and webpage content analysis tools. This server enables AI models to perform Google searches and analyze webpage content programmatically. (TypeScript)
  • An MCP server that integrates real-time web search capabilities into AI assistants using the Exa API, providing both basic and advanced search functionality with formatted markdown results. (Python)
  • An MCP server that enables AI models to search the web using OpenAI's 4o-mini Search model, allowing access to up-to-date information for just a few cents per search. (JavaScript)
  • An MCP server that allows users to efficiently search and reference user-configured documents through document listing, grep searching, semantic searching with OpenAI Embeddings, and full document retrieval. (Python, MIT License)

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/yoshiko-pg/o3-search-mcp'
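
The same endpoint can be queried from code. Here is a small sketch using the built-in fetch in Node 18+; the response shape is not documented here, so it is simply printed.

async function main() {
  const res = await fetch(
    "https://glama.ai/api/mcp/v1/servers/yoshiko-pg/o3-search-mcp",
  );
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  console.log(JSON.stringify(await res.json(), null, 2));
}

main().catch(console.error);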

If you have feedback or need assistance with the MCP directory API, please join our Discord server.