OpenAI o3 Search

by yoshiko-pg
Leverage OpenAI o3 Search for advanced web results, combining OpenAI model reasoning with live web search to surface answers that conventional engines miss.
Provides web search capabilities using OpenAI's o3 model with configurable reasoning effort levels for finding current information and solving complex problems beyond traditional search engines.
best for
- Developers stuck on niche debugging problems
- Teams upgrading libraries or working with poorly documented APIs
- AI agents needing advanced reasoning for complex tasks
- Projects requiring current web information beyond static knowledge
capabilities
- Search the web using OpenAI's advanced reasoning models
- Scan GitHub issues and Stack Overflow for debugging help
- Query the latest library documentation and updates
- Configure reasoning effort levels for tasks of different complexity
- Access current information through model-powered web search
what it does
Enables AI agents to consult OpenAI's advanced models (o3, gpt-5, o4-mini) with web search capabilities for solving complex problems beyond traditional search engines.
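As an MCP server, it exposes this capability as a tool that clients invoke over JSON-RPC. A hypothetical `tools/call` request might look like the following; the tool name (`o3-search`) and argument key (`input`) are illustrative assumptions, so check the server's tool listing for the actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "o3-search",
    "arguments": {
      "input": "Why does my WebSocket handshake fail with a 403 behind nginx?"
    }
  }
}
```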
about
OpenAI o3 Search is a community-built MCP server published by yoshiko-pg that provides AI assistants with tools and capabilities via the Model Context Protocol. It pairs OpenAI model reasoning with live web search to return results beyond what conventional search engines surface. It is categorized under Search and Web.
how to install
You can install OpenAI o3 Search in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
license
MIT
OpenAI o3 Search is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
readme
o3-search-mcp (gpt-5, o4-mini support)
<div align="center"> <p>English | <a href="./README.ja.md">日本語</a> | <a href="./README.zh.md">简体中文</a> | <a href="./README.ko.md">한국어</a></p> </div>

MCP server that enables the use of OpenAI's high-end models and their powerful web search capabilities. By registering it with any AI coding agent, the agent can autonomously consult OpenAI models to solve complex problems.
<table> <tr> <td width="50%"> <a href="https://mseep.ai/app/yoshiko-pg-o3-search-mcp"> <img src="https://mseep.net/pr/yoshiko-pg-o3-search-mcp-badge.png" alt="MseeP.ai Security Assessment Badge" /> </a> </td> <td width="50%"> <a href="https://glama.ai/mcp/servers/@yoshiko-pg/o3-search-mcp"> <img src="https://glama.ai/mcp/servers/@yoshiko-pg/o3-search-mcp/badge" alt="o3-search MCP server" /> </a> </td> </tr> </table>

Use Cases
(Although called o3 to match the MCP name, you can specify gpt-5 or o4-mini as the model to use via the OPENAI_MODEL environment variable.)
🐛 When you're stuck debugging
o3's web search can scan a wide range of sources, including GitHub issues and Stack Overflow, significantly increasing the chances of resolving niche problems. Example prompts:
> I'm getting the following error on startup, please fix it. If it's too difficult, ask o3.
> [Paste error message here]
> The WebSocket connection isn't working. Please debug it. If you don't know how, ask o3.
📚 When you want to reference the latest library information
You can get answers from the powerful web search even when there's no well-organized documentation. Example prompts:
> I want to upgrade this library to v2. Proceed while consulting with o3.
> I was told this option for this library doesn't exist. It might have been removed. Ask o3 what to specify instead and replace it.
🧩 When tackling complex tasks
In addition to search, you can also use it as a sounding board for design. Example prompts:
> I want to create a collaborative editor, so please design it. Also, ask o3 for a design review and discuss if necessary.
Also, since it's provided as an MCP server, the AI agent may decide to consult o3 on its own when it deems it necessary, without any instruction from you. This dramatically expands the range of problems the agent can solve unassisted!
Installation
npx (Recommended)
Claude Code:
# -s user: omit this flag to install in the project scope instead
# OPENAI_MODEL: o4-mini and gpt-5 are also available
$ claude mcp add o3 \
  -s user \
  -e OPENAI_MODEL=o3 \
  -e OPENAI_API_KEY=your-api-key \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -e OPENAI_API_TIMEOUT=300000 \
  -e OPENAI_MAX_RETRIES=3 \
  -- npx o3-search-mcp
json:
{
"mcpServers": {
"o3-search": {
"command": "npx",
"args": ["o3-search-mcp"],
"env": {
"OPENAI_API_KEY": "your-api-key",
// Optional: o3, o4-mini, gpt-5 (default: o3)
"OPENAI_MODEL": "o3",
// Optional: low, medium, high (default: medium)
"SEARCH_CONTEXT_SIZE": "medium",
"REASONING_EFFORT": "medium",
// Optional: API timeout in milliseconds (default: 300000)
"OPENAI_API_TIMEOUT": "300000",
// Optional: Maximum number of retries (default: 3)
"OPENAI_MAX_RETRIES": "3"
}
}
}
}
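Note that the `//` comments above are annotations only: most MCP clients, including Claude Desktop, parse this config as strict JSON, which rejects comments. A comment-free version you can paste directly:

```json
{
  "mcpServers": {
    "o3-search": {
      "command": "npx",
      "args": ["o3-search-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        "OPENAI_MODEL": "o3",
        "SEARCH_CONTEXT_SIZE": "medium",
        "REASONING_EFFORT": "medium",
        "OPENAI_API_TIMEOUT": "300000",
        "OPENAI_MAX_RETRIES": "3"
      }
    }
  }
}
```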
Local Setup
If you want to download the code and run it locally:
git clone git@github.com:yoshiko-pg/o3-search-mcp.git
cd o3-search-mcp
pnpm install
pnpm build
Claude Code:
# -s user: omit this flag to install in the project scope instead
# OPENAI_MODEL: o4-mini and gpt-5 are also available
$ claude mcp add o3 \
  -s user \
  -e OPENAI_MODEL=o3 \
  -e OPENAI_API_KEY=your-api-key \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -e OPENAI_API_TIMEOUT=300000 \
  -e OPENAI_MAX_RETRIES=3 \
  -- node /path/to/o3-search-mcp/build/index.js
json:
{
"mcpServers": {
"o3-search": {
"command": "node",
"args": ["/path/to/o3-search-mcp/build/index.js"],
"env": {
"OPENAI_API_KEY": "your-api-key",
// Optional: o3, o4-mini, gpt-5 (default: o3)
"OPENAI_MODEL": "o3",
// Optional: low, medium, high (default: medium)
"SEARCH_CONTEXT_SIZE": "medium",
"REASONING_EFFORT": "medium",
// Optional: API timeout in milliseconds (default: 300000)
"OPENAI_API_TIMEOUT": "300000",
// Optional: Maximum number of retries (default: 3)
"OPENAI_MAX_RETRIES": "3"
}
}
}
}
Environment Variables
| Environment Variable | Options | Default | Description |
|---|---|---|---|
| OPENAI_API_KEY | Required | - | OpenAI API key |
| OPENAI_MODEL | Optional | o3 | Model to use<br>Values: o3, o4-mini, gpt-5 |
| SEARCH_CONTEXT_SIZE | Optional | medium | Controls the search context size<br>Values: low, medium, high |
| REASONING_EFFORT | Optional | medium | Controls the reasoning effort level<br>Values: low, medium, high |
| OPENAI_API_TIMEOUT | Optional | 300000 | API request timeout in milliseconds<br>Example: 300000 for 5 minutes |
| OPENAI_MAX_RETRIES | Optional | 3 | Maximum number of retries for failed requests<br>The SDK automatically retries on rate limits (429), server errors (5xx), and connection errors |
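The table above can be read as a small resolution routine. The sketch below is illustrative, not the server's actual source: it shows how the variables and defaults listed might be resolved, with `resolveConfig` being a hypothetical helper name.

```typescript
type Level = "low" | "medium" | "high";

type Config = {
  apiKey: string;
  model: string;
  searchContextSize: Level;
  reasoningEffort: Level;
  timeoutMs: number;
  maxRetries: number;
};

// Resolve configuration from an env-style record, applying the
// defaults from the table; only OPENAI_API_KEY is required.
function resolveConfig(env: Record<string, string | undefined>): Config {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) throw new Error("OPENAI_API_KEY is required");

  // Unrecognized or missing level values fall back to "medium".
  const level = (v: string | undefined): Level =>
    v === "low" || v === "high" ? v : "medium";

  return {
    apiKey,
    model: env.OPENAI_MODEL ?? "o3",
    searchContextSize: level(env.SEARCH_CONTEXT_SIZE),
    reasoningEffort: level(env.REASONING_EFFORT),
    timeoutMs: Number(env.OPENAI_API_TIMEOUT ?? "300000"),
    maxRetries: Number(env.OPENAI_MAX_RETRIES ?? "3"),
  };
}
```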
Notes
To use the o3 model via the OpenAI API, you need to either raise your usage tier to 4 or verify your organization. If you register an API key that is not yet enabled for o3, calls through this MCP server will fail with an error. Reference: https://help.openai.com/en/articles/10362446-api-access-to-o1-o3-and-o4-models
FAQ
- What is the OpenAI o3 Search MCP server?
- OpenAI o3 Search is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
- How do MCP servers relate to agent skills?
- Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
- How are reviews shown for OpenAI o3 Search?
- This profile displays 29 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.6 out of 5—verify behavior in your own environment before production use.