o3-search-mcp (gpt-5, o4-mini support)

MCP server that enables the use of OpenAI's high-end models and their powerful web search capabilities.
By registering it with any AI coding agent, the agent can autonomously consult with OpenAI models to solve complex problems.
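
As a rough illustration of what registration gives the agent, the sketch below uses the official MCP TypeScript SDK to spawn the server over stdio, list the tools it exposes, and call one. This is a hedged sketch, not code from this repository: the client name, the tool name "o3-search", and the "input" argument are assumptions for illustration, so confirm the real names against the listTools() output.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, the same way an agent does after registration.
// PATH is forwarded so npx can be found; OPENAI_API_KEY must be set in your shell.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["o3-search-mcp"],
  env: {
    PATH: process.env.PATH ?? "",
    OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "",
  },
});

const client = new Client({ name: "demo-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Discover the tool(s) the server exposes; this is how the agent knows it can "ask o3".
console.log(await client.listTools());

// Hypothetical call: the tool name "o3-search" and the "input" key are assumptions;
// check them against the listTools() output above before relying on them.
const result = await client.callTool({
  name: "o3-search",
  arguments: { input: "Why does my WebSocket handshake fail behind nginx?" },
});
console.log(result.content);

await client.close();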

[MseeP.ai Security Assessment Badge]
Use Cases

(The tool is named o3 to match the MCP server's name, but you can switch the underlying model to gpt-5 or o4-mini via the OPENAI_MODEL environment variable.)

🐛 When you're stuck debugging

o3's web search can scan a wide range of sources, including GitHub issues and Stack Overflow, significantly increasing the chances of resolving niche problems. Example prompts:

> I'm getting the following error on startup, please fix it. If it's too difficult, ask o3.
> [Paste error message here]
> The WebSocket connection isn't working. Please debug it. If you don't know how, ask o3.
📚 When you want to reference the latest library information

You can get answers from the powerful web search even when there's no well-organized documentation. Example prompts:

> I want to upgrade this library to v2. Proceed while consulting with o3.
> I was told this option for this library doesn't exist. It might have been removed. Ask o3 what to specify instead and replace it.
🧩 When tackling complex tasks

In addition to search, you can also use it as a sounding board for design. Example prompts:

> I want to create a collaborative editor, so please design it. Also, ask o3 for a design review and discuss if necessary.

Also, since it's provided as an MCP server, the AI agent may decide on its own to talk to o3 when it deems it necessary, without any instructions from you. This will dramatically expand the range of problems it can solve on its own!

Installation
npx (Recommended)

Claude Code:

# If you omit the -s user line, it will be installed in the project scope
# OPENAI_MODEL: o4-mini and gpt-5 are also available
$ claude mcp add o3 \
  -s user \
  -e OPENAI_MODEL=o3 \
  -e OPENAI_API_KEY=your-api-key \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -e OPENAI_API_TIMEOUT=300000 \
  -e OPENAI_MAX_RETRIES=3 \
  -- npx o3-search-mcp

JSON configuration:

{
  "mcpServers": {
    "o3-search": {
      "command": "npx",
      "args": ["o3-search-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        // Optional: o3, o4-mini, gpt-5 (default: o3)
        "OPENAI_MODEL": "o3",
        // Optional: low, medium, high (default: medium)
        "SEARCH_CONTEXT_SIZE": "medium",
        "REASONING_EFFORT": "medium",
        // Optional: API timeout in milliseconds (default: 300000)
        "OPENAI_API_TIMEOUT": "300000",
        // Optional: Maximum number of retries (default: 3)
        "OPENAI_MAX_RETRIES": "3"
      }
    }
  }
}
Local Setup

If you want to download the code and run it locally:

git clone git@github.com:yoshiko-pg/o3-search-mcp.git
cd o3-search-mcp
pnpm install
pnpm build

Claude Code:

# If you omit the -s user line, it will be installed in the project scope
# OPENAI_MODEL: o4-mini and gpt-5 are also available
$ claude mcp add o3 \
  -s user \
  -e OPENAI_API_KEY=your-api-key \
  -e OPENAI_MODEL=o3 \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -e OPENAI_API_TIMEOUT=300000 \
  -e OPENAI_MAX_RETRIES=3 \
  -- node /path/to/o3-search-mcp/build/index.js

JSON configuration:

{
  "mcpServers": {
    "o3-search": {
      "command": "node",
      "args": ["/path/to/o3-search-mcp/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        // Optional: o3, o4-mini, gpt-5 (default: o3)
        "OPENAI_MODEL": "o3",
        // Optional: low, medium, high (default: medium)
        "SEARCH_CONTEXT_SIZE": "medium",
        "REASONING_EFFORT": "medium",
        // Optional: API timeout in milliseconds (default: 300000)
        "OPENAI_API_TIMEOUT": "300000",
        // Optional: Maximum number of retries (default: 3)
        "OPENAI_MAX_RETRIES": "3"
      }
    }
  }
}
Environment Variables
| Environment Variable | Required | Default | Description |
| --- | --- | --- | --- |
| OPENAI_API_KEY | Required | - | OpenAI API key |
| OPENAI_MODEL | Optional | o3 | Model to use. Values: o3, o4-mini, gpt-5 |
| SEARCH_CONTEXT_SIZE | Optional | medium | Controls the search context size. Values: low, medium, high |
| REASONING_EFFORT | Optional | medium | Controls the reasoning effort level. Values: low, medium, high |
| OPENAI_API_TIMEOUT | Optional | 300000 | API request timeout in milliseconds. Example: 300000 for 5 minutes |
| OPENAI_MAX_RETRIES | Optional | 3 | Maximum number of retries for failed requests. The SDK automatically retries on rate limits (429), server errors (5xx), and connection errors |
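
As a rough illustration of what these settings control, the sketch below shows how such variables typically map onto the OpenAI Node SDK and the Responses API's built-in web search tool. It is a plausible sketch with defaults mirroring the table above, not this server's actual source code.

import OpenAI from "openai";

// Illustrative only: shows what each variable plausibly controls when calling
// the OpenAI Responses API with web search. Not the actual server source.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  timeout: Number(process.env.OPENAI_API_TIMEOUT ?? 300000), // per-request timeout in ms
  maxRetries: Number(process.env.OPENAI_MAX_RETRIES ?? 3),   // retried on 429, 5xx, connection errors
});

const response = await openai.responses.create({
  model: process.env.OPENAI_MODEL ?? "o3",
  reasoning: {
    effort: (process.env.REASONING_EFFORT ?? "medium") as "low" | "medium" | "high",
  },
  tools: [
    {
      type: "web_search_preview",
      search_context_size: (process.env.SEARCH_CONTEXT_SIZE ?? "medium") as
        | "low"
        | "medium"
        | "high",
    },
  ],
  input: "Your question here",
});

console.log(response.output_text);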
Notes

To use the o3 model via the OpenAI API, you need to either raise your usage tier to 4 or verify your organization.
If you register an API key that is not yet enabled for o3, calls made through this MCP server will fail with an error.
Reference: https://help.openai.com/en/articles/10362446-api-access-to-o1-o3-and-o4-models
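
If you are unsure whether your key already has o3 access, a quick standalone check like the sketch below (calling the OpenAI Node SDK directly, independent of this MCP server) will surface the access error before you register the server with your agent.

import OpenAI from "openai";

// Standalone access check: an API key whose organization is not verified
// (or is below the required tier) will throw an API error here.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

try {
  const res = await openai.responses.create({ model: "o3", input: "ping" });
  console.log("o3 is available:", res.output_text);
} catch (err) {
  console.error("o3 is not available for this API key:", err);
}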