o3-mcp v0.2.1

MCP server for OpenAI o3 model integration
OpenAI o3 MCP Server

A Model Context Protocol (MCP) server that provides seamless access to OpenAI's o3, o3-mini, and o3-pro reasoning models. This server enables Claude Desktop, Claude Code, and other MCP clients to interact with OpenAI's latest reasoning models through a standardized interface.

Features

  • 🚀 Triple Model Support: Access o3, o3-mini, and o3-pro models
  • 💬 Flexible Interfaces: Simple prompt-based and multi-turn chat capabilities
  • 🔧 Easy Integration: Works with Claude Desktop, Claude Code, and any MCP-compatible client
  • ⚡ Optimized Performance: Efficient handling of API calls with proper error management
  • 🛠️ Developer Friendly: Comprehensive logging and debugging support

Prerequisites

  • Node.js v18 or higher
  • npm (Node Package Manager)
  • OpenAI API key with access to o3 models
  • Claude Desktop, Claude Code, or another MCP client

Installation

Option 1: NPX (Recommended - No Installation Required)

The easiest way to use the o3 MCP server is with npx:

npx o3-mcp

This will automatically download and run the latest version without any installation. You'll need to set your OpenAI API key as an environment variable:

export OPENAI_API_KEY="your-openai-api-key-here"
npx o3-mcp
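Before launching the server, you can confirm the key is actually visible to child processes with a quick check (an illustrative one-liner, using only Node's built-in `process.env`):

```shell
# Prints "API key found" if OPENAI_API_KEY is exported in this shell
node -e 'console.log(process.env.OPENAI_API_KEY ? "API key found" : "API key missing")'
```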

Option 2: Global Installation

npm install -g o3-mcp
export OPENAI_API_KEY="your-openai-api-key-here"
o3-mcp

Option 3: Local Development Setup

  1. Clone the repository:

     git clone https://github.com/GitMaxd/o3-mcp.git
     cd o3-mcp

  2. Install dependencies:

     npm install

  3. Configure your OpenAI API key:

     # Create a .env file (replace the placeholder with your actual OpenAI API key)
     echo "OPENAI_API_KEY=your-openai-api-key-here" > .env

Configuration

Claude Desktop

Add the following to your Claude Desktop configuration file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
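A malformed config file is a common cause of silent connection failures. After editing, you can sanity-check that the file parses as JSON (macOS path shown; substitute the path for your OS from the list above):

```shell
# Fails loudly on a JSON syntax error instead of Claude silently ignoring the server
node -e 'JSON.parse(require("fs").readFileSync(process.argv[1], "utf8")); console.log("config OK")' \
  "$HOME/Library/Application Support/Claude/claude_desktop_config.json"
```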

Option 1: Using NPX (Recommended)

{
  "mcpServers": {
    "o3": {
      "command": "npx",
      "args": ["o3-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here"
      }
    }
  }
}

Option 2: Using Local Installation

{
  "mcpServers": {
    "o3": {
      "command": "node",
      "args": ["/absolute/path/to/o3-mcp/index.js"],
      "cwd": "/absolute/path/to/o3-mcp"
    }
  }
}

Important: The cwd parameter is required when using a .env file. It tells the server where to find the .env file.

Note: You may need to use the full path to Node.js if it's not in your system PATH:

{
  "mcpServers": {
    "o3": {
      "command": "/Users/username/.nvm/versions/node/v22.14.0/bin/node",
      "args": ["/Users/username/o3-mcp/index.js"],
      "cwd": "/Users/username/o3-mcp"
    }
  }
}

API Key Configuration Options

Option 1: Using .env file (Recommended for development)

  • Requires cwd parameter in configuration
  • Keep your API key in .env file
  • Easier to update without modifying config

Option 2: Direct configuration (Recommended for simplicity)

{
  "mcpServers": {
    "o3": {
      "command": "node",
      "args": ["/absolute/path/to/o3-mcp/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

With this approach, you don't need the cwd parameter or .env file.

Claude Code

Claude Code uses the same configuration file as Claude Desktop. See the Claude Desktop section above.

Other MCP Clients

For Cline, Continue, or other MCP-compatible clients, refer to their documentation for adding MCP servers. The command remains the same:

node /absolute/path/to/o3-mcp/index.js

Available Tools

1. o3_prompt

Simple, single-turn interaction with o3 models.

Parameters:

  • prompt (required): Your question or prompt
  • model (optional): "o3", "o3-mini", or "o3-pro" (default: "o3")
  • developer_message (optional): System context (default: "You are a helpful assistant.")
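As a sketch, the arguments object an MCP client would send for this tool might look like the following (the prompt and developer message text are illustrative; the parameter names come from the list above):

```json
{
  "name": "o3_prompt",
  "arguments": {
    "prompt": "Explain tail-call optimization in two sentences.",
    "model": "o3-mini",
    "developer_message": "You are a concise technical tutor."
  }
}
```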

2. o3_chat

Multi-turn conversations with context preservation.

Parameters:

  • messages (required): Array of message objects with role and content
  • model (optional): "o3", "o3-mini", or "o3-pro" (default: "o3")
  • max_tokens (optional): Maximum response tokens (default: 1000)
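A sketch of the payload for a multi-turn review request (the content strings are illustrative; note the leading system message, which the Troubleshooting section also recommends):

```json
{
  "name": "o3_chat",
  "arguments": {
    "model": "o3",
    "max_tokens": 500,
    "messages": [
      { "role": "system", "content": "You are a senior code reviewer." },
      { "role": "user", "content": "Review this for edge cases: function div(a, b) { return a / b; }" },
      { "role": "assistant", "content": "Dividing by zero returns Infinity; consider validating b." },
      { "role": "user", "content": "Show a validated version." }
    ]
  }
}
```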

Usage Examples

Basic Prompts

Get a quick fact:

Use o3 to tell me an interesting fact about black holes.

Code explanation:

Ask o3 to explain how async/await works in JavaScript.

Advanced analysis with o3-pro:

Ask o3-pro to analyze the computational complexity of quantum algorithms for factoring large numbers.

Advanced Code Collaboration

Algorithm Analysis:

I have this sorting algorithm. Can you ask o3 to analyze its time complexity and suggest optimizations?

[paste your code]

Architecture Review:

Create a dialogue with o3 about the pros and cons of microservices vs monolithic architecture for a startup.

Creative Use Cases

1. Pair Programming with o3

Let's have o3 help us implement a binary search tree. First, ask it for the basic structure, then we'll iterate on adding balancing logic.

2. Code Review Assistant

I'm going to show you my React component. Can you have o3 review it for performance issues and best practices?

3. Learning Complex Concepts

Create a multi-turn conversation with o3 where it explains quantum computing concepts using programming analogies.

4. Debugging Partner

My recursive function is causing a stack overflow. Let's debug it together with o3's help.

5. API Design Consultation

I need to design a REST API for a social media app. Can you discuss with o3 about best practices for endpoint design, authentication, and rate limiting?

Multi-Turn Conversation Example

Create a dialogue with o3 about implementing a real-time collaborative editor:
1. First, ask about the architecture
2. Then dive into conflict resolution strategies
3. Finally, discuss scaling considerations

Advanced Features

Model Selection

  • o3: Best for complex reasoning, code generation, and detailed analysis
  • o3-mini: 93% more cost-effective, great for simpler tasks and rapid iteration
  • o3-pro: Most advanced model with enhanced capabilities for the most demanding tasks

Error Handling

The server includes comprehensive error handling:

  • Invalid API responses are caught and reported
  • Network errors are gracefully handled
  • Detailed logging helps with debugging

Troubleshooting

Common Issues

  1. "MCP Server Failed to Connect"

    • Missing cwd parameter: If using a .env file, you MUST include "cwd": "/absolute/path/to/o3-mcp"
    • Wrong Node.js path: Use full path if node isn't in PATH
    • Solution: Use Option 2 configuration with API key in env object for simplicity
  2. "API Key Not Found"

    • The server can't find your .env file because it's running from the wrong directory
    • Fix: Add cwd parameter or include API key directly in config
  3. "No output from o3_chat"

    • Ensure you include a system message
    • Check message formatting (proper JSON structure)
  4. "Model not found"

    • Confirm you have access to o3/o3-mini models in your OpenAI account
    • Check for typos in model names

Debug Mode

The server logs detailed information to stderr. To see logs when running standalone:

node index.js 2> debug.log
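To watch the log stream live while the server runs (assumes you are in the repository root; `tail -f` follows the file as new lines arrive):

```shell
# Start the server with stderr captured, then follow the log in real time
node index.js 2> debug.log &
tail -f debug.log
```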

Performance Tips

  1. Use o3-mini for rapid prototyping and testing
  2. Batch related questions into multi-turn conversations
  3. Be specific in your prompts for better responses
  4. Set appropriate max_tokens limits to control costs

Security Considerations

  • Never commit your .env file or API keys
  • Use environment variables for all sensitive data
  • Consider implementing rate limiting for production use
  • Regularly rotate API keys
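One concrete guard for the first point: make sure .env is ignored by git. The snippet below is idempotent, so it is safe to run more than once in the repo root:

```shell
# Append ".env" to .gitignore only if an exact ".env" line is not already there
grep -qxF ".env" .gitignore 2>/dev/null || echo ".env" >> .gitignore
```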

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • OpenAI for the o3 and o3-mini models
  • Anthropic for the MCP protocol specification
  • The MCP community for tooling and support

Created by @GitMaxd