o3-mcp
v0.2.1
MCP server for OpenAI o3 model integration
OpenAI o3 MCP Server
A Model Context Protocol (MCP) server that provides seamless access to OpenAI's o3, o3-mini, and o3-pro reasoning models. This server enables Claude Desktop, Claude Code, and other MCP clients to interact with OpenAI's latest reasoning models through a standardized interface.
Features
- 🚀 Triple Model Support: Access o3, o3-mini, and o3-pro models
- 💬 Flexible Interfaces: Simple prompt-based and multi-turn chat capabilities
- 🔧 Easy Integration: Works with Claude Desktop, Claude Code, and any MCP-compatible client
- ⚡ Optimized Performance: Efficient handling of API calls with proper error management
- 🛠️ Developer Friendly: Comprehensive logging and debugging support
Prerequisites
- Node.js v18 or higher
- npm (Node Package Manager)
- OpenAI API key with access to o3 models
- Claude Desktop, Claude Code, or another MCP client
Installation
Option 1: NPX (Recommended - No Installation Required)
The easiest way to use the o3 MCP server is with npx:
npx o3-mcp
This will automatically download and run the latest version without any installation. You'll need to set your OpenAI API key as an environment variable:
export OPENAI_API_KEY="your-openai-api-key-here"
npx o3-mcp
Option 2: Global Installation
npm install -g o3-mcp
export OPENAI_API_KEY="your-openai-api-key-here"
o3-mcp
Option 3: Local Development Setup
- Clone the repository:
git clone https://github.com/GitMaxd/o3-mcp.git
cd o3-mcp
- Install dependencies:
npm install
- Configure your OpenAI API key:
# Create a .env file
echo "OPENAI_API_KEY=your-openai-api-key-here" > .env
# Edit .env and add your actual OpenAI API key
Configuration
Claude Desktop
Add the following to your Claude Desktop configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
Option 1: Using NPX (Recommended)
{
"mcpServers": {
"o3": {
"command": "npx",
"args": ["o3-mcp"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key-here"
}
}
}
}
Option 2: Using Local Installation
{
"mcpServers": {
"o3": {
"command": "node",
"args": ["/absolute/path/to/o3-mcp/index.js"],
"cwd": "/absolute/path/to/o3-mcp"
}
}
}
Important: The cwd parameter is required when using a .env file. It tells the server where to find the .env file.
Note: You may need to use the full path to Node.js if it's not in your system PATH:
{
"mcpServers": {
"o3": {
"command": "/Users/username/.nvm/versions/node/v22.14.0/bin/node",
"args": ["/Users/username/o3-mcp/index.js"],
"cwd": "/Users/username/o3-mcp"
}
}
}
API Key Configuration Options
Option 1: Using .env file (Recommended for development)
- Requires the cwd parameter in the configuration
- Keeps your API key in the .env file
- Easier to update without modifying the config
Option 2: Direct configuration (Recommended for simplicity)
{
"mcpServers": {
"o3": {
"command": "node",
"args": ["/absolute/path/to/o3-mcp/index.js"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
}
}
With this approach, you don't need the cwd parameter or a .env file.
Claude Code
Claude Code uses the same configuration file as Claude Desktop. See the Claude Desktop section above.
Other MCP Clients
For Cline, Continue, or other MCP-compatible clients, refer to their documentation for adding MCP servers. The command remains the same:
node /absolute/path/to/o3-mcp/index.js
Available Tools
1. o3_prompt
Simple, single-turn interaction with o3 models.
Parameters:
- prompt (required): Your question or prompt
- model (optional): "o3", "o3-mini", or "o3-pro" (default: "o3")
- developer_message (optional): System context (default: "You are a helpful assistant.")
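As a sketch, the arguments an MCP client would pass to o3_prompt might look like this. The field names come from the parameter list above; the surrounding call mechanics (how the arguments are delivered over MCP) depend on your client.

```javascript
// Hypothetical o3_prompt arguments, mirroring the parameters above.
const o3PromptArgs = {
  prompt: "Explain how async/await works in JavaScript.", // required
  model: "o3-mini",                                       // optional, defaults to "o3"
  developer_message: "You are a concise coding tutor."    // optional system context
};

console.log(JSON.stringify(o3PromptArgs, null, 2));
```

Omitting model and developer_message is fine; the server falls back to the defaults listed above.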
2. o3_chat
Multi-turn conversations with context preservation.
Parameters:
- messages (required): Array of message objects with role and content
- model (optional): "o3", "o3-mini", or "o3-pro" (default: "o3")
- max_tokens (optional): Maximum response tokens (default: 1000)
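For illustration, a short multi-turn o3_chat payload could look like the following (a hypothetical example shaped by the parameter list above, not output from the server itself):

```javascript
// Hypothetical o3_chat arguments: a short multi-turn exchange where
// earlier messages give the model context for the final question.
const o3ChatArgs = {
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is a binary search tree?" },
    { role: "assistant", content: "A tree where each node's left subtree holds smaller keys and its right subtree larger ones." },
    { role: "user", content: "How do I keep it balanced?" }
  ],
  model: "o3",     // optional, defaults to "o3"
  max_tokens: 500  // optional, defaults to 1000
};

console.log(`${o3ChatArgs.messages.length} messages queued`);
```

Including the prior assistant turn is what preserves context across the conversation.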
Usage Examples
Basic Prompts
Get a quick fact:
Use o3 to tell me an interesting fact about black holes.
Code explanation:
Ask o3 to explain how async/await works in JavaScript.
Advanced analysis with o3-pro:
Ask o3-pro to analyze the computational complexity of quantum algorithms for factoring large numbers.
Advanced Code Collaboration
Algorithm Analysis:
I have this sorting algorithm. Can you ask o3 to analyze its time complexity and suggest optimizations?
[paste your code]
Architecture Review:
Create a dialogue with o3 about the pros and cons of microservices vs monolithic architecture for a startup.
Creative Use Cases
1. Pair Programming with o3
Let's have o3 help us implement a binary search tree. First, ask it for the basic structure, then we'll iterate on adding balancing logic.
2. Code Review Assistant
I'm going to show you my React component. Can you have o3 review it for performance issues and best practices?
3. Learning Complex Concepts
Create a multi-turn conversation with o3 where it explains quantum computing concepts using programming analogies.
4. Debugging Partner
My recursive function is causing a stack overflow. Let's debug it together with o3's help.
5. API Design Consultation
I need to design a REST API for a social media app. Can you discuss with o3 best practices for endpoint design, authentication, and rate limiting?
Multi-Turn Conversation Example
Create a dialogue with o3 about implementing a real-time collaborative editor:
1. First, ask about the architecture
2. Then dive into conflict resolution strategies
3. Finally, discuss scaling considerations
Advanced Features
Model Selection
- o3: Best for complex reasoning, code generation, and detailed analysis
- o3-mini: 93% more cost-effective, great for simpler tasks and rapid iteration
- o3-pro: Most advanced model with enhanced capabilities for the most demanding tasks
Error Handling
The server includes comprehensive error handling:
- Invalid API responses are caught and reported
- Network errors are gracefully handled
- Detailed logging helps with debugging
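The pattern described above can be sketched as follows. This is a minimal illustration, not the server's actual code; callOpenAI is a hypothetical stand-in for the real API request.

```javascript
// Minimal sketch of the error-handling pattern: catch failures from the
// API call and return a structured result instead of crashing the server.
// callOpenAI is a hypothetical stand-in for the real OpenAI request.
async function safePrompt(callOpenAI, prompt) {
  try {
    const text = await callOpenAI(prompt);
    return { isError: false, content: text };
  } catch (err) {
    // Log details to stderr (where the server writes its logs) and report.
    console.error(`o3-mcp error: ${err.message}`);
    return { isError: true, content: `Request failed: ${err.message}` };
  }
}

// Example with a stub that always fails, simulating a network error.
const failing = async () => { throw new Error("ECONNRESET"); };
safePrompt(failing, "hello").then(r => console.log(r.isError)); // prints true
```

Returning an error result rather than throwing lets the MCP client surface the failure to the user gracefully.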
Troubleshooting
Common Issues
"MCP Server Failed to Connect"
- Missing cwd parameter: If using a .env file, you MUST include "cwd": "/path/to/o3-mcp"
- Wrong Node.js path: Use the full path if node isn't in your PATH
- Solution: Use the Option 2 configuration with the API key in the env object for simplicity
"API Key Not Found"
- The server can't find your .env file because it's running from the wrong directory
- Fix: Add the cwd parameter or include the API key directly in the config
"No output from o3_chat"
- Ensure you include a system message
- Check message formatting (proper JSON structure)
"Model not found"
- Confirm you have access to o3/o3-mini models in your OpenAI account
- Check for typos in model names
Debug Mode
The server logs detailed information to stderr. To see logs when running standalone:
node index.js 2> debug.log
Performance Tips
- Use o3-mini for rapid prototyping and testing
- Batch related questions into multi-turn conversations
- Be specific in your prompts for better responses
- Set appropriate max_tokens limits to control costs
Security Considerations
- Never commit your .env file or API keys
- Use environment variables for all sensitive data
- Consider implementing rate limiting for production use
- Regularly rotate API keys
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/AmazingFeature)
- Commit your changes (git commit -m 'Add some AmazingFeature')
- Push to the branch (git push origin feature/AmazingFeature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- OpenAI for the o3 and o3-mini models
- Anthropic for the MCP protocol specification
- The MCP community for tooling and support
Created by @GitMaxd
