# o3-session-mcp
An MCP (Model Context Protocol) server that provides session-based interactions with OpenAI's o3 model. This server allows you to create persistent conversation sessions with o3, leveraging its advanced web search capabilities while maintaining conversation context across multiple interactions.
## Features
- **Session Management**: Create and manage multiple conversation sessions with unique IDs
- **Context Preservation**: Maintains full conversation history within each session
- **Web Search Integration**: Leverages o3's web search capabilities for up-to-date information
- **Flexible Configuration**: Customize search context size and reasoning effort levels
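As a rough mental model, session management of this kind can be implemented with an in-memory map keyed by session ID. The sketch below is illustrative only; the `Session` shape and the `createSession`/`appendMessage` helpers are assumptions, not this package's actual internals:

```typescript
import { randomUUID } from "node:crypto";

// Illustrative sketch only: the real package's internal types may differ.
interface Session {
  id: string;
  title?: string;
  history: { role: "user" | "assistant"; content: string }[];
  lastAccessed: number;
}

const sessions = new Map<string, Session>();

// Start a session seeded with the initial prompt.
function createSession(initialPrompt: string, title?: string): Session {
  const session: Session = {
    id: randomUUID(),
    title,
    history: [{ role: "user", content: initialPrompt }],
    lastAccessed: Date.now(),
  };
  sessions.set(session.id, session);
  return session;
}

// Append a follow-up message, preserving the full history for context.
function appendMessage(id: string, content: string): Session {
  const session = sessions.get(id);
  if (!session) throw new Error(`Unknown session: ${id}`);
  session.history.push({ role: "user", content });
  session.lastAccessed = Date.now();
  return session;
}
```

Because the whole history stays attached to the session, every later call can hand the model the full conversation so far, which is what makes the responses context-aware.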
## Installation
### Using npx (Recommended)
Claude Code:
```bash
$ claude mcp add o3-session -s user \
  -e OPENAI_API_KEY=your-api-key \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -- npx o3-session-mcp
```

JSON configuration:
```json
{
  "mcpServers": {
    "o3-session": {
      "command": "npx",
      "args": ["o3-session-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        "SEARCH_CONTEXT_SIZE": "medium",
        "REASONING_EFFORT": "medium"
      }
    }
  }
}
```

`SEARCH_CONTEXT_SIZE` and `REASONING_EFFORT` are optional (`low`, `medium`, or `high`; default: `medium`).

### Local Development Setup
If you want to download and run the code locally:
```bash
# setup
git clone [email protected]:inaridiy/o3-session-mcp.git
cd o3-session-mcp
pnpm install
pnpm build
```

Claude Code:
```bash
$ claude mcp add o3-session -s user \
  -e OPENAI_API_KEY=your-api-key \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -- node /path/to/o3-session-mcp/build/index.js
```

JSON configuration:
```json
{
  "mcpServers": {
    "o3-session": {
      "command": "node",
      "args": ["/path/to/o3-session-mcp/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        "SEARCH_CONTEXT_SIZE": "medium",
        "REASONING_EFFORT": "medium"
      }
    }
  }
}
```

`SEARCH_CONTEXT_SIZE` and `REASONING_EFFORT` are optional (`low`, `medium`, or `high`; default: `medium`).

## Available Tools
### create-o3-session
Creates a new o3 session with advanced web search capabilities. Returns a session ID that can be used to continue the conversation.
Parameters:
- `initialPrompt` (string, required): Initial prompt to start the conversation with o3
- `metadata` (object, optional): Optional metadata for the session
  - `title` (string, optional): Title for the session
  - `description` (string, optional): Description of the session's purpose
Example:
```typescript
await create_o3_session({
  initialPrompt: "What are the latest developments in quantum computing?",
  metadata: {
    title: "Quantum Computing Research",
    description: "Research session on recent quantum computing advances"
  }
});
```

### send-o3-session
Sends a message to an existing o3 session, maintaining conversation history for context-aware responses.
Parameters:
- `sessionId` (string, required): The session ID returned from `create-o3-session`
- `message` (string, required): The message to send to the o3 session
Example:
```typescript
await send_o3_session({
  sessionId: "abc123def456...",
  message: "Can you explain the implications for cryptography?"
});
```

### list-o3-sessions
Lists all active o3 sessions with their metadata and last-accessed time.
Parameters: None
Example:
```typescript
await list_o3_sessions();
```

## Usage Examples
### Creating a Research Session
1. Create a new session for researching a topic:

```typescript
const result = await create_o3_session({
  initialPrompt: "I need to understand the current state of AI regulation in the EU",
  metadata: {
    title: "EU AI Regulation Research",
    description: "Comprehensive research on EU AI Act and related policies"
  }
});
// Returns session ID and initial response
```

2. Continue the conversation with follow-up questions:
```typescript
await send_o3_session({
  sessionId: "your-session-id",
  message: "What are the main compliance requirements for LLM providers?"
});
```

3. Ask for specific details:
```typescript
await send_o3_session({
  sessionId: "your-session-id",
  message: "Can you find recent news about enforcement actions?"
});
```

### Technical Troubleshooting Session
1. Start a troubleshooting session:

```typescript
const result = await create_o3_session({
  initialPrompt: "I'm getting a 'Module not found' error in my Next.js 14 app when importing from '@/components'",
  metadata: {
    title: "Next.js Import Error",
    description: "Debugging module resolution issues"
  }
});
```

2. Provide additional context:
```typescript
await send_o3_session({
  sessionId: "your-session-id",
  message: "Here's my tsconfig.json: [paste config]. The error happens in production build only."
});
```

## Environment Variables
- `OPENAI_API_KEY` (required): Your OpenAI API key
- `SEARCH_CONTEXT_SIZE` (optional): Controls the amount of search context
  - `low`: Minimal search context
  - `medium`: Balanced search context (default)
  - `high`: Maximum search context
- `REASONING_EFFORT` (optional): Controls the reasoning depth
  - `low`: Faster responses with basic reasoning
  - `medium`: Balanced speed and reasoning (default)
  - `high`: Deeper reasoning for complex queries
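As a sketch of how a server like this might consume these variables at startup (the `readConfig` and `parseLevel` helpers are illustrative assumptions, not this package's actual code), unset or unrecognized values fall back to the documented `medium` default, while a missing API key is a hard error:

```typescript
type Level = "low" | "medium" | "high";

// Illustrative: accept only the documented levels, defaulting to "medium".
function parseLevel(value: string | undefined): Level {
  return value === "low" || value === "high" ? value : "medium";
}

// Illustrative: read configuration from the environment, failing fast
// when the required API key is absent.
function readConfig(env: Record<string, string | undefined> = process.env) {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) throw new Error("OPENAI_API_KEY is required");
  return {
    apiKey,
    searchContextSize: parseLevel(env.SEARCH_CONTEXT_SIZE),
    reasoningEffort: parseLevel(env.REASONING_EFFORT),
  };
}
```

Failing fast on the required key surfaces misconfiguration at server start rather than on the first tool call.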
## Notes
- Sessions are stored in memory and will be lost when the server restarts
- For production use, consider implementing persistent storage (Redis, database, etc.)
- The o3 model uses web search to provide up-to-date information
- Each session maintains its own conversation history for context-aware responses
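One minimal way to add the persistence mentioned above is to serialize the session map to JSON on disk. This is purely a sketch under assumptions of my own (the `saveSessions`/`loadSessions` names are hypothetical); a production deployment would more likely use Redis or a database:

```typescript
import { writeFileSync, readFileSync, existsSync } from "node:fs";

// Illustrative: dump the session map to a JSON file so it can be
// reloaded after a server restart.
function saveSessions(sessions: Map<string, unknown>, path: string): void {
  writeFileSync(path, JSON.stringify([...sessions.entries()]));
}

// Illustrative: rebuild the map from disk, or start empty if no
// snapshot exists yet.
function loadSessions(path: string): Map<string, unknown> {
  if (!existsSync(path)) return new Map();
  return new Map(JSON.parse(readFileSync(path, "utf8")));
}
```

Snapshotting on every write is the simplest policy; anything beyond a handful of sessions would want an external store instead.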
## License
MIT
