
chatgpt-webui-mcp v0.1.5

Standalone MCP server for ChatGPT Web UI via session token

Downloads: 598

Readme


quick start

install from npm:

npm i -g chatgpt-webui-mcp

manual run:

CHATGPT_SESSION_TOKEN="your_token_here" chatgpt-webui-mcp

from source:

npm install
npm run build
CHATGPT_SESSION_TOKEN="your_token_here" node dist/index.js

important: this uses chatgpt's internal webui api with a session cookie. for personal/local tinkering only - not affiliated with openai.


overview

chatgpt-webui-mcp is a standalone MCP server that drives chatgpt.com via camofox (UI automation).

it is built for long-running tasks (gpt-5.2 pro runs that take 1h+), deep research, and image generation mode.


getting your session token

  1. open https://chatgpt.com and log in
  2. open devtools
  3. application -> cookies -> https://chatgpt.com
  4. copy the value of __Secure-next-auth.session-token
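once copied, the token can be stored in a file and referenced via CHATGPT_SESSION_TOKEN_FILE. a minimal sketch — the ~/.config path matches the self-test example later in this readme, but any path works:

```shell
# store the session token in a file readable only by you
# (replace your_token_here with the cookie value copied from devtools)
mkdir -p ~/.config/chatgpt-webui-mcp
printf '%s' 'your_token_here' > ~/.config/chatgpt-webui-mcp/session-token.txt
chmod 600 ~/.config/chatgpt-webui-mcp/session-token.txt
```

keeping the token in a mode-600 file (instead of inline in client config) avoids leaking it via config files that get synced or committed.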

configuration

because this server speaks stdio or sse, you configure it as a local command (or a remote url) and pass the token via environment variables.

mcp client config (claude desktop, opencode, etc)

{
  "mcpServers": {
    "chatgpt-webui": {
      "command": "node",
      "args": ["/absolute/path/to/chatgpt-webui-mcp/dist/index.js"],
      "timeout": 7200000,
      "env": {
        "CHATGPT_SESSION_TOKEN_FILE": "/path/to/session-token.txt",
        "CHATGPT_BROWSER_BASE_URL": "http://127.0.0.1:9377",
        "CHATGPT_WAIT_TIMEOUT_MS": "7200000"
      }
    }
  }
}

notes:

  • legacy CHATGPT_CAMOFOX_* env vars are still supported for compatibility.
  • CHATGPT_TRANSPORT=httpcloak is an optional fallback mode for advanced/debug scenarios.
  • when model/thinking are omitted, requests default to gpt-5-2 (auto), not pro.


opencode workflow (the natural language style)

if you want to type commands like:

  • with chatgpt webui on gpt 5.2 pro extended thinking: <prompt>
  • do deepresearch with chatgpt webui on <topic>

use this tool:

| tool | what it does |
|------|--------------|
| chatgpt_webui_command | parses your sentence into the right call and runs it |

example:

{
  "name": "chatgpt_webui_command",
  "arguments": {
    "command": "with chatgpt webui on gpt 5.2 pro extended thinking: write a 1-page memo about X",
    "mode": "auto"
  }
}

long runs (recommended)

use the unified tools:

| tool | description |
|------|-------------|
| chatgpt_webui_prompt | main tool. mode=auto chooses wait vs background |
| chatgpt_webui_run | check/wait for background runs (run_id) |

why: deep research and gpt-5.2 pro can take a long time and may exceed a single client timeout. mode=auto returns a run_id for long jobs.
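a sketch of the background flow, following the chatgpt_webui_command example above — the prompt argument name is an assumption; mode and run_id are from the table:

```json
{
  "name": "chatgpt_webui_prompt",
  "arguments": {
    "prompt": "do a deep research pass on topic X",
    "mode": "auto"
  }
}
```

if the server decides the job is long, the response carries a run_id; pass it to chatgpt_webui_run to check or wait for the result instead of holding one client request open for hours.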


image generation

set create_image=true to switch chatgpt into image generation mode before sending the prompt.

notes:

  • image_urls is best-effort (derived from page links + visited urls) and may be empty depending on how chatgpt renders images in the webui.
  • fallback screenshot output is returned in image_data_url (not image_urls) when enabled and size-capped.
  • enable fallback with CHATGPT_IMAGE_SCREENSHOT_FALLBACK=1.
  • cap fallback size with CHATGPT_IMAGE_SCREENSHOT_MAX_BYTES (default 2097152, 2 MiB).
  • for reliable retrieval, you can also use the conversation_id and open the chatgpt UI.
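as a sketch, a prompt call with image mode enabled (create_image is from the section above; the prompt argument name is an assumption):

```json
{
  "name": "chatgpt_webui_prompt",
  "arguments": {
    "prompt": "a watercolor painting of a lighthouse at dusk",
    "create_image": true
  }
}
```

check image_urls first; if the screenshot fallback is enabled and image_urls comes back empty, fall back to image_data_url.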

self-test

# env
CHATGPT_SESSION_TOKEN="your_token_here" npm run self-test

# cli flag
npm run self-test -- --token "your_token_here"

# file
echo "your_token_here" > ~/.config/chatgpt-webui-mcp/session-token.txt
npm run self-test -- --token-file ~/.config/chatgpt-webui-mcp/session-token.txt

remote deployment over tailscale (optional)

if you want background runs to survive for a long time, run this server as an always-on SSE service.

  1. copy templates from this repo:

  • deploy/systemd/chatgpt-webui-mcp.env.example
  • deploy/systemd/chatgpt-webui-mcp-sse.sh
  • deploy/systemd/chatgpt-webui-mcp.service

  2. install and enable the service (user service):

mkdir -p ~/.config ~/.config/systemd/user ~/.local/bin ~/.local/share/chatgpt-webui-mcp
cp deploy/systemd/chatgpt-webui-mcp.env.example ~/.config/chatgpt-webui-mcp.env
cp deploy/systemd/chatgpt-webui-mcp-sse.sh ~/.local/bin/chatgpt-webui-mcp-sse.sh
cp deploy/systemd/chatgpt-webui-mcp.service ~/.config/systemd/user/chatgpt-webui-mcp.service
chmod 600 ~/.config/chatgpt-webui-mcp.env
chmod 755 ~/.local/bin/chatgpt-webui-mcp-sse.sh
systemctl --user daemon-reload
systemctl --user enable --now chatgpt-webui-mcp.service

  3. point opencode (cloud host) to the endpoint:
{
  "mcp": {
    "chatgpt-webui": {
      "type": "remote",
      "url": "http://<tailscale-ip>:8791/sse",
      "enabled": true,
      "timeout": 7200000,
      "oauth": false
    }
  }
}

tools

| tool | description |
|------|-------------|
| chatgpt_webui_session | validate token and return session payload |
| chatgpt_webui_models | list available models |
| chatgpt_webui_command | natural-language command wrapper |
| chatgpt_webui_prompt | unified prompt tool (wait/background) |
| chatgpt_webui_run | check/wait for background runs |
| chatgpt_webui_ask | direct wait-style prompt tool (legacy/simple) |


project structure

chatgpt-webui-mcp/
├── deploy/
│   └── systemd/
│       ├── chatgpt-webui-mcp.env.example
│       ├── chatgpt-webui-mcp-sse.sh
│       └── chatgpt-webui-mcp.service
├── src/
│   ├── index.ts               # MCP server
│   └── chatgpt-webui-client.ts # WebUI automation client
├── package.json
├── tsconfig.json
├── .env.example
├── .gitignore
├── LICENSE
├── INSTALL.md
└── README.md

license

MIT


author

Microck