# multi-agent-room

A minimal Claude Code conversation viewer — runs as an HTTP proxy in front of the Anthropic API, captures every `/v1/messages` request/response, and shows the conversation (user / assistant / tool_use / tool_result / thinking) in a local web UI.

Inspired by cc-viewer; this is a stripped-down, single-file-per-concern version.
## How it works

```
Claude Code ──HTTP──▶ multi-agent-room :7080 ──HTTPS──▶ api.anthropic.com
                              │
                              ├─ records request + assembled SSE response
                              │  to ~/.mar/logs/<sessionId>.jsonl
                              │
                              └─ pushes new entries via SSE to the React UI
```

## Install
```sh
git clone https://github.com/weiesky/multi-agent-room.git
cd multi-agent-room
npm install   # runs `vite build` automatically via the prepare script
```

## Usage
Terminal A — start the server:

```sh
# After `npm install -g .` (or `npm link`):
mar

# Or, from source without a global install:
node bin/multi-agent-room.js
```

The CLI is also installed as `multi-agent-room` for backward compatibility, but `mar` is the canonical short command going forward.
You'll see:

```
✓ mar running at http://127.0.0.1:7080
```

Terminal B — point Claude Code at the proxy:

```sh
export ANTHROPIC_BASE_URL=http://127.0.0.1:7080
claude
```

Browser: open http://127.0.0.1:7080 to view conversations. The current session shows up in the left list with a live tag and updates in real time.
⚠ If you also have `ANTHROPIC_BASE_URL` set in `~/.claude/settings.json`, that takes precedence over the shell `export`. Either remove it from settings.json, or edit it to point at this proxy.
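If you'd rather keep the override in the settings file, a minimal sketch of `~/.claude/settings.json` pointing at the proxy (assuming your settings file uses Claude Code's `env` block; merge with any existing keys rather than replacing the file):

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://127.0.0.1:7080"
  }
}
```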
## Development

```sh
npm run dev   # starts the server (:7080) and Vite dev server (:5173) together
```

Vite proxies /api, /events, and /v1 to the backend, so opening http://127.0.0.1:5173 gives you HMR with live data.
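The dev proxying can be sketched in a `vite.config.js` roughly like this (illustrative only; the repo's actual config may differ in plugins and options):

```javascript
// Illustrative vite.config.js — forwards API, SSE, and proxy routes
// to the backend on :7080 so the :5173 dev server serves live data.
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    proxy: {
      "/api": "http://127.0.0.1:7080",
      "/events": "http://127.0.0.1:7080",
      "/v1": "http://127.0.0.1:7080",
    },
  },
});
```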
## Environment variables
| Variable | Default | Description |
|---|---|---|
| PORT | 7080 | Server port. The server scans 7080–8000 and binds the first free port. PORT, if set, must fall in that range and acts as the start of the scan; out-of-range values are warned about and ignored. The actual bound port is printed in the startup banner. |
| MULTI_AGENT_ROOM_LOG_DIR | ~/.mar/logs | Where session JSONL files are written |
| MULTI_AGENT_ROOM_UPSTREAM | https://api.anthropic.com | Upstream Anthropic /v1/messages API to forward to |
| MULTI_AGENT_ROOM_OPENAI_UPSTREAM | (none) | Upstream host for the OpenAI Responses API used by Codex CLI. Falls back to OPENAI_UPSTREAM_URL → MULTI_AGENT_ROOM_UPSTREAM → https://api.openai.com. Required when codex's [model_providers.<name>].base_url points to a non-OpenAI host (e.g. a third-party provider like aicodewith / cch); otherwise the proxy will forward /chatgpt/v1/responses to api.openai.com and every request will fail. |
| MULTI_AGENT_ROOM_AGENTS_FILE | ~/.mar/agents.json | Path to the agents config file |
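The OpenAI upstream fallback chain from the table can be sketched as a small resolver (an assumed helper for illustration, not the server's actual code):

```javascript
// Resolves the OpenAI upstream per the documented fallback chain:
// MULTI_AGENT_ROOM_OPENAI_UPSTREAM → OPENAI_UPSTREAM_URL
//   → MULTI_AGENT_ROOM_UPSTREAM → https://api.openai.com
const OPENAI_DEFAULT = "https://api.openai.com";

function resolveOpenAIUpstream(env) {
  return (
    env.MULTI_AGENT_ROOM_OPENAI_UPSTREAM ||
    env.OPENAI_UPSTREAM_URL ||
    env.MULTI_AGENT_ROOM_UPSTREAM ||
    OPENAI_DEFAULT
  );
}
```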
## Agents (stage 1)
The same server hosts a small agent configuration store at ~/.mar/agents.json (next to logs/). Open the Agents tab in the web UI to manage it, or use the JSON API:
```sh
# List
curl http://127.0.0.1:7080/api/agents

# Create
curl -X POST http://127.0.0.1:7080/api/agents \
  -H 'Content-Type: application/json' \
  -d '{"name":"my-claude","type":"claude-code","workspace":"/Users/me/proj","env":{"FOO":"bar"}}'

# Get / patch / delete
curl http://127.0.0.1:7080/api/agents/my-claude
curl -X PATCH http://127.0.0.1:7080/api/agents/my-claude \
  -H 'Content-Type: application/json' -d '{"workspace":"/new/path"}'
curl -X DELETE http://127.0.0.1:7080/api/agents/my-claude
```

Each agent record stores: `name` (unique, `^[A-Za-z0-9_-]{1,64}$`), `type` (one of openclaw / claude-code / codex / hermes / gemini-cli), `workspace` (absolute path), `env` (string→string), and three free-form JSON fields: `mcp`, `skills`, `ext_info`. Stage 1 only stores; spawning agents and @name routing arrive in stage 2 — by convention, stage-2 spawn parameters live under `ext_info.spawn = { command, args, cwd }`.
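The validation rules above can be sketched as a small checker (`validateAgent` is a hypothetical helper for illustration; the server's actual validation code may differ):

```javascript
// Checks an agent record against the documented constraints:
// name matches ^[A-Za-z0-9_-]{1,64}$, type is in the known set,
// workspace is an absolute path.
const NAME_RE = /^[A-Za-z0-9_-]{1,64}$/;
const TYPES = ["openclaw", "claude-code", "codex", "hermes", "gemini-cli"];

function validateAgent(agent) {
  if (!NAME_RE.test(agent.name || "")) return "invalid name";
  if (!TYPES.includes(agent.type)) return "unknown type";
  if (!agent.workspace || !agent.workspace.startsWith("/")) return "workspace must be absolute";
  return null; // valid
}
```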
The store writes atomically (`agents.json.tmp` → rename) and keeps a single-step `agents.json.bak`; on corruption it recovers from `.bak`, or archives the broken file as `agents.json.broken-<ts>`.
Don't run two `multi-agent-room` processes against the same `~/.mar/` — there's no inter-process lock. One server at a time.
## What's stored
`~/.mar/logs/<sessionKey>-<YYYYMMDD>-<seq>.jsonl` — one JSON object per line, one line per `/v1/messages` request/response pair:

```jsonc
{
  "id": "uuid",
  "sessionId": "37a8eec1-20260428-001",
  "ts": "2026-04-28T15:30:00.000Z",
  "method": "POST",
  "url": "/v1/messages",
  "status": 200,
  "durationMs": 1234,
  "preflight": false,
  "request": {
    "headers": { /* x-api-key, authorization, cookies removed */ },
    "body": { "model": "...", "messages": [...], "tools": [...], "system": "..." }
  },
  "response": {
    "role": "assistant",
    "content": [ {"type":"text",...}, {"type":"tool_use",...} ],
    "usage": {...}
  }
}
```

`sessionKey` is derived from `metadata.user_id` if present, otherwise from a hash of the system prompt — so different cwds / Claude Code instances get different sessions automatically.
## What's intentionally not here
cc-viewer's full feature set is much larger. This project deliberately omits:
- Markdown rendering, syntax highlighting, mermaid
- Terminal PTY, statistics charts, Electron, mobile layout
- Multi-account / proxy profile switching, hot-reload
- Hooks / plugin system
- Delta storage / incremental log compaction
- MainAgent / subagent classification
If you need any of those, use cc-viewer.
## Releasing

A push of a `v*` tag triggers `.github/workflows/release.yml`:

- Build `dist/` and pack a tarball with `npm pack`
- Create a GitHub Release with auto-generated notes (tarball attached)
- Run `npm publish --provenance` if `NPM_TOKEN` is configured as a repo secret
Pre-flight checklist (do once, before the first release):

- [ ] Add `NPM_TOKEN` (an Automation token from https://www.npmjs.com/settings/~/tokens) under the repo's Settings → Secrets and variables → Actions. Without it the workflow still creates the GitHub Release but silently skips the npm step.
- [ ] Confirm the package name `multi-agent-room` is owned by your npm account (or available — first publish claims it).
- [ ] Run `npm pack --dry-run` locally to eyeball the tarball contents.
```sh
# bump version, tag, push
npm version patch     # or minor / major
git push --follow-tags
```

## License

MIT
