@aexol/spectral
Coding agent that never sleeps.
What is Spectral?
Spectral is the always-on coding agent for Aexol. You install
it on every machine you want to code on, run spectral serve, and your devices
appear in the Aexol web UI as live agents you can drive from any browser tab — no
port forwarding, no SSH, no exposing localhost.
Under the hood, Spectral is a thin branded wrapper around pi by Mario Zechner, plus a relay client that connects each machine to Aexol's backend. Browsers talk to your machines through that relay; the backend never sees your code or your messages — those stay on the device. Model API keys are handled differently depending on which command you run (see How inference is routed).
Install
npm install -g @aexol/spectral

Requires Node.js 20 or newer.
Quickstart
Authenticate
spectral login

You'll be prompted for your Aexol MCP URL (defaults to https://api.aexol.ai/mcp) and a team API key (sk-aexol-team-…). Credentials are written to ~/.spectral/config.json with mode 0600.

Start the agent

spectral serve

This registers the machine with your team, opens a long-lived WebSocket to the Aexol relay, and stays up. Reconnects are automatic with exponential backoff. Leave it running — that's the point.
Open the browser
Visit the Aexol web UI. Your machine appears in the picker; pick it and you can create projects, open sessions, and chat with the agent on that machine from any tab.
You can also run Spectral as a plain local TUI without the relay — just invoke
spectral (no subcommand) and it acts as a normal pi terminal session with the
Aexol MCP extension auto-loaded.
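To keep spectral serve alive across reboots on a Linux machine, a process supervisor works well. Below is a minimal systemd user-unit sketch; the unit name, binary path, and PATH assumption are ours, not part of Spectral — adjust for your install:

```ini
# ~/.config/systemd/user/spectral.service  (hypothetical unit name)
[Unit]
Description=Spectral always-on coding agent
After=network-online.target

[Service]
# Assumes the global npm bin directory is on PATH; point ExecStart
# at your actual spectral binary if it is not.
ExecStart=/usr/bin/env spectral serve
Restart=always
RestartSec=5

[Install]
WantedBy=default.target
```

Enable it with `systemctl --user enable --now spectral`, and run `loginctl enable-linger` if the machine should keep serving while you are logged out.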
Commands
| Command | Description |
|--------------------|-------------------------------------------------------------------------------|
| spectral | Local TUI. Forwards all flags to pi; loads the bundled Aexol MCP extension. |
| spectral login | Interactive auth. Verifies the key against the MCP backend and stores it. |
| spectral logout | Removes ~/.spectral/config.json. Idempotent. |
| spectral serve | Connect this machine to the Aexol relay. Stays up; survives reconnects. |
| spectral --version | Print version. |
| spectral --help | Print Spectral header, then pi's full help. |
spectral serve flags
| Flag | Description |
|----------------------------|-------------------------------------------------------------------|
| --machine-name <name> | Override the display name (default: os.hostname()). |
Anything that isn't a Spectral subcommand is forwarded verbatim to pi, so any
pi flag you know works. Example: spectral -p "summarize this repo".
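The forwarding rule amounts to a small dispatch step: known subcommands are handled by Spectral, everything else goes to pi untouched. An illustrative sketch in Node-flavored JavaScript (the function and set names are ours, not Spectral's internals):

```javascript
// Hypothetical sketch of Spectral's argv dispatch rule.
const SUBCOMMANDS = new Set(['login', 'logout', 'serve']);

function dispatch(argv) {
  const [first] = argv;
  if (SUBCOMMANDS.has(first)) {
    // Spectral handles its own subcommands.
    return { mode: first, args: argv.slice(1) };
  }
  // Anything else — flags included — is forwarded verbatim to pi.
  return { mode: 'pi', args: argv };
}
```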
How it works
┌────────────────┐ ┌─────────────────┐ ┌────────────────┐
│ Browser tab │────────▶│ Aexol backend │◀────────│ spectral serve │
│ (Aexol web UI) │ WSS │ (relay) │ WSS │ (your machine) │
└────────────────┘ └─────────────────┘ └────────────────┘
│
identity + routing only
(no message content, no code)

- Your machine runs spectral serve and registers with the relay using a machine JWT issued at first run.
- Browser sessions for that machine open a WebSocket to the backend. The backend forwards every frame to your machine and back — it never reads or stores message content.
- All your local state — projects, sessions, messages, pi auth tokens — lives on the device. The backend only knows machine identity (id, display name, hostname, last-seen) and team membership.
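The relay's job can be pictured as a routing table keyed by machine identity. The key property, per the diagram above, is that the payload passes through opaquely — it is routed, never parsed. An illustrative sketch (names and shapes are assumptions, not the actual relay code):

```javascript
// Hypothetical relay forwarding rule: route frames by machine id only.
function forward(frame, connections) {
  // Routing decision uses identity alone...
  const peer = connections.get(frame.machineId);
  // ...and the payload is passed through without being inspected.
  if (peer) peer.send(frame.payload);
}
```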
How inference is routed
Spectral has two distinct execution paths, and they handle model API keys differently. This is intentional — pick the one that matches your security model.
spectral (CLI / TUI mode) spectral serve (relay mode)
───────────────────────── ─────────────────────────────
pi reads ~/.pi/agent/auth.json → pi runs in-process via PiBridge
↓ ↓
local Anthropic / OpenAI keys ALL inference → backend `/v1` proxy
↓ ↓
direct call to provider backend uses centralized API keys
↓
scoped to team's BaseModel whitelist

- spectral (CLI subprocess mode) — pi runs as a normal subprocess and uses whatever provider keys you've stored locally in ~/.pi/agent/auth.json (Anthropic, OpenAI, Cerebras, Google, custom OpenAI-compatible endpoints). Spectral never reads or transmits these. This is the classic local-only flow.
- spectral serve (relay mode) — pi runs in-process inside the serve daemon. All inference traffic is proxied through the Aexol backend's /v1/messages and /v1/chat/completions endpoints, authenticated with the per-machine JWT. The backend holds the upstream provider keys and enforces a per-team BaseModel whitelist server-side. The local ~/.pi/agent/auth.json is not read in this mode (AuthStorage.inMemory()).
Why two paths? spectral serve is designed for shared / managed machines
where the team controls which models are usable and operators don't want
provider keys sitting on every box. spectral (no subcommand) is the
unmanaged TUI path and behaves like a vanilla pi install.
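As a rough picture of the relay-mode path, a serve daemon would shape each proxied inference call roughly like this. Only the /v1/messages endpoint and JWT authentication come from the description above; the helper itself, the Bearer scheme, and the field names are illustrative assumptions:

```javascript
// Hypothetical sketch: building a relay-mode inference request.
// The machine JWT authenticates the call; no provider keys exist locally.
function buildInferenceRequest(backendUrl, machineJwt, body) {
  return {
    url: new URL('/v1/messages', backendUrl).toString(),
    headers: {
      Authorization: `Bearer ${machineJwt}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body),
  };
}
```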
Configuration
| Path / variable | Purpose |
|-------------------------------------------|--------------------------------------------------------|
| ~/.spectral/config.json | Aexol MCP URL + team API key. Created by spectral login. Mode 0600. |
| ~/.spectral/machine.json | Machine identity + relay JWT. Created on first spectral serve. |
| ~/.spectral/sessions.db | Local SQLite for projects, sessions, messages. |
| SPECTRAL_CONFIG_DIR | Override the directory above. |
| SPECTRAL_MCP_URL | Override the MCP URL at login time. |
| SPECTRAL_BACKEND_URL | Override the backend HTTP base for spectral serve. |
| SPECTRAL_RELAY_URL | Override the derived relay WebSocket URL. |
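The overrides compose in the obvious way: an explicit variable wins, otherwise a default is derived. A sketch of plausible resolution logic — the precedence order and the http-to-ws derivation shown here are assumptions based on the table above, not confirmed behavior:

```javascript
// Hypothetical resolution of SPECTRAL_* overrides.
function resolveConfigDir(env, home) {
  // Explicit override wins; otherwise fall back to ~/.spectral.
  return env.SPECTRAL_CONFIG_DIR ?? `${home}/.spectral`;
}

function resolveRelayUrl(env, backendUrl) {
  if (env.SPECTRAL_RELAY_URL) return env.SPECTRAL_RELAY_URL;
  // Assumed derivation: swap the http(s) scheme for ws(s).
  return backendUrl.replace(/^http/, 'ws');
}
```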
Pi's own auth state for the local TUI path (Anthropic, OpenAI, etc.) lives
in ~/.pi/agent/auth.json on the same machine. Spectral never reads it and
never sends it anywhere. Note that spectral serve does not use this
file — it routes inference through the backend proxy instead (see
How inference is routed).
Multiple machines
You can run spectral serve on as many machines as you like under one team —
each gets its own machine identity and its own SQLite. The browser picker
lists all of them; switching machines shows that machine's project list and
session history. Switching is a hard context change: the previous selection
is cleared so you don't accidentally talk to the wrong device.
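The "hard context change" can be sketched as state replacement rather than a merge — the new selection starts from a blank project and session. Field names here are hypothetical:

```javascript
// Hypothetical sketch: switching machines replaces the UI context wholesale.
function selectMachine(machineId) {
  // Previous project/session selection is intentionally dropped,
  // so you cannot keep chatting against the wrong device.
  return { machineId, projectId: null, sessionId: null };
}
```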
Troubleshooting
My machine isn't showing up in the browser picker. Make sure spectral serve is still running (it logs reconnect attempts to stderr). If spectral login was run a long time ago and the team key was rotated, re-run spectral login.

WebSocket keeps disconnecting. The relay client reconnects automatically with exponential backoff. Brief network blips are expected and handled. If the backoff loop is constant, check that your team API key is still valid and that your network allows outbound WebSocket connections to the configured backend.

better-sqlite3 errors on first spectral serve. This usually means the native module didn't compile during install. Try cd ~/.spectral && npm rebuild better-sqlite3, or reinstall Spectral after ensuring you have a working C/C++ toolchain (make, a C compiler, Python).

I want to revoke a machine. Stop spectral serve on that device. Machine revocation from the Aexol UI is on the roadmap; today the most reliable approach is to rotate the team API key, which invalidates every machine's relay JWT for that team.
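For intuition about what "exponential backoff" looks like here, a capped doubling schedule like the following is typical. The base delay and cap are assumptions for illustration — Spectral's actual schedule isn't documented:

```javascript
// Hypothetical capped exponential backoff schedule:
// 1s, 2s, 4s, ... doubling until it hits the cap.
function backoffDelays(attempts, baseMs = 1000, capMs = 60000) {
  const delays = [];
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.min(baseMs * 2 ** i, capMs));
  }
  return delays;
}
```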
Privacy & data
- Model API keys:
  - For spectral (CLI / TUI mode): live ONLY on the machine, in pi's own ~/.pi/agent/auth.json. Never read or transmitted by Spectral.
  - For spectral serve (relay mode): live on the backend, not the machine. The local machine holds only its machine JWT; provider keys are managed centrally and scoped to the team's BaseModel whitelist.
- Code, messages, file contents, and generated artifacts live ONLY on the machine, in ~/.spectral/sessions.db and the working directory you point spectral serve at.
- The backend stores: machine identity (id, display name, hostname, last-seen timestamps), the relay JWT issued at registration, and team membership. For spectral serve, it also holds the centralized provider API keys used to fulfil inference requests on behalf of authorized machines.
- The backend does not store: prompts, responses, tool calls, files, artifacts, or any other message-channel content.
License
MIT — see LICENSE.
Links
- Website: https://aexol.com
- Source is currently hosted in an internal Aexol repository; public mirror TBD.
- Issues: please file them with your Aexol contact (a public issue tracker is not yet available).
