
@aexol/spectral

v0.1.3 · 758 downloads

Always-on coding agent for Aexol — branded pi wrapper with relay-based browser access.

Readme

@aexol/spectral

Coding agent that never sleeps.

What is Spectral?

Spectral is the always-on coding agent for Aexol. You install it on every machine you want to code on, run spectral serve, and your devices appear in the Aexol web UI as live agents you can drive from any browser tab — no port forwarding, no SSH, no exposing localhost.

Under the hood, Spectral is a thin branded wrapper around pi by Mario Zechner, plus a relay client that connects each machine to Aexol's backend. Browsers talk to your machines through that relay; the backend never sees your code or your messages — those stay on the device. Model API keys are handled differently depending on which command you run (see How inference is routed).

Install

npm install -g @aexol/spectral

Requires Node.js 20 or newer.

Quickstart

  1. Authenticate

    spectral login

    You'll be prompted for your Aexol MCP URL (defaults to https://api.aexol.ai/mcp) and a team API key (sk-aexol-team-…). Credentials are written to ~/.spectral/config.json with mode 0600.

  2. Start the agent

    spectral serve

    This registers the machine with your team, opens a long-lived WebSocket to the Aexol relay, and stays up. Reconnects are automatic with exponential backoff. Leave it running — that's the point.

  3. Open the browser

    Visit the Aexol web UI. Your machine appears in the picker; pick it and you can create projects, open sessions, and chat with the agent on that machine from any tab.

You can also run Spectral as a plain local TUI without the relay — just invoke spectral (no subcommand) and it acts as a normal pi terminal session with the Aexol MCP extension auto-loaded.

Commands

| Command            | Description                                                                 |
|--------------------|-----------------------------------------------------------------------------|
| spectral           | Local TUI. Forwards all flags to pi; loads the bundled Aexol MCP extension. |
| spectral login     | Interactive auth. Verifies the key against the MCP backend and stores it.   |
| spectral logout    | Removes ~/.spectral/config.json. Idempotent.                                |
| spectral serve     | Connect this machine to the Aexol relay. Stays up; survives reconnects.     |
| spectral --version | Print version.                                                              |
| spectral --help    | Print Spectral header, then pi's full help.                                 |

spectral serve flags

| Flag                  | Description                                          |
|-----------------------|------------------------------------------------------|
| --machine-name <name> | Override the display name (default: os.hostname()). |

Anything that isn't a Spectral subcommand is forwarded verbatim to pi, so any pi flag you already use will work. Example: spectral -p "summarize this repo".
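The dispatch rule above reduces to a small pure function (the subcommand list is taken from the table above; how the forwarding is implemented internally is an assumption):

```javascript
// Sketch of Spectral's argv dispatch, mirroring the rule in the README:
// known subcommands are handled by Spectral itself, anything else is
// passed through to pi verbatim. (--version / --help are also intercepted
// per the command table; omitted here for brevity.)
const SPECTRAL_SUBCOMMANDS = new Set(['login', 'logout', 'serve']);

function dispatch(argv) {
  const [first, ...rest] = argv;
  if (SPECTRAL_SUBCOMMANDS.has(first)) {
    return { handler: 'spectral', subcommand: first, args: rest };
  }
  // Not a Spectral subcommand: forward everything, flags included, to pi.
  return { handler: 'pi', args: argv };
}
```

So dispatch(['-p', 'summarize this repo']) routes to pi, while dispatch(['serve', '--machine-name', 'box']) is handled by Spectral.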

How it works

   ┌────────────────┐         ┌─────────────────┐         ┌────────────────┐
   │  Browser tab   │────────▶│  Aexol backend  │◀────────│ spectral serve │
   │ (Aexol web UI) │  WSS    │     (relay)     │   WSS   │ (your machine) │
   └────────────────┘         └─────────────────┘         └────────────────┘
                                       │
                              identity + routing only
                              (no message content, no code)
  • Your machine runs spectral serve and registers with the relay using a machine JWT issued at first run.
  • Browser sessions for that machine open a WebSocket to the backend. The backend forwards every frame to your machine and back — it never reads or stores message content.
  • All your local state — projects, sessions, messages, pi auth tokens — lives on the device. The backend only knows machine identity (id, display name, hostname, last-seen) and team membership.
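Per the bullets above, the backend's view of a machine is limited to identity and routing metadata. A hypothetical record (field names are illustrative, not the actual schema) would be on the order of:

```json
{
  "id": "m_1234",
  "displayName": "dev-laptop",
  "hostname": "dev-laptop.local",
  "lastSeen": "2026-01-15T12:00:00Z",
  "teamId": "t_5678",
  "relayJwt": "issued-at-registration"
}
```

Everything else — projects, sessions, message content — stays in the machine's local SQLite.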

How inference is routed

Spectral has two distinct execution paths, and they handle model API keys differently. This is intentional — pick the one that matches your security model.

spectral (CLI / TUI mode)            spectral serve (relay mode)
─────────────────────────            ─────────────────────────────
pi reads ~/.pi/agent/auth.json   →   pi runs in-process via PiBridge
   ↓                                    ↓
local Anthropic / OpenAI keys        ALL inference → backend `/v1` proxy
   ↓                                    ↓
direct call to provider              backend uses centralized API keys
                                        ↓
                                     scoped to team's BaseModel whitelist
  • spectral (CLI subprocess mode) — pi runs as a normal subprocess and uses whatever provider keys you've stored locally in ~/.pi/agent/auth.json (Anthropic, OpenAI, Cerebras, Google, custom OpenAI-compatible endpoints). Spectral never reads or transmits these. This is the classic local-only flow.
  • spectral serve (relay mode) — pi runs in-process inside the serve daemon. All inference traffic is proxied through the Aexol backend's /v1/messages and /v1/chat/completions endpoints, authenticated with the per-machine machine JWT. The backend holds the upstream provider keys and enforces a per-team BaseModel whitelist server-side. The local ~/.pi/agent/auth.json is not read in this mode (AuthStorage.inMemory()).

Why two paths? spectral serve is designed for shared / managed machines where the team controls which models are usable and operators don't want provider keys sitting on every box. spectral (no subcommand) is the unmanaged TUI path and behaves like a vanilla pi install.
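The server-side model gate in relay mode can be sketched as a simple check (function and field names here are hypothetical; only the behavior — rejecting models outside the team's BaseModel whitelist — is described above):

```javascript
// Sketch of the per-team BaseModel whitelist check the backend applies to
// proxied /v1 inference requests in relay mode. Names are hypothetical;
// the real backend logic is not published.
function authorizeInference(teamWhitelist, requestedModel) {
  if (!teamWhitelist.includes(requestedModel)) {
    return {
      ok: false,
      status: 403,
      error: `model ${requestedModel} is not in the team whitelist`,
    };
  }
  return { ok: true };
}
```

Because this check runs on the backend, individual machines cannot widen their own model access — exactly the property you want on shared or managed boxes.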

Configuration

| Path / variable           | Purpose                                                              |
|---------------------------|----------------------------------------------------------------------|
| ~/.spectral/config.json   | Aexol MCP URL + team API key. Created by spectral login. Mode 0600.  |
| ~/.spectral/machine.json  | Machine identity + relay JWT. Created on first spectral serve.       |
| ~/.spectral/sessions.db   | Local SQLite for projects, sessions, messages.                       |
| SPECTRAL_CONFIG_DIR       | Override the directory above.                                        |
| SPECTRAL_MCP_URL          | Override the MCP URL at login time.                                  |
| SPECTRAL_BACKEND_URL      | Override the backend HTTP base for spectral serve.                   |
| SPECTRAL_RELAY_URL        | Override the derived relay WebSocket URL.                            |

Pi's own auth state for the local TUI path (Anthropic, OpenAI, etc.) lives in ~/.pi/agent/auth.json on the same machine. Spectral never reads it and never sends it anywhere. Note that spectral serve does not use this file — it routes inference through the backend proxy instead (see How inference is routed).
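SPECTRAL_RELAY_URL overrides a relay URL that is otherwise derived from the backend URL. The actual derivation rule is not documented; a plausible sketch is a scheme swap (the "/relay" path below is purely an assumption):

```javascript
// Hypothetical sketch of deriving the relay WebSocket URL from the backend
// HTTP base. The real rule is undocumented; https→wss / http→ws and the
// "/relay" path are assumptions.
function deriveRelayUrl(backendUrl) {
  const u = new URL(backendUrl);
  u.protocol = u.protocol === 'https:' ? 'wss:' : 'ws:';
  u.pathname = '/relay';
  return u.toString();
}
```

If your deployment terminates WebSockets on a different host or path, SPECTRAL_RELAY_URL exists precisely to bypass this derivation.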

Multiple machines

You can run spectral serve on as many machines as you like under one team — each gets its own machine identity and its own SQLite. The browser picker lists all of them; switching machines shows that machine's project list and session history. Switching is a hard context change: the previous selection is cleared so you don't accidentally talk to the wrong device.

Troubleshooting

  • My machine isn't showing up in the browser picker. Make sure spectral serve is still running (it logs reconnect attempts to stderr). If spectral login was run a long time ago and the team key was rotated, re-run spectral login.

  • WebSocket keeps disconnecting. The relay client reconnects automatically with exponential backoff. Brief network blips are expected and handled. If the backoff loop is constant, check that your team API key is still valid and that your network allows outbound WebSocket connections to the configured backend.
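The reconnect behavior described above (automatic, exponential backoff) amounts to a capped delay schedule; the base and cap values below are illustrative assumptions, not Spectral's actual constants:

```javascript
// Sketch of exponential backoff with a cap, as used for relay reconnects.
// baseMs / maxMs are illustrative; Spectral's real constants are not
// published.
function backoffDelay(attempt, baseMs = 1000, maxMs = 60000) {
  // attempt 0 → 1s, 1 → 2s, 2 → 4s, … capped at maxMs
  return Math.min(baseMs * 2 ** attempt, maxMs);
}
```

Production reconnect loops usually also add random jitter to each delay so that a fleet of machines doesn't stampede the relay after a shared outage.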

  • better-sqlite3 errors on first spectral serve. This usually means the native module didn't compile during install. Try cd ~/.spectral && npm rebuild better-sqlite3, or reinstall Spectral after ensuring you have a working C/C++ toolchain (make, a C compiler, Python).

  • I want to revoke a machine. Stop spectral serve on that device. Machine revocation from the Aexol UI is on the roadmap; today the most reliable approach is to rotate the team API key, which invalidates every machine's relay JWT for that team.

Privacy & data

  • Model API keys:
    • For spectral (CLI / TUI mode): live ONLY on the machine, in pi's own ~/.pi/agent/auth.json. Never read or transmitted by Spectral.
    • For spectral serve (relay mode): live on the backend, not the machine. The local machine holds only its machine JWT; provider keys are managed centrally and scoped to the team's BaseModel whitelist.
  • Code, messages, file contents, generated artifacts live ONLY on the machine, in ~/.spectral/sessions.db and the working directory you point spectral serve at.
  • The backend stores: machine identity (id, display name, hostname, last-seen timestamps), the relay JWT issued at registration, and team membership. For spectral serve, it also holds the centralized provider API keys used to fulfil inference requests on behalf of authorized machines.
  • The backend does not store: prompts, responses, tool calls, files, artifacts, or any other message-channel content.

License

MIT — see LICENSE.

Links

  • Website: https://aexol.com
  • Source is currently hosted in an internal Aexol repository; public mirror TBD.
  • Issues: please file them with your Aexol contact (a public issue tracker is not yet available).