
@aria_asi/cli

v0.2.40


Aria Smart CLI — the world's first harness-powered terminal companion


@aria_asi/cli

Aria Rosewood — the harness-bound AI partner CLI.

One command installs smart hooks, Coach governance, and runtime enforcement across all your coding agents: Claude Code, OpenAI Codex, and OpenCode.

npm install -g @aria_asi/cli
aria setup      # interactive onboarding
aria connect    # wire all CLIs

What It Does

Aria wraps your existing LLM-powered tools with a cognitive harness — a local runtime that enforces truth, quality, and safety gates before destructive actions or unverified claims leave your machine.

  • Coach Kernel — deterministic gatekeeper blocks rm -rf, sudo, kubectl apply, git push --force, and 20+ other destructive/deploy patterns unless you provide verification evidence
  • 8-Lens Cognition — multi-perspective reasoning contract (truth, harm, trust, power, reflection, context, impact, beauty) injected into every managed LLM turn
  • Mizan Output Gate — validates every assistant response for cognition substance, evidence grounding, and completion-claim integrity before release
  • Client-Agnostic — same governance across Claude Code, Codex (GPT-5.5), and OpenCode (DeepSeek)
  • Offline-First — full local runtime at :4319; no cloud dependency for core enforcement

Quick Start

1. Install

npm install -g @aria_asi/cli

2. Onboard

aria setup

The wizard asks for your LLM provider and API key, validates them, and wires everything. If the Aria Soul backend is unreachable, a local-only fallback generates an offline license and connects your CLIs immediately.

3. Connect

aria connect

Installs hooks on Claude Code, Codex, and OpenCode. Starts the mounted runtime and enforcement bridge.

4. Open a new session

Open a fresh Claude Code or Codex window. Your first tool call will trigger the Coach gate. Destructive actions without cognition will be blocked.

Supported CLIs

| CLI | Status | Governance |
|-----|--------|------------|
| Claude Code | Wired | Coach + 8-lens cognition + Mizan output gate |
| OpenAI Codex (GPT-5.5) | Wired | Coach + runtime proxy via OPENAI_BASE_URL |
| OpenCode (DeepSeek) | Wired | Coach + harness context injection |
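The Codex row relies on redirecting API traffic through the local runtime via OPENAI_BASE_URL. A minimal sketch of what that looks like from a client's side; only the :4319 port is documented in this README, so the `/v1` path is an assumption:

```python
import os

# Point any OpenAI-compatible client at the local Aria runtime proxy
# instead of the provider's API. The "/v1" path is an assumption; the
# README documents only the runtime port (:4319).
os.environ["OPENAI_BASE_URL"] = "http://127.0.0.1:4319/v1"

def resolve_base_url() -> str:
    # Clients that honor OPENAI_BASE_URL will now send every request
    # through the Coach-governed proxy before it reaches the provider.
    return os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
```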

Coach Enforcement

The Coach Kernel evaluates every tool call and output claim:

| Condition | Decision |
|-----------|----------|
| Destructive (rm -rf, sudo, kill -9, git push --force, chmod 777, drop table, docker rm -f, --no-verify, --no-gpg-sign) | hard_block |
| Deploy without verify (kubectl apply, helm upgrade, terraform apply, docker push) | hard_block |
| Deploy with verify evidence | allow |
| Missing cognition on non-trivial output | repair_once |
| Missing applied cognition contract | repair_once |
| Unsupported completion/verification claim | repair_once |
| Secret/credential exposure | hard_block |
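The decision table reads as a small deterministic function. A hedged Python sketch of that shape; the real Coach Kernel is proprietary, so the pattern lists below are abbreviated and the names are hypothetical:

```python
import re

# Illustrative sketch of the Coach decision table. Patterns are
# abbreviated from the README's examples; the actual kernel's rule
# set and matching logic are not documented here.
DESTRUCTIVE = [r"rm\s+-rf", r"\bsudo\b", r"kill\s+-9", r"git\s+push\s+--force",
               r"chmod\s+777", r"drop\s+table", r"docker\s+rm\s+-f",
               r"--no-verify", r"--no-gpg-sign"]
DEPLOY = [r"kubectl\s+apply", r"helm\s+upgrade", r"terraform\s+apply",
          r"docker\s+push"]

def coach_decision(command: str, verify_evidence: bool = False) -> str:
    cmd = command.lower()
    if any(re.search(p, cmd) for p in DESTRUCTIVE):
        return "hard_block"          # destructive: always blocked
    if any(re.search(p, cmd) for p in DEPLOY):
        # deploys pass only with verification evidence attached
        return "allow" if verify_evidence else "hard_block"
    return "allow"
```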

Architecture

npm install -g @aria_asi/cli
        │
        ▼
   aria connect          ← one command wires everything
        │
   ┌────┼────┬──────────┐
   ▼    ▼    ▼          ▼
 Claude Codex OpenCode Cursor
   │    │     │
   └────┼─────┘
        ▼
 Mounted Runtime (:4319)   ← local sidecar
   ├─ Coach Kernel          ← deterministic gatekeeper
   ├─ Harness Daemon (:8790)← packet + skill injector
   ├─ Qdrant Memory (:6333) ← local garden persistence
   └─ Provider Proxy        ← OpenAI/Anthropic API routing

Local vs Full Mode

| Feature | Local (aria connect --local) | Full (aria login <token>) |
|---------|------------------------------|---------------------------|
| Coach governance | Yes | Yes |
| Hook enforcement | Yes | Yes |
| Runtime + provider proxy | Yes | Yes |
| Local memory (Qdrant) | Yes | Yes |
| Harness packet context | Yes | Yes |
| Hive session continuity | No | Yes |
| Garden memory sync | No | Yes |
| Forge synthesis | No | Yes |
| License upgrade path | Yes → aria login | N/A |

BYOK Model

You bring your own API keys. Aria never touches your LLM credentials — they stay in your local ~/.aria/ config. Your LLM provider bills you directly. Aria charges nothing for local enforcement.

Commands

aria setup              Interactive onboarding (offline-capable)
aria connect            Wire all CLIs with Coach enforcement
aria connect --local    Local-only setup (no server required)
aria connect --force    Reinstall hooks even if unchanged
aria status             Show owner mode status
aria runtime start      Start the runtime in foreground
aria runtime restart    Restart the mounted runtime service
aria login <token>      Activate full license from Aria Soul
aria github connect     Optional GitHub integration

Troubleshooting

aria status                  # "Owner mode active" = everything works
curl http://127.0.0.1:4319/health  # Runtime health check
curl http://127.0.0.1:4319/coach/state  # Coach state summary
systemctl --user status aria-mounted-runtime  # Systemd status

Privacy

  • Your API keys stay local in ~/.aria/ (mode 0600)
  • Coach events record only hashed text previews, not full content
  • No telemetry leaves your machine without explicit opt-in
  • Local Qdrant memory is on-disk at ~/.aria/qdrant/
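The "hashed text previews" bullet can be sketched in a few lines; the actual hashing scheme and preview length are not documented here, so both are assumptions:

```python
import hashlib

def event_preview_hash(text: str, preview_len: int = 80) -> str:
    # Record only a hash of a short preview, never the full content.
    # SHA-256 and the 80-char truncation are assumptions for illustration.
    preview = text[:preview_len]
    return hashlib.sha256(preview.encode("utf-8")).hexdigest()
```

Because only the truncated preview is hashed, two events whose content agrees on the preview window produce the same record, and the original text is not recoverable from the stored hash.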

License

Proprietary. See CLIENT-ONBOARDING.md for full terms.