@inference/cli (v0.0.6)

Command-line interface for Inference.net: manage training runs, evals, datasets, and inferences from your terminal.

Installation

Via npm (recommended):

npm install -g @inference/cli

Via npx (no install):

npx @inference/cli --help

Via Bun:

bunx @inference/cli --help

Supported Platforms

| Platform | Architecture          | Package                     |
| -------- | --------------------- | --------------------------- |
| macOS    | Apple Silicon (arm64) | @inference/cli-darwin-arm64 |
| macOS    | Intel (x64)           | @inference/cli-darwin-x64   |
| Linux    | x64                   | @inference/cli-linux-x64    |
| Linux    | arm64                 | @inference/cli-linux-arm64  |

The correct binary is installed automatically based on your platform.
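Per-platform binary selection like this is commonly wired up with npm `optionalDependencies`, where each platform package declares `os`/`cpu` constraints so npm only installs the one that matches. A sketch of what that wiring might look like here (the package names come from the table above; the version pins and field layout are assumptions, not the package's actual manifest):

```json
{
  "name": "@inference/cli",
  "version": "0.0.6",
  "optionalDependencies": {
    "@inference/cli-darwin-arm64": "0.0.6",
    "@inference/cli-darwin-x64": "0.0.6",
    "@inference/cli-linux-x64": "0.0.6",
    "@inference/cli-linux-arm64": "0.0.6"
  }
}
```

Each platform package would then declare, for example, `"os": ["darwin"], "cpu": ["arm64"]` so npm skips it on non-matching machines.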

Quick Start

# Sign in via browser
inf auth login

# List your projects
inf project list

# Select a project
inf project switch <project-id>

# Instrument a codebase with observability (run from your project directory)
inf install

# View recent training runs
inf training list

# Launch interactive dashboard
inf dashboard

Commands

Authentication

inf auth login          # Sign in via browser (device authorization flow)
inf auth logout         # Sign out and clear stored credentials
inf auth status         # Show current authentication status
inf auth set-key <key>  # Set an API key for CI/headless use (sk-observability-*)

Projects

inf project list        # List all projects
inf project switch <id> # Set the active project
inf project current     # Show the currently active project

Install (Automated Observability Setup)

inf install             # Instrument your codebase with Catalyst observability
inf install --dry-run   # Preview changes without modifying files

Automatically instruments your codebase to route LLM API calls through the Inference.net observability proxy. The command:

  1. Confirms your active project and API key
  2. Detects available AI coding agents (Claude Code, OpenCode, or Codex)
  3. Fetches the latest instrumentation skill from inference.net
  4. Launches the selected agent to scan and modify your code

Run this from the root of the project you want to instrument.

Training

inf training list              # List training runs
inf training get <id>          # Get details of a training run
inf training logs <id>         # View training logs (-f to follow)
inf training poll <id>         # Poll status until complete

Evals

inf eval list                  # List eval run groups
inf eval get <id>              # Get eval run group details
inf eval definitions           # List eval definitions
inf eval datasets              # List eval datasets

Datasets

inf dataset list               # List filtered datasets
inf dataset get <id>           # Get dataset details
inf dataset download <id>      # Download a dataset (-o output path)

Inferences

inf inference list             # List recent inferences
inf inference get <id>         # Get full inference details

Dashboard

inf dashboard                  # Launch interactive TUI dashboard

The dashboard provides a tabbed interface for browsing training runs, evals, datasets, and inferences. Use 1-4 to switch tabs, j/k to navigate, Enter to drill down, r to refresh, and q to quit.

Global Options

| Flag               | Description                                   |
| ------------------ | --------------------------------------------- |
| --json             | Output results as JSON (useful for scripting) |
| -v, --verbose      | Enable verbose debug output                   |
| -p, --project <id> | Override the active project for this command  |
| --version          | Show the CLI version                          |
| --help             | Show help                                     |
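The `--json` flag makes output easy to consume from scripts. A minimal Python sketch of that pattern (the `id` and `status` fields in the sample payload are hypothetical; the real output schema may differ):

```python
import json

def training_run_ids(json_text: str) -> list[str]:
    """Extract run IDs from `inf training list --json` output.

    Assumes the output is a JSON array of objects with an "id" field.
    """
    return [run["id"] for run in json.loads(json_text)]

# In practice json_text would come from the CLI, e.g. via
# subprocess.run(["inf", "training", "list", "--json"], ...).stdout.
# A stand-in payload shows the assumed shape:
sample = '[{"id": "run_1", "status": "completed"}, {"id": "run_2", "status": "running"}]'
print(training_run_ids(sample))  # → ['run_1', 'run_2']
```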

Configuration

Config File

Stored at ~/.inf/config.json. Managed automatically by inf auth commands.

Environment Variables

| Variable       | Description                   | Default                                 |
| -------------- | ----------------------------- | --------------------------------------- |
| INF_API_URL    | LLM Ops API base URL          | https://observability-api.inference.net |
| INF_AUTH_URL   | Auth server base URL          | Derived from API URL                    |
| INF_UI_URL     | Web app URL (for device auth) | Derived from API URL                    |
| INF_API_KEY    | API key for headless/CI auth  |                                         |
| INF_PROJECT_ID | Override active project ID    |                                         |

Priority Order

Authentication resolves in this order:

  1. INF_API_KEY environment variable
  2. API key in config file (inf auth set-key)
  3. Session token in config file (inf auth login)

API URL resolves in this order:

  1. INF_API_URL environment variable
  2. apiUrl in config file
  3. Default: https://observability-api.inference.net
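Both precedence lists above are first-match lookups, which can be sketched in a few lines of Python (`apiUrl` is the config key named in this README; `apiKey` and `sessionToken` are assumed names for the stored credentials):

```python
DEFAULT_API_URL = "https://observability-api.inference.net"

def resolve_api_key(env: dict, config: dict):
    """Credential: env var, then stored API key, then session token."""
    return (env.get("INF_API_KEY")
            or config.get("apiKey")
            or config.get("sessionToken"))

def resolve_api_url(env: dict, config: dict) -> str:
    """API URL: env var, then config apiUrl, then the default."""
    return env.get("INF_API_URL") or config.get("apiUrl") or DEFAULT_API_URL

print(resolve_api_url({}, {}))  # → https://observability-api.inference.net
print(resolve_api_url({"INF_API_URL": "https://staging.example"}, {"apiUrl": "x"}))
```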

CI / Headless Usage

For CI pipelines and automated environments where browser-based login is not possible:

# Set your API key (obtain from Inference.net dashboard under project settings)
export INF_API_KEY=sk-observability-...

# Or persist it in the config file
inf auth set-key sk-observability-...

# Set the project
export INF_PROJECT_ID=your-project-id

# Now all commands work without interactive login
inf training list --json
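Wired into a CI workflow, the same pattern might look like this (a hedged GitHub Actions sketch; the secret names are placeholders you would define in your own repository, and the workflow syntax is standard GitHub Actions, not anything specific to this CLI):

```yaml
jobs:
  training-report:
    runs-on: ubuntu-latest
    env:
      INF_API_KEY: ${{ secrets.INF_API_KEY }}        # sk-observability-* key
      INF_PROJECT_ID: ${{ secrets.INF_PROJECT_ID }}
    steps:
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm install -g @inference/cli
      - run: inf training list --json
```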

License

MIT