inferrlm

v1.0.5

Terminal client for interacting with the InferrLM local server REST API

InferrLM CLI

This is an example application that demonstrates how to build apps using the InferrLM REST API. It showcases integration patterns for chat streaming, model management, and server communication.

InferrLM CLI is the terminal companion to the InferrLM mobile app. It connects directly to your InferrLM device server so you can chat with on-device or remote models from any computer while keeping the same streaming experience and conversation controls.

Features

Terminal Chat Experience

  • Interactive onboarding detects your InferrLM server and lists every available model.
  • Conversation history stays in session so you can scroll and review past exchanges without leaving the terminal.
  • Keyboard-driven UI keeps input and response panes focused on speed and clarity.

Streaming and Controls

  • Real-time streaming mirrors the InferrLM app, rendering tokens as soon as they arrive.
  • Retry, stop, and switch-model actions are exposed through key prompts for quick iteration.
  • Output formatting highlights code blocks with syntax coloring and preserves markdown structure.

Server Integration

  • Uses the same REST APIs as the mobile app, including /api/chat and /api/tags (see the sketch after this list).
  • Automatically adapts to whatever models you have downloaded or exposed through the InferrLM server.
  • Falls back gracefully when the device is unreachable, surfacing actionable errors.
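
For a sense of the wire format, the sketch below lists models over /api/tags and streams a chat reply from /api/chat using Node 20's built-in fetch. The request and response field names (models, message.content) are assumptions modeled on Ollama-style APIs, not confirmed InferrLM types; check your server if they differ.

// Sketch only: payload shapes below are assumptions, not confirmed types.
const server = process.env.INFERRLM_SERVER_URL ?? "http://192.168.1.88:8889";

// Discover available models via /api/tags.
const tagsRes = await fetch(`${server}/api/tags`);
const {models} = (await tagsRes.json()) as {models: {name: string}[]};
if (models.length === 0) throw new Error("No models available on the server");
console.log("Models:", models.map((m) => m.name).join(", "));

// Stream a chat reply via /api/chat, printing tokens as they arrive.
const chatRes = await fetch(`${server}/api/chat`, {
  method: "POST",
  headers: {"Content-Type": "application/json"},
  body: JSON.stringify({
    model: models[0].name,
    messages: [{role: "user", content: "Hello!"}],
    stream: true,
  }),
});

// Assumes newline-delimited JSON chunks; buffer partial lines across reads.
const decoder = new TextDecoder();
let buffer = "";
for await (const chunk of chatRes.body!) {
  buffer += decoder.decode(chunk, {stream: true});
  let newline;
  while ((newline = buffer.indexOf("\n")) >= 0) {
    const line = buffer.slice(0, newline).trim();
    buffer = buffer.slice(newline + 1);
    if (!line) continue;
    const event = JSON.parse(line) as {message?: {content?: string}};
    process.stdout.write(event.message?.content ?? "");
  }
}

Run against a live server, this should print the model list and then the streamed reply, mirroring what the CLI renders in its chat pane.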

Getting Started

Prerequisites

  • Node.js 20 or newer
  • A running InferrLM server on your phone or tablet (Server tab inside the app)
  • Network connectivity between your computer and the device (e.g., the same Wi-Fi network)

Installation

Install the CLI globally using npm:

npm install -g inferrlm

Or use npx to run it without installation:

npx inferrlm

Running the CLI

  1. Start your InferrLM server from the mobile app (Server tab).
  2. Run the CLI:
inferrlm
  3. Follow the guided setup to connect to your server.

When prompted, paste the server URL (for example http://192.168.1.88:8889), choose a model, and begin chatting. Responses will stream token-by-token until completion or until you stop the generation.

Development Setup

If you want to contribute or run from source:

  1. Clone or download the repository.
  2. Move into the CLI workspace and install dependencies.
  3. Build the distributable bundle.
cd inferra-cli
npm install
npm run build

Then run locally:

npm start

Configuration

  • Server URL and model choice are stored only for the active session, mirroring the privacy posture of the InferrLM app.
  • Model discovery happens automatically by calling /api/tags, so the CLI stays up to date with whatever the mobile app exposes.
  • Environment variables are not required, but you can set INFERRLM_SERVER_URL to skip the onboarding prompt, as shown below.
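
For example, assuming your server is reachable at the address shown in the app:

INFERRLM_SERVER_URL=http://192.168.1.88:8889 inferrlm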

Usage Tips

  • Press Enter to send the current prompt; the stream renders inline underneath your input.
  • Use the provided key shortcuts (displayed in the footer) to stop streaming or retry with the same context.
  • Copy any response text directly from the terminal; syntax highlighting stays intact thanks to highlight.js.

Development

The CLI is implemented with React, Ink, and TypeScript. esbuild handles bundling for fast builds, and Vitest powers the automated tests.

# Continuous rebuilds
npm run dev

# Run the test suite
npm test

# Check types and linting
npm run typecheck
npm run lint

Source code layout:

  • src/index.ts bootstraps the Ink tree and command routing.
  • src/core hosts the REST client, streaming parser, and persistence helpers.
  • src/ui contains composable Ink components for the chat transcript, input box, and status footer (sketched below).
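
To give a feel for the src/ui style, here is a minimal, hypothetical Ink component; the component name, props, and model string are illustrative, not the package's actual exports.

// Hypothetical example in the style of src/ui; not actual package code.
import React from "react";
import {Box, Text, render} from "ink";

type FooterProps = {model: string; streaming: boolean};

// A status footer showing the active model and whether a reply is streaming.
function StatusFooter({model, streaming}: FooterProps) {
  return (
    <Box justifyContent="space-between" paddingX={1}>
      <Text dimColor>{model}</Text>
      <Text color={streaming ? "yellow" : "green"}>
        {streaming ? "streaming…" : "ready"}
      </Text>
    </Box>
  );
}

render(<StatusFooter model="example-model" streaming={false} />);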

Troubleshooting

  • Cannot connect: Ensure the InferrLM app shows the same IP address you are entering and that both devices share the network.
  • Empty model list: Download at least one model inside the mobile app; the CLI only lists what /api/tags returns.
  • Interrupted streaming: Weak Wi-Fi can drop HTTP streams. Retry closer to the router or switch bands.

Contributing

Contributions follow the same workflow as the main InferrLM app: open an issue, wait for assignment, then submit a PR that includes tests and lint fixes where applicable. Keep components focused, avoid unnecessary dependencies, and follow the repo's TypeScript guidelines.

License

InferrLM CLI is released under the MIT License. See the root LICENSE file for the full text.

Tech Stack

  • React + Ink for terminal rendering
  • TypeScript with strict typings
  • Undici for HTTP streaming
  • esbuild for bundling
  • Vitest for automated testing