portable-ai


Install and run a fully offline AI chatbot from a USB drive. No internet needed after setup. No data leaves the USB. Truly portable: one USB runs on both Windows and macOS.


Quick Start

npx portable-ai install

That's it. The CLI will:

  1. Detect your USB drive
  2. Let you pick an AI model and chat interface
  3. Download the AI engine and chat app for both Windows and macOS (~6-8 GB + ~300 MB for cross-OS support)
  4. Generate standalone launcher scripts for both OSes

After setup, just double-click start-ai.bat (Windows) or start-ai.command (Mac) on the USB. Works on any Windows or macOS machine. No Node.js required on the target machines.

The first launch on the "other" OS (the one you didn't install from) runs a brief one-time setup that copies the chat app onto the USB — no internet, no re-download. Every launch after that is instant.
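The cross-OS trick works because each launcher resolves everything relative to the USB root, so the stick behaves the same wherever it mounts. A Mac-side launcher would do roughly the following (an illustrative sketch, not the generated `start-ai.command`):

```shell
# Sketch: resolve paths relative to wherever the USB mounted, so the
# same stick works on any machine. (Illustrative only; the generated
# start-ai.command script is more involved.)
USB_ROOT="$(cd "$(dirname "$0")" && pwd)"   # directory this script lives in
ENGINE="$USB_ROOT/engine/mac"               # Mac launcher uses the mac engine
MODELS="$USB_ROOT/models"                   # model files are shared across OSes
echo "engine: $ENGINE"
echo "models: $MODELS"
```

The Windows `.bat` launcher would point at `engine/windows` in the same way.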

Commands

| Command | Description |
|---------|-------------|
| npx portable-ai install | Set up AI on a USB drive |
| npx portable-ai start | Start the AI from a USB drive |
| npx portable-ai status | Show what's installed |
| npx portable-ai update | Update Ollama engine to latest |

Options

# Specify USB path manually instead of auto-detect
npx portable-ai install --target E:
npx portable-ai start --path E:

# On macOS, pass the mounted volume path ("MYUSB" is a placeholder name)
npx portable-ai install --target /Volumes/MYUSB

Available Models

| Model | Size | Description |
|-------|------|-------------|
| Dolphin 2.9 Llama 3 8B | 5.7 GB | Uncensored, unbiased |
| Llama 3 8B Instruct | 4.9 GB | Meta's general-purpose model |
| Mistral 7B Instruct | 4.1 GB | Fast, high-quality responses |
| Phi-3 Mini 3.8B | 2.2 GB | Lightweight, great for low-spec hardware |

Chat Interface

  • AnythingLLM — Desktop app with a polished UI

What You Need

  • A USB drive with at least 16 GB free (32 GB recommended)
  • Format the USB as exFAT for cross-platform compatibility
  • An internet connection for the one-time setup
  • Node.js 18+ on the machine running the install (not needed after)
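To confirm the Node.js 18+ requirement before installing, you can check the major version number. The snippet below is a generic version-check sketch; the sample string stands in for the output of `node --version` on your machine:

```shell
# Sketch: verify the installing machine has Node.js 18 or newer.
# VERSION is a sample string here; on a real machine use: VERSION=$(node --version)
VERSION="v18.19.0"
MAJOR=${VERSION#v}      # strip the leading "v"  -> 18.19.0
MAJOR=${MAJOR%%.*}      # keep the major number  -> 18
if [ "$MAJOR" -ge 18 ]; then
  echo "Node.js $VERSION is new enough to run the installer"
else
  echo "Node.js $VERSION is too old; install 18 or newer"
fi
```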

USB Drive Structure

After installation, your USB will contain:

USB Drive/
├── portable-ai.json         # Installation manifest
├── start-ai.bat             # Windows launcher (standalone)
├── start-ai.command         # Mac launcher (standalone)
├── engine/
│   ├── windows/             # Ollama for Windows (extracted)
│   └── mac/                 # Ollama for macOS (extracted)
├── models/                  # AI model files (consumed by Ollama)
├── data/                    # Ollama runtime data (shared across OSes)
├── ui/
│   ├── windows/             # AnythingLLM for Windows
│   │   └── installer/       # Bundled installer if host was Mac — first-run only
│   └── mac/                 # AnythingLLM for macOS
│       └── installer/       # Bundled installer if host was Windows — first-run only
└── ui_data/                 # Your chats & settings (shared across OSes)
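For orientation, the manifest records what was installed so that `status` and `start` can find it. The exact schema is internal to the tool, so the field names below are purely hypothetical, shown only to convey the idea:

```shell
# Hypothetical illustration only: the field names below are invented and
# NOT the real portable-ai.json schema.
cat <<'EOF' > /tmp/portable-ai-example.json
{
  "version": "1.0.0",
  "model": "phi3:mini",
  "platforms": ["windows", "mac"]
}
EOF
cat /tmp/portable-ai-example.json
```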

Features

  • Truly portable — One install, runs on both Windows and macOS. No re-download when switching machines.
  • Resumable downloads — Interrupted? Just re-run. It picks up where it left off.
  • Idempotent installs — Safe to run multiple times. Skips what's already done.
  • Auto USB detection — Finds removable drives automatically.
  • Standalone launchers — Generated .bat and .command files work without Node.js.
  • Zero footprint — Nothing installed on the host computer.
  • Fully offline — After setup, no internet required.
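The resumable and idempotent behavior described above boils down to a common pattern: mark each completed step, skip marked steps on re-run, and resume partial downloads (e.g. with `curl -C -`). A generic sketch of that pattern, not portable-ai's actual code:

```shell
# Generic sketch of idempotent, re-runnable steps (not portable-ai internals).
# Each completed step leaves a ".done" marker; re-running skips finished work.
# Partial downloads would be resumed separately, e.g.: curl -C - -o file URL
WORKDIR=$(mktemp -d)

run_step() {
  step="$1"
  if [ -f "$WORKDIR/$step.done" ]; then
    echo "skip $step"
  else
    echo "run $step"
    touch "$WORKDIR/$step.done"
  fi
}

run_step download-engine   # first run: executes the step
run_step download-engine   # second run: skipped, already done
```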

Not Technical? No Problem

If you're not comfortable with npm or the terminal, follow the step-by-step guide for beginners — it walks you through everything with screenshots and plain language.

Privacy

  • All AI processing happens locally on the USB
  • No data is sent to the cloud
  • No telemetry, tracking, or analytics
  • Chats and settings stay on the USB drive

Troubleshooting

No USB drives detected? Use the --target flag to specify the path manually: npx portable-ai install --target E:

Download interrupted? Just run the install command again. It resumes from where it stopped.

Slow responses? The AI runs on your CPU. Pick the Phi-3 Mini model for faster responses on lower-spec machines. If you have a GPU, Ollama will use it automatically.

First launch is slow? Normal — the model needs to load into memory. Subsequent prompts are faster.

Contributing

PRs are welcome. Please open an issue first to discuss what you'd like to change.

License

MIT