
explain-me v1.0.8

explain-me

🧠🚀 A local CLI tool to explain source code using LLaMA models powered by llama.cpp.


What is explain-me?

explain-me is a developer-friendly command-line tool that helps you understand and explain source code by running lightweight, open-source LLMs (Large Language Models) entirely on your own machine. No cloud, no data leaks — just fast, private, and powerful code explanations right from your terminal.


Why use explain-me?

  • Privacy first: All code and model processing happens locally. Nothing is sent anywhere.
  • Lightweight: Uses efficient LLaMA-compatible GGUF models via the blazing-fast llama-cli.
  • Customizable: Supply your own prompt to tailor explanations to your needs.
  • Zero dependencies: Bundled with a precompiled llama-cli binary for macOS ARM64 (Apple Silicon).
  • Multi-language: Works on any source code — Go, Python, JavaScript, TypeScript, JSON, and more.
  • Easy to use: Simple CLI interface designed for developers, powered by Go.
  • Open source: Contributions and improvements are welcome!

Current Status

  • ✅ Supports macOS ARM64 (Apple Silicon) with embedded llama-cli binary.
  • 🚧 Linux and Windows support coming soon (requires cross-compiling llama-cli).
  • Requires a local LLaMA-compatible model in GGUF format (see below).

Prerequisites

Before running explain-me, you need:

  • A LLaMA-compatible GGUF model (such as Mistral 7B GGUF)
  • The environment variable MODEL_PATH set to the absolute path of your model file:

export MODEL_PATH=/path/to/your/model.gguf
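Since a missing or mistyped MODEL_PATH is the easiest setup mistake to make, a small pre-flight check can save a failed model load. This is just a sketch, not part of explain-me itself; the check_model function name is made up for illustration:

```shell
# Hypothetical pre-flight check (not part of explain-me):
# verify MODEL_PATH is set and points to an existing file.
check_model() {
  if [ -z "${MODEL_PATH:-}" ]; then
    echo "MODEL_PATH is not set" >&2
    return 1
  fi
  if [ ! -f "$MODEL_PATH" ]; then
    echo "No model file at: $MODEL_PATH" >&2
    return 1
  fi
  echo "Using model: $MODEL_PATH"
}
```

Run check_model before invoking the CLI; a non-zero return means the model path needs fixing.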

Usage

# Analyze a single source file
npx explain-me -f ./path/to/file.go

# Analyze all files in a directory (non-recursive)
npx explain-me -d ./path/to/directory

# Use a custom prompt to tailor the explanation
npx explain-me -f ./path/to/file.py --prompt "Summarize this function briefly:"

# Summarize code with the built-in summary mode
npx explain-me -f ./path/to/file.js --summary

# Check code for bugs and bad practices
npx explain-me -f ./path/to/file.ts --bug-check

# Start interactive chat mode
npx explain-me --chat-mode
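Because -d only scans a single directory level, a shell loop over find can cover a whole tree. The sketch below is one way you might wire that up, not a built-in feature; EXPLAIN_CMD is a stand-in that defaults to echo for a dry run, and you would set it to "npx explain-me -f" for a real pass:

```shell
# Sketch: recursive analysis via find (the -d flag itself is non-recursive).
# EXPLAIN_CMD is a stand-in; set it to "npx explain-me -f" for real runs.
analyze_tree() {
  dir="$1"
  pattern="$2"
  find "$dir" -type f -name "$pattern" -print0 |
    while IFS= read -r -d '' file; do
      ${EXPLAIN_CMD:-echo} "$file"
    done
}
```

For example, analyze_tree ./src '*.go' would feed every Go file under ./src to the tool, one -f invocation per file.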

What happens if you run without arguments?

If you run npx explain-me without specifying a file path, the CLI will:

  • Display an error message explaining that a file path is required.
  • Show usage instructions and the available options.
  • Exit without running the model.

$ npx explain-me

❌ Usage: explain-me -f <file_path> OR -d <directory_path> [--prompt "custom prompt"] [--summary] [--bug-check] [--chat-mode]

Usage: explain-me <file_path> [options]

Options:
  -prompt string
        Custom prompt for the model
  -h, --help
        Show help message