
UsefulLM CLI

Expedite your workflows when working with LLMs and enable a more efficient, collaborative development process. Copy entire directory trees and their files into an LLM-friendly format, bulk-summarize files, and more coming soon.

Vastly speed up your work with LLMs, especially when refactoring large codebases or adding new features to existing systems.

Features

  • Copy Directory Contents (cpdir):
    Copy the contents of a directory to your clipboard in markdown format. Each file is formatted with a markdown header and code block. Binary files are automatically marked as [binary file].

  • Generate Directory Tree (dirtree):
    Generate a directory tree and copy it to your clipboard in various formats (markdown, JSON, or XML). Filter by file extensions, exclude specific paths, and control the depth of the tree.

  • Summarize Files (summ):
    Summarize the contents of a file or directory using an LLM. The summary is copied to your clipboard for easy sharing.

  • Command History (history):
    View and manage a history of your five most recent commands and their outputs. Copy specific outputs back to your clipboard for reuse.

  • Last Command Output (last):
    Quickly copy the output of your most recent command to the clipboard.

  • Configuration Management (config):
    View or update your LLM configuration, including the API URL and API key.


Installation

To install UsefulLM globally, run:

npm install -g @sktripamer/usefullm

If you want to use the summarize command (and many more commands to follow), you'll need an OpenAI-compatible LLM endpoint configured. Here's the easiest way to get started:

Download Ollama and make sure the CLI gets installed (Run the application and it should prompt you to install the CLI).

After the CLI is installed, run:

ollama pull llama3.2
ollama serve
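
Optionally, you can confirm the model downloaded correctly before moving on (this uses Ollama's own CLI):

# list the models available to your local Ollama server
ollama list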

Now you're ready to go with LLM-enabled commands! The default config matches the setup above, but you can update it to anything you want:

# Set LLM API URL
usefullm config url http://localhost:8000/v1/chat/completions

# Set LLM model (this is the one that's running on your local server or remote service)
usefullm config model llama3.3

The default, llama3.2, is a 3B-parameter model offering a good balance between performance and capability. You can download and run more powerful models; just make sure to update the configuration. The v1/chat/completions endpoint is the most thoroughly tested, so it's the best one to use.
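
For reference, an OpenAI-compatible chat completions endpoint accepts a POST request shaped roughly like the sketch below. The URL shown is Ollama's default (port 11434); substitute your own host, port, and model as needed:

# minimal OpenAI-compatible request (illustrative prompt)
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Summarize this file in two sentences: ..."}]
  }'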


Commands

1. Copy Directory Contents (cpdir)

Copy the contents of a directory to your clipboard in markdown format.

usefullm cpdir [directory] [options]

Options:

  • --all: Include all files, ignoring .gitignore patterns.

Examples:

# Copy current directory contents (respecting .gitignore)
usefullm cpdir

# Copy specific directory contents (respecting .gitignore)
usefullm cpdir /path/to/directory

# Copy all files, including those in .gitignore
usefullm cpdir --all
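
For a sense of what lands on your clipboard, the output looks roughly like the sketch below: one markdown header per file followed by its contents in a code block, with binary files replaced by a placeholder (file names, header level, and fence style here are purely illustrative):

## src/index.js

```js
console.log("hello");
```

## assets/logo.png

[binary file]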

2. Generate Directory Tree (dirtree)

Generate a directory tree and copy it to your clipboard in various formats.

usefullm dirtree [directory] [options]

Options:

  • -d, --depth <depth>: Maximum depth to traverse (default: 20).
  • -i, --ignore <patterns...>: Patterns to ignore (e.g., node_modules).
  • -a, --all: Include all files, ignoring .gitignore rules.
  • -f, --format <format>: Output format (markdown, json, or xml) (default: markdown).

Examples:

# Generate tree for current directory in markdown format
usefullm dirtree

# Generate JSON tree for specific directory
usefullm dirtree ./my-project -f json

# Generate tree excluding node_modules
usefullm dirtree -i "node_modules"

# Generate tree with maximum depth of 2
usefullm dirtree -d 2
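
These options can also be combined; for example (the path and ignore pattern here are illustrative):

# Generate an XML tree of ./src, two levels deep, ignoring build output
usefullm dirtree ./src -d 2 -f xml -i "dist"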

3. Summarize Files (summ)

Summarize the contents of a file or directory using an LLM.

usefullm summ [path] [options]

Options:

  • -t, --tokens <tokens>: Maximum tokens per summary (default: 1000).
  • -a, --all: Include all files, ignoring .gitignore rules.

Examples:

# Summarize current directory
usefullm summ

# Summarize specific file
usefullm summ /path/to/file.txt

# Summarize directory with custom token limit
usefullm summ ./my-project -t 500

4. Command History (history)

View or copy specific outputs from your command history.

usefullm history [index]

Examples:

# View full command history
usefullm history

# Copy output of specific history item to clipboard
usefullm history 2

5. Last Command Output (last)

Copy the output of your most recent command to the clipboard.

usefullm last

6. Configuration Management (config)

View or update your LLM configuration.

usefullm config [key] [value]

Examples:

# View current configuration
usefullm config

# Set LLM API URL
usefullm config url http://localhost:8000/v1/chat/completions

# Set LLM model (this is the one that's running on your local server or remote service)
usefullm config model llama3.2

# Set LLM API key
usefullm config key your-api-key

# Clear your API key
usefullm config key
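
Putting a few of these commands together, a typical session might look something like the following (paths are illustrative):

# Copy a feature directory into LLM-friendly markdown and paste it into your chat
usefullm cpdir ./src/auth

# Get a quick, shareable overview of a larger project
usefullm summ ./legacy-app -t 500

# Later, list recent commands and copy a specific output back to the clipboard
usefullm history
usefullm history 2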

Contributing

Have a common workflow in mind? Open an issue on the GitHub repository and I can get you set up. Your feedback is highly appreciated!