
dev-mate-cli

v1.0.3

A command-line tool that leverages OpenAI's Chat Completion API to document code with the assistance of AI models.

Watch the demo video for a walkthrough of the features.

Features

  • Source Code Documentation: Automatically generate comments and documentation for your source code.
  • Multiple File Processing: Handle one or multiple files in a single command.
  • Model Selection: Use the AI model of your choice with the --model flag.
  • Custom Output: Write the results to a file with the --output flag, or display them in the console.
  • Stream Output: Stream the LLM response to the command line with the --stream flag.

Installation

npm install -g dev-mate-cli

Environment Variables

dev-mate-cli needs API_KEY and BASE_URL to generate responses. These variables should be stored in a .env file in the current directory. Make sure the API_KEY and BASE_URL come from the same OpenAI-compatible completion API provider.

API_KEY=your_api_key
BASE_URL=https://api.openai.com/v1

Popular providers: OpenRouter, Groq, OpenAI.
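As a minimal sketch, the .env file above can be created straight from the shell. The API_KEY value is a placeholder to replace with your own; the BASE_URL shown is OpenAI's endpoint, so swap it if you use another provider:

```shell
# Create the .env file in the project directory.
# API_KEY is a placeholder; BASE_URL here points at OpenAI.
cat > .env <<'EOF'
API_KEY=your_api_key
BASE_URL=https://api.openai.com/v1
EOF

# Quick sanity check that both variables are present.
grep -q '^API_KEY=' .env && grep -q '^BASE_URL=' .env && echo ".env ready"
```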

Usage

Basic Usage

To run the tool, specify one or more source files or folders as input:

dev-mate-cli ./examples/file.js

For processing multiple files:

dev-mate-cli ./examples/file.js ./examples/file.cpp

For processing folders:

dev-mate-cli ./examples

Command-line Options

  • -m, --model <model-name>: Choose the AI model to use (default: google/gemma-2-9b-it:free from OpenRouter).

    dev-mate-cli file.js -m "openai/gpt-4o-mini"
  • -o, --output <output-file>: Write the output to a specified file.

    dev-mate-cli file.js -o output.js
  • -t, --temperature <value>: Set the creativity level of the AI model (default: 0.7).

    dev-mate-cli file.js -t 1.1
  • -u, --token-usage: Display token usage information.

    dev-mate-cli file.js -u
  • -s, --stream: Stream the response to the command line.

    dev-mate-cli file.js -s
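These flags can be combined in a single invocation. As a sketch, a small wrapper function (document_all is a hypothetical helper, not part of the tool) that documents every .js file in a directory with a chosen model:

```shell
# Hypothetical wrapper around dev-mate-cli: document each .js file in a
# directory, saving each result alongside the original with a suffix.
document_all() {
  dir="$1"
  model="${2:-openai/gpt-4o-mini}"   # sample model name used as a fallback
  for f in "$dir"/*.js; do
    [ -e "$f" ] || continue          # skip if the glob matched nothing
    dev-mate-cli "$f" --model "$model" --output "${f%.js}-documented.js"
  done
}
```

Invoke it as `document_all ./examples` to process a folder, or pass a second argument to override the model.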

Additional Commands

  • Check Version: To check the current version of the tool, use:
    dev-mate-cli --version
  • Help: Display the help message listing all available options:
    dev-mate-cli --help

Examples

  • Document a JavaScript file and save the result:

    dev-mate-cli ./examples/file.js --output file-documented.js --model google/gemini-flash-8b-1.5-exp
  • Process multiple files and print output to the console:

    dev-mate-cli ./examples/file.js ./examples/file.py --model google/gemini-flash-8b-1.5-exp

LLM Configuration

To configure the LLM from a file, create a dotfile named .dev-mate-cli.toml in your home directory.

Example ~/.dev-mate-cli.toml:

model = "gpt-4o"
temperature = "1"
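The config file can be written from the shell as well. This sketch points HOME at a temporary directory so nothing outside it is touched; drop that line to write the real ~/.dev-mate-cli.toml:

```shell
# Write the per-user config file. HOME is redirected to a temp directory
# in this sketch so the real home directory is left alone.
HOME="$(mktemp -d)"
cat > "$HOME/.dev-mate-cli.toml" <<'EOF'
model = "gpt-4o"
temperature = "1"
EOF

# Show what was written.
cat "$HOME/.dev-mate-cli.toml"
```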

Contributing

Contributions are welcome! If you find a bug or have an idea for an improvement, feel free to open an issue or submit a pull request. See the Contribution Guidelines for more details.

License

This project is licensed under the MIT License.