
@rafaelpujol/aicommit-cli

v1.1.1

AI-powered git commit message generator with multi-provider support (OpenAI, Claude, Ollama, vLLM, Kimi)

aicommit

🤖 AI-powered git commit message generator with multi-provider support.

Generate conventional commit messages using AI from OpenAI, Anthropic Claude, Ollama, vLLM, or Kimi.

Features

  • Multi-provider: OpenAI, Anthropic Claude, Ollama, vLLM, Kimi
  • Conventional Commits: Follows the Conventional Commits specification
  • Persistent config: Save your preferred provider and settings
  • Interactive mode: Preview, edit, or cancel before committing
  • Dry run: Test without creating commits

Installation

npm install -g @rafaelpujol/aicommit-cli

Or, for local development, link a checkout:

cd /path/to/aicommit
npm link

Quick Start

# 1. Set your default provider
aicommit config set provider vllm

# 2. Add files to staging
git add .

# 3. Generate commit message
aicommit

Usage

Basic Commands

aicommit                  # Generate commit with default provider
aicommit --dry-run        # Preview without committing
aicommit -p openai        # Use specific provider
aicommit -m gpt-4o        # Use specific model

Configuration

# Set default provider
aicommit config set provider vllm

# Set default model
aicommit config set model gpt-4o

# Set temperature (creativity: 0-1)
aicommit config set temperature 0.3

# View current config
aicommit config get

# View specific setting
aicommit config get provider

# Delete a setting
aicommit config delete model

Options

| Option | Description | Default |
|--------|-------------|---------|
| -p, --provider <name> | AI provider | config/provider or 'openai' |
| -m, --model <name> | Model name | config/model |
| -t, --temperature <number> | AI creativity (0-1) | 0.3 |
| --dry-run | Preview only, no commit | false |
| --no-edit | Skip edit confirmation | false |

Providers

OpenAI

export OPENAI_API_KEY=sk-...
aicommit -p openai -m gpt-4o

Models: gpt-4o, gpt-4-turbo, gpt-3.5-turbo

Anthropic Claude

export ANTHROPIC_API_KEY=sk-ant-...
aicommit -p anthropic -m claude-sonnet-4-20250514

Models: claude-sonnet-4-20250514, claude-3-5-sonnet-20241022, claude-3-opus-20240229

Ollama (Local)

export OLLAMA_HOST=http://localhost:11434
aicommit -p ollama -m llama3

Runs open-source models locally. Default port: 11434

Models: llama3, llama3.1, mistral, codellama, etc.

vLLM (Local)

export VLLM_HOST=http://localhost:8090
aicommit -p vllm

OpenAI-compatible local inference server. Default port: 8090

# Example: Run vLLM with Docker
docker run --gpus all -v ~/.cache/huggingface:/root/.cache/huggingface \
  -p 8090:8000 \
  --env "HUGGING_FACE_HUB_TOKEN=hf_..." \
  vllm/vllm-openai:latest \
  --model meta-llama/Llama-3.3-70B-Instruct
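Because vLLM speaks the same chat-completions schema as OpenAI, a client only needs to build a standard request body and POST it to `${VLLM_HOST}/v1/chat/completions`. The sketch below is a hypothetical illustration (the function name, prompt text, and defaults are assumptions, not this package's actual internals):

```javascript
// Hypothetical sketch: build an OpenAI-compatible chat-completions
// request body that asks a model to turn a staged diff into a
// Conventional Commits message. The system prompt and defaults here
// are illustrative assumptions, not aicommit's actual implementation.
function buildCommitRequest(diff, options = {}) {
  const {
    model = "meta-llama/Llama-3.3-70B-Instruct",
    temperature = 0.3,
  } = options;
  return {
    model,
    temperature,
    messages: [
      {
        role: "system",
        content: "Write a Conventional Commits message for the given diff.",
      },
      { role: "user", content: diff },
    ],
  };
}
```

The same payload shape works against OpenAI, Ollama's OpenAI-compatible endpoint, and vLLM, which is what makes a multi-provider CLI like this practical.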

Kimi (Moonshot)

export MOONSHOT_API_KEY=sk-...
aicommit -p kimi -m kimi-k2.5

Models: kimi-k2.5, kimi-k2, kimi-k2-thinking

Environment Variables

| Variable | Description | Provider |
|----------|-------------|----------|
| OPENAI_API_KEY | OpenAI API key | openai |
| ANTHROPIC_API_KEY | Anthropic API key | anthropic |
| MOONSHOT_API_KEY | Moonshot API key | kimi |
| OLLAMA_HOST | Ollama server URL | ollama |
| VLLM_HOST | vLLM server URL | vllm |
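For the local providers, the host variables are optional and fall back to the default ports documented above (11434 for Ollama, 8090 for vLLM). A minimal sketch of that resolution, assuming a helper named `resolveHost` (not part of this package's public API):

```javascript
// Hypothetical sketch: resolve the base URL for a local provider,
// falling back to the README's documented defaults when the
// environment variable is unset.
const DEFAULT_HOSTS = {
  ollama: "http://localhost:11434",
  vllm: "http://localhost:8090",
};

function resolveHost(provider, env = process.env) {
  const envVar = provider === "ollama" ? "OLLAMA_HOST" : "VLLM_HOST";
  return env[envVar] || DEFAULT_HOSTS[provider];
}
```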

Configuration File

Settings are stored in ~/.aicommit.json:

{
  "provider": "vllm",
  "model": "llama3",
  "temperature": 0.3
}
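Command-line flags take precedence over the config file, which in turn takes precedence over built-in defaults. A minimal sketch of that precedence, assuming a hypothetical `resolveSettings` helper (the defaults mirror the Options table above):

```javascript
// Hypothetical sketch of settings precedence:
// CLI flag > ~/.aicommit.json > built-in default.
const DEFAULTS = { provider: "openai", temperature: 0.3 };

function resolveSettings(cliFlags = {}, fileConfig = {}) {
  // Later spreads win, so flags override the file,
  // which overrides the defaults.
  return { ...DEFAULTS, ...fileConfig, ...cliFlags };
}
```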

Commit Format

Messages follow Conventional Commits:

feat(api): add user authentication endpoint

Implemented JWT-based authentication with refresh tokens.
Includes login, logout, and token refresh endpoints.
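A Conventional Commits header is a type, an optional scope in parentheses, a colon and space, then the description. One way to check a generated header against that shape (an illustrative sketch; the regex and type list are assumptions, not this package's validator):

```javascript
// Hypothetical sketch: validate a Conventional Commits header line
// such as "feat(api): add user authentication endpoint".
// Types follow the common Angular-style list; scope is optional,
// and "!" marks a breaking change.
const HEADER_RE =
  /^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)(\([\w.-]+\))?(!)?: .+$/;

function isConventionalHeader(line) {
  return HEADER_RE.test(line);
}
```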

Examples

# Quick commit with defaults
git add . && aicommit

# Preview only
aicommit --dry-run

# Use different provider just once
aicommit -p anthropic

# Use specific model
aicommit -m gpt-4-turbo

# More creative responses
aicommit -t 0.7

# Deterministic responses
aicommit -t 0.1

# Skip confirmation prompt
aicommit --no-edit

Requirements

  • Node.js >= 18
  • Git repository with staged changes
  • API key (for cloud providers) or local AI server (Ollama/vLLM)

License

MIT