
llm-catalog

v1.5.0


LLM Catalog CLI

CLI tool to download LLM model lists from various providers (OpenAI, Anthropic, Google Gemini) and save them as JSON files.

Quick Start

npx llm-catalog

Requirements

  • Node.js v18+
  • At least one API key (set as environment variables):
    • OPENAI_API_KEY - for OpenAI models
    • ANTHROPIC_API_KEY - for Claude models
    • GOOGLE_API_KEY or GEMINI_API_KEY - for Gemini models

Usage

Command Line Options

npx llm-catalog [options]

Options:
  --output <filename>      Save to file (default: stdout)
  --provider <provider>    Single: openai, anthropic, gemini
                           Multiple: openai,anthropic,gemini
  --no-filter              Skip interactive mode, output all models
                           (automatically enables quiet mode for pipelines)
  -h, --help               Show help message
  -v, --version            Show version information

Examples

# Interactive mode (default) - with progress messages and model selection
npx llm-catalog

# Save specific provider to file
npx llm-catalog --provider openai --no-filter --output models.json

# Pipeline usage - clean JSON output without progress messages
npx llm-catalog --provider gemini --no-filter | pbcopy
npx llm-catalog --provider openai --no-filter | jq '.OPENAI.data[].id'

# Compare: Interactive vs Pipeline mode
# Interactive: Shows 🚀 📥 ✓ 📊 🎉 messages + user prompts
# Pipeline:    Clean JSON only, perfect for automation

💡 Tip: --no-filter automatically suppresses all progress messages, making the output safe to pipe to jq, pbcopy, or other tools.

Environment Setup

Set up your API keys:

export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export GOOGLE_API_KEY="your-google-key"
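
Before running the tool, it can help to confirm which of the keys above are actually visible to your shell. A minimal sketch (the variable names match the list above; GEMINI_API_KEY is the documented alternative to GOOGLE_API_KEY):

```shell
# Report which provider API keys are set in the current environment
for key in OPENAI_API_KEY ANTHROPIC_API_KEY GOOGLE_API_KEY GEMINI_API_KEY; do
  eval "val=\${$key:-}"
  if [ -n "$val" ]; then
    echo "$key: set"
  else
    echo "$key: not set"
  fi
done
```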

Supported Providers

  • OpenAI - GPT-4, GPT-3.5-turbo, etc.
  • Anthropic - Claude 3 series
  • Google Gemini - Gemini Pro, etc.

Output

Outputs JSON with model data from the selected providers (to stdout by default, or to a file with --output):

{
  "OPENAI": {
    "object": "list",
    "data": [
      {
        "id": "gpt-4.1",
        "object": "model",
        "created": 1744316542,
        "owned_by": "system"
      }
    ]
  },
  "ANTHROPIC": {
    "data": [
      {
        "type": "model",
        "id": "claude-sonnet-4-20250514",
        "display_name": "Claude Sonnet 4",
        "created_at": "2025-05-22T00:00:00Z"
      }
    ],
    "has_more": false,
    "first_id": "claude-opus-4-20250514",
    "last_id": "claude-2.0"
  },
  "GEMINI": {
    "models": [
      {
        "name": "models/gemini-2.5-pro",
        "version": "2.5",
        "displayName": "Gemini 2.5 Pro",
        "description": "Stable release (June 17th, 2025) of Gemini 2.5 Pro",
        "inputTokenLimit": 1048576,
        "outputTokenLimit": 65536,
        "supportedGenerationMethods": [
          "generateContent",
          "countTokens",
          "createCachedContent",
          "batchGenerateContent"
        ],
        "temperature": 1,
        "topP": 0.95,
        "topK": 64,
        "maxTemperature": 2,
        "thinking": true
      }
    ],
    "nextPageToken": "Chttb2RlbHMvZ2VtaW5pLWVtYmVkZGluZy0wMDE="
  }
}
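
Note that each provider nests its models under a different key (data for OpenAI and Anthropic, models for Gemini), so extracting identifiers needs a per-provider path. A sketch using jq against a saved models.json shaped like the sample above (the file name and the trimmed sample data are assumptions for illustration):

```shell
# Trimmed sample mirroring the structure shown above
cat > models.json <<'EOF'
{
  "OPENAI":    { "data":   [ { "id": "gpt-4.1" } ] },
  "ANTHROPIC": { "data":   [ { "id": "claude-sonnet-4-20250514" } ] },
  "GEMINI":    { "models": [ { "name": "models/gemini-2.5-pro" } ] }
}
EOF

# One identifier per line, regardless of provider schema
jq -r '.OPENAI.data[].id, .ANTHROPIC.data[].id, .GEMINI.models[].name' models.json
```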

Troubleshooting

  • No API keys detected: Check your environment variables
  • API call failed: Verify your API keys and internet connection
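
When a downstream pipeline step chokes, a quick first check is whether the captured output is actually valid JSON. A sketch using Python's standard-library json.tool (the out.json file name is an assumption; it could be produced with npx llm-catalog --no-filter --output out.json):

```shell
# Validate a captured output file before piping it further
if python3 -m json.tool out.json > /dev/null 2>&1; then
  echo "out.json: valid JSON"
else
  echo "out.json: invalid or missing JSON"
fi
```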

Development

This project was developed with AI assistance using VS Code + Claude Sonnet 4. Code, comments, commits, and documentation were generated through AI collaboration.

Note: Some English phrasing in this project may sound unnatural, as it was written or translated with AI assistance.

License

ISC