
tcnt

v1.0.0


A CLI tool to count LLM tokens in text files.


TokenCount CLI

A Node.js command-line tool to count LLM tokens in text files. It can process local files or read from stdin.

Features

  • Counts tokens using OpenAI's tiktoken library by default.
  • Supports input from file path or stdin.
  • Basic framework for adding other tokenizers.
  • Handles text files only.

Installation

  1. Clone this repository (or ensure you have the tokencount.js and package.json files).
  2. Navigate to the project directory in your terminal.
  3. Install dependencies:
    npm install
  4. Make the script executable:
    chmod +x tokencount.js
  5. Link the package to make the tokencount command available globally:
    npm link

Usage

Count tokens in a file:

tokencount /path/to/your/file.txt

Count tokens from piped input:

cat /path/to/your/file.txt | tokencount

Specify a tokenizer (default is openai-tiktoken):

tokencount --tokenizer openai-tiktoken /path/to/your/file.txt

Currently, openai-tiktoken is the primary supported tokenizer. A placeholder for gemini-text exists but will use openai-tiktoken as a fallback with a warning, as on-device Gemini tokenization is not yet implemented.
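The `--tokenizer` flag and its `openai-tiktoken` default can be modeled as follows. The `.action(...)` callback mentioned under Development suggests the real tool uses commander; this sketch uses Node's built-in `util.parseArgs` (Node 18+) instead to stay dependency-free, and `parseCli` is a hypothetical name.

```javascript
import { parseArgs } from "node:util";

// Parse CLI arguments: an optional positional file path plus a
// --tokenizer/-t flag defaulting to "openai-tiktoken".
export function parseCli(argv) {
  const { values, positionals } = parseArgs({
    args: argv,
    options: {
      tokenizer: { type: "string", short: "t", default: "openai-tiktoken" },
      help: { type: "boolean", short: "h", default: false },
    },
    allowPositionals: true, // the optional file path
  });
  return { tokenizer: values.tokenizer, help: values.help, file: positionals[0] };
}
```

With no file positional, the tool would fall back to reading stdin, matching the piped-input usage shown above.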

Get help:

tokencount --help

Supported Tokenizers

  • openai-tiktoken: Uses the gpt2 encoding from OpenAI's tiktoken library.
  • gemini-text (Placeholder): Currently falls back to openai-tiktoken. On-device support for Gemini tokenization is a future consideration pending available libraries.

Limitations

  • Text Files Only: This tool is designed for text files. Binary input will produce errors or meaningless token counts.
  • On-Device Tokenization for Gemini: True on-device tokenization for Gemini models is not yet implemented.

Development

To contribute or modify:

  • The project uses ES Module syntax (import/export).
  • The main script is tokencount.js.
  • Tokenizer logic is handled within the .action(...) callback in tokencount.js.
  • To add a new tokenizer, you would typically:
    1. Install any necessary Node.js package for that tokenizer.
    2. Import necessary functions from the package using import.
    3. Add a new else if condition for your tokenizer's name in tokencount.js.
    4. Implement the token counting logic within that block.
    5. Update this README and the help messages.
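As a rough illustration of steps 3 and 4, the dispatch might look like this. `countTokens` and the `char-approx` tokenizer are hypothetical names invented for the sketch, and a whitespace split stands in for the real tiktoken call so the example has no dependencies.

```javascript
// Illustrative tokenizer dispatch: an else-if chain keyed on the
// tokenizer name, mirroring the structure described in steps 3-4.
export function countTokens(text, tokenizer) {
  if (tokenizer === "openai-tiktoken") {
    // The real tool calls tiktoken's encode() here; whitespace
    // splitting is a stand-in to keep this sketch dependency-free.
    return text.split(/\s+/).filter(Boolean).length;
  } else if (tokenizer === "char-approx") {
    // Example of a newly added branch: a crude heuristic of
    // roughly 4 characters per token.
    return Math.ceil(text.length / 4);
  } else {
    throw new Error(`Unknown tokenizer: ${tokenizer}`);
  }
}
```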