semlocal

v0.2.0

Local semantic search for the command line. Store, search, and delete text using vector embeddings — no backend, no API keys, everything stays on your machine.

Why

Semantic search tools today assume you have infrastructure: a vector database to run, an embedding API to call, credentials to manage. That's fine for production systems, but overkill when all you need is a lightweight way to index and recall text — especially for AI agents that benefit from long-term memory.

semlocal is a single binary you install with npm and run immediately. Embeddings are generated locally via ONNX Runtime, stored in a SQLite file, and searched with brute-force cosine similarity. No servers, no API keys, no Docker containers. Just a CLI that reads and writes to disk.

Use it to give agents persistent, searchable memory without any compute or infrastructure overhead.

Install

npm install -g semlocal

Usage

Store text

semlocal write "Rust is a systems programming language focused on safety"
# prints: a1b2c3d4-e5f6-7890-abcd-ef1234567890

You can also pipe text from another command or a file:

cat README.md | semlocal write
echo "hello world" | semlocal write -

Search

semlocal search "safe low-level language"
# [0.87] a1b2c3d4-... Rust is a systems programming language focused on safety

Return results as JSON:

semlocal search "safe low-level language" --json --top 3

Delete an entry

semlocal delete a1b2c3d4-e5f6-7890-abcd-ef1234567890

Collections

Entries are organized into collections. If not specified, all operations use the default collection. Use --collection to partition your data:

semlocal write "Rust is fast" --collection languages
semlocal search "performance" --collection languages
semlocal delete a1b2c3d4-... --collection languages

Collections are created implicitly on first write and removed automatically when their last entry is deleted.
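Storage of this kind can be modeled with a single table in which the collection is just a column on each entry. The sketch below is illustrative only — semlocal's actual SQLite schema is not documented here, and the table and column names are assumptions — but it shows why collections can be created implicitly and disappear when empty:

```python
import sqlite3
import uuid

# Illustrative schema: one table, collection stored per entry.
# (semlocal's real schema is undocumented; these names are assumptions.)
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE entries (
        id         TEXT PRIMARY KEY,
        collection TEXT NOT NULL DEFAULT 'default',
        text       TEXT NOT NULL,
        embedding  BLOB NOT NULL
    )
""")

def write(text, embedding, collection="default"):
    # Collections exist implicitly: inserting a row with a new
    # collection name is all it takes to "create" the collection.
    entry_id = str(uuid.uuid4())
    db.execute("INSERT INTO entries VALUES (?, ?, ?, ?)",
               (entry_id, collection, text, embedding))
    return entry_id

def delete(entry_id):
    # A collection is just the set of rows naming it, so deleting
    # its last entry removes the collection with no extra bookkeeping.
    db.execute("DELETE FROM entries WHERE id = ?", (entry_id,))

eid = write("Rust is fast", b"\x00" * 4, collection="languages")
count = db.execute(
    "SELECT COUNT(*) FROM entries WHERE collection = 'languages'"
).fetchone()[0]
delete(eid)
```

Under this model, "all operations use the default collection" simply means queries filter on `collection = 'default'` unless `--collection` says otherwise.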

Custom storage directory

By default the index is stored in .semlocal/ in the current working directory. Use --src to change this:

semlocal write "hello world" --src ~/my-index
semlocal search "greeting" --src ~/my-index

How it works

semlocal uses FastEmbed (ONNX Runtime) to generate 384-dimensional vector embeddings with the all-MiniLM-L6-v2 model. Embeddings are stored in a local SQLite database. Search is performed via brute-force cosine similarity over all stored entries.
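Brute-force cosine search is simple enough to sketch in full: embed the query, score it against every stored vector, and sort. A minimal illustration in Python (3-dimensional toy vectors stand in for the real 384-dimensional embeddings; the index layout is hypothetical, not semlocal's internal format):

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query_vec, index, top=5):
    # Brute force: score the query against every stored vector,
    # then sort descending and keep the top results.
    scored = [(cosine(query_vec, vec), entry_id)
              for entry_id, vec in index.items()]
    return sorted(scored, reverse=True)[:top]

# Toy index mapping entry IDs to embeddings.
index = {
    "rust-note":    [0.9, 0.1, 0.0],
    "cooking-note": [0.0, 0.2, 0.9],
}
results = search([0.8, 0.2, 0.1], index, top=1)
```

A linear scan like this is O(n) per query, which is entirely adequate at the scale of a personal index and avoids the approximate-nearest-neighbor machinery a vector database would bring in.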

First run

The embedding model (~25 MB) is downloaded automatically on first use and cached in ~/.semlocal/models/. Subsequent runs start instantly.
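The pattern behind this is download-once, cache-forever: check for the model file under the cache directory and fetch it only on a miss. A rough sketch (the file name and fetch step are illustrative, not semlocal's actual code):

```python
import pathlib
import tempfile

MODEL_FILE = "all-MiniLM-L6-v2.onnx"  # illustrative name; ~25 MB in reality

def ensure_model(cache_dir, fetch):
    # Download the model only on a cache miss; every later run finds
    # the cached file and skips the fetch, which is why startup is
    # instant after the first use.
    path = pathlib.Path(cache_dir) / MODEL_FILE
    if not path.exists():
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(fetch())
    return path

# Demo with a stand-in fetch function and a temporary cache directory.
downloads = []
def fake_fetch():
    downloads.append(1)
    return b"fake model bytes"

cache = tempfile.mkdtemp()
ensure_model(cache, fake_fetch)
ensure_model(cache, fake_fetch)  # cache hit: fetch is not called again
```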

Platforms

Pre-built binaries are provided for:

| OS      | x64 | arm64 |
|---------|-----|-------|
| Linux   | ✓   | ✓     |
| macOS   |     | ✓     |
| Windows | ✓   |       |

License

MIT