rsslobster v0.2.0 · 224 downloads

Publish to the open web from your phone. Unplatform yourself.

RSS Lobster

Publish and read on the open web. Own both sides of the feed.

RSS Lobster is a personal publishing and reading system built on RSS. You publish from your phone to your own site. You subscribe to feeds and read them in the same place. Your content goes out as RSS. Other people's content comes in as RSS. No algorithms. No platform. Just the protocol the web already agreed on.

Cost: $12/year for a domain. Hosting: free (Cloudflare Pages, GitHub Pages, etc.).

Quick start

npm install -g rsslobster
rsslobster onboard          # domain + style → site in 30 seconds

Your site is running at localhost:4321. Now add what you need:

# Read feeds (works immediately — no setup required)
rsslobster feed add https://simonwillison.net
rsslobster feed

# Publish from your phone
rsslobster enable telegram   # set up your Telegram bot
rsslobster start             # start listening

# Add AI classification
rsslobster enable model      # configure Ollama, OpenAI, or Anthropic

# Auto-deploy
rsslobster enable deploy     # git remote for Cloudflare Pages, GitHub Pages, etc.

Each capability is independent. Use what you need, skip what you don't.

What it does

RSS Lobster does two things:

1. Publish. Send a message from any chat app. The lobster classifies it, generates a static HTML page with inlined CSS, updates your RSS and JSON feeds, commits to git, and deploys. Under 4 seconds end-to-end. Zero JavaScript in the output. Your words are never rewritten — the LLM classifies metadata only.

2. Read. Subscribe to feeds. Poll them. Get notified of new items. Star what matters. Reblog to any of your sites as link posts. Generate daily or weekly recaps. OPML import/export so you can bring your subscriptions from anywhere and take them when you leave.

The two sides compose: you read something interesting, you reblog it to your site, your subscribers get it in their readers. The open web feedback loop, running on a protocol from 1999.

publish:                              read:

  phone → classify → html + rss         subscribe → poll → notify
            ↓                                        ↓
        git push → deploy               star → reblog → publish
            ↓                                        ↓
    live in < 4 seconds                 your site ← link post

Publishing

Send a message. The lobster figures out what it is.

| Type     | You send            | Output                   |
|----------|---------------------|--------------------------|
| Micro    | A short thought     | Tweet-style post         |
| Post     | Longer writing      | Full article with title  |
| Image    | Photo with caption  | Image post with <figure> |
| Carousel | Multiple photos     | Gallery layout           |
| Link     | URL with commentary | Link card with metadata  |
| Video    | Video file          | Embedded <video> player  |
| Audio    | Audio file          | Embedded <audio> player  |

Classification is automatic. Drafts and scheduling are built in — say "draft" or set a time and it's handled. Every publish pings WebSub so readers that support it (Feedly, NewsBlur, Inoreader) get your content in seconds.
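A WebSub publish ping is a small form-encoded POST to a hub, per the W3C WebSub spec. This sketch shows the shape of that request; the hub URL is a placeholder, not rsslobster's actual hub:

```typescript
// Build the body of a WebSub "publish" notification (W3C WebSub spec).
// hubUrl and feedUrl below are illustrative placeholders.
function webSubPingBody(feedUrl: string): string {
  const params = new URLSearchParams();
  params.set("hub.mode", "publish");
  params.set("hub.url", feedUrl);
  return params.toString();
}

async function pingHub(hubUrl: string, feedUrl: string): Promise<number> {
  const res = await fetch(hubUrl, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: webSubPingBody(feedUrl),
  });
  return res.status; // hubs answer 202 Accepted on success
}
```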

Reading

rsslobster feed add https://simonwillison.net     # auto-discovers the feed
rsslobster feed                                   # show unread items
rsslobster feed read 3                            # read item #3
rsslobster feed star 2                            # save for later
rsslobster feed reblog 1 -m "This is excellent"   # reblog as link post

Or talk to the lobster directly from chat:

You:      subscribe https://danluu.com
Lobster:  Subscribed to "Dan Luu" — 15 item(s) fetched

You:      unread
Lobster:  15 unread:
          1. In defense of simple architectures
             https://danluu.com/simple-architectures/
          2. ...

You:      read 1
Lobster:  In defense of simple architectures
          by Dan Luu
          ...

You:      reblog 1 This is the post I point people to when they want microservices
Lobster:  Reblogged: "In defense of simple architectures"
          → Link post on your site

Feed management

| Command | What it does |
|---------|--------------|
| feed add <url> | Subscribe (auto-discovers feed from HTML) |
| feed remove <url> | Unsubscribe and remove stored items |
| feed list | Unread items, paginated |
| feed list --subs | Show subscriptions with unread counts |
| feed poll | Fetch all due feeds |
| feed poll --force | Fetch all feeds regardless of interval |
| feed read <n> | Show item content, mark read |
| feed star <n> | Star an item |
| feed reblog <n> | Reblog as link post on your site |
| feed reblog <n> --to blog | Reblog to a different registered site |
| feed mark-read --all | Catch up |
| feed items --starred | Show starred items |

Notifications

Per-feed control over what you get notified about and when.

rsslobster feed notify                              # show current settings
rsslobster feed notify --schedule daily --deliver-at 09:00
rsslobster feed notify --quiet-start 22:00 --quiet-end 08:00
rsslobster feed mute https://noisy-feed.com/rss
rsslobster feed filter https://important.com/feed --keywords ai rust

Schedules: immediate, hourly, daily, weekly. High-priority feeds bypass quiet hours. Keyword filters are per-feed (match any term against title + content).
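A quiet-hours window like 22:00–08:00 wraps past midnight, which is the one non-obvious part of the logic. This is an illustrative sketch, not rsslobster's internal implementation:

```typescript
// Quiet-hours check, handling windows that span midnight (e.g. 22:00-08:00).
// Illustrative sketch; the real implementation may differ.
function inQuietHours(now: string, start: string, end: string): boolean {
  const mins = (t: string): number => {
    const [h, m] = t.split(":").map(Number);
    return h * 60 + m;
  };
  const [n, s, e] = [mins(now), mins(start), mins(end)];
  // Same-day window: start <= now < end. Wrapping window: after start OR before end.
  return s <= e ? n >= s && n < e : n >= s || n < e;
}
```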

OPML

rsslobster feed import subscriptions.opml    # bring your feeds from anywhere
rsslobster feed export > backup.opml          # take them when you leave

Folder structure is preserved in both directions.
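For reference, folders in OPML are just nested outline elements, which is how grouping survives a round trip. A minimal example (titles and URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<opml version="2.0">
  <head><title>Subscriptions</title></head>
  <body>
    <!-- A folder is an outline containing other outlines -->
    <outline text="Engineering">
      <outline type="rss" text="Example Blog" xmlUrl="https://example.com/feed.xml"/>
    </outline>
    <outline type="rss" text="Another Feed" xmlUrl="https://example.org/rss"/>
  </body>
</opml>
```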

Recaps

rsslobster feed recap              # plain-text daily recap
rsslobster feed recap --enable     # enable AI-powered recaps

When an LLM is configured, recaps summarize the most interesting items across your feeds. Saved to disk so you can review past recaps.

Blogroll

If you have subscriptions, RSS Lobster generates a /following/ page on your site — a blogroll listing every feed you read, grouped by folder, with RSS links. It updates automatically when you subscribe or unsubscribe.

Channels

Any messaging platform can be an input source.

| Channel  | Status |
|----------|--------|
| Telegram | Ready |
| Webhook  | Ready (curl, IFTTT, Zapier, Shortcuts) |
| Discord  | Planned |
| Slack    | Planned |
| WhatsApp | Planned |
| Signal   | Planned |
| Nostr    | Planned |
| Matrix   | Planned |
| IRC      | Planned |

The pipeline is channel-agnostic. It only needs an InboundMessage.
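Conceptually, a channel adapter's only job is to normalize platform input into that one shape. The field names below are assumptions for illustration, not the actual type from the source tree:

```typescript
// Hypothetical shape of a channel-agnostic inbound message. The real
// InboundMessage type in src/ may differ; field names here are assumptions.
interface InboundMessage {
  channel: string;       // "telegram", "webhook", ...
  senderId: string;      // channel-specific user id
  text: string;          // raw message text
  attachments: string[]; // local paths or URLs to media
  receivedAt: Date;
}

// A new channel adapter only has to produce this shape.
function fromWebhook(body: { user: string; message: string }): InboundMessage {
  return {
    channel: "webhook",
    senderId: body.user,
    text: body.message,
    attachments: [],
    receivedAt: new Date(),
  };
}
```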

Configuration

Everything lives in lobster.json:

{
  "channel": "telegram",
  "telegram": {
    "token": "your-bot-token",
    "allowedUsers": ["12345"]
  },
  "model": {
    "baseUrl": "http://localhost:11434/v1",
    "model": "llama3",
    "apiKey": "ollama"
  },
  "reader": {
    "defaultInterval": 15
  }
}

The model config supports any OpenAI-compatible API — Ollama locally, or OpenAI/Anthropic/OpenRouter remotely.
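Switching providers only changes the endpoint and key. For example, the same fields pointed at OpenAI might look like this (model name and key are illustrative):

```json
{
  "model": {
    "baseUrl": "https://api.openai.com/v1",
    "model": "gpt-4o-mini",
    "apiKey": "sk-..."
  }
}
```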

Style presets

All presets: system fonts, zero external requests, WCAG AA contrast, zero JavaScript.

| Preset    | Vibe |
|-----------|------|
| Minimal   | Clean, whitespace-forward |
| Brutalist | Raw, monospace, high-contrast |
| Magazine  | Serif headers, editorial feel |
| Terminal  | Green-on-black, hacker aesthetic |

Every CSS custom property is overridable. See DESIGN.md for the full design system.

Lifecycle hooks

Shell commands that fire at pipeline stages. They receive JSON on stdin and can return JSON to override behavior.

| Hook | Fires when | Example use |
|------|------------|-------------|
| afterClassify | Before publish | Override tags, enforce rules |
| afterPublish | HTML + feeds generated | Notify Slack, send analytics |
| afterDeploy | Git push complete | Purge CDN, trigger webhook |
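The stdin-JSON-in, JSON-out contract makes a hook just a filter. A hypothetical afterClassify hook that enforces a required tag could be built around a function like this (the payload field "tags" is an assumption; inspect what your version actually passes):

```typescript
// Hypothetical afterClassify hook logic: the hook script would parse the
// classification JSON from stdin, apply this, and print the override.
// The "tags" field name is an assumption, not a documented contract.
function addRequiredTag(payload: { tags?: string[] }): { tags: string[] } {
  const tags = payload.tags ?? [];
  return { tags: tags.includes("blog") ? tags : [...tags, "blog"] };
}

// As a hook script the wiring would be:
//   stdin JSON -> JSON.parse -> addRequiredTag -> JSON.stringify -> stdout
```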

Architecture

mysite/
├── rsslobster.json         site config
├── lobster.json            channel + model config
├── posts.json              posts index
├── drafts/                 saved drafts
├── reader/                 feed data (subscriptions, items, config)
│   ├── subscriptions.json
│   ├── unread-index.json
│   ├── config.json
│   └── feeds/              per-feed item storage
└── _site/                  generated output (deploy this)
    ├── index.html
    ├── feed.xml / feed.json
    ├── following/           blogroll
    └── posts/slug/index.html
src/
├── agent/       LLM classification + pipeline
├── channels/    messaging platform adapters
├── cli/         command handlers
├── config/      types, paths, workspace registry
├── deploy/      git commit + push
├── drafts/      draft lifecycle
├── generator/   HTML, RSS, JSON Feed, search, SEO
├── hooks/       lifecycle hooks
├── images/      image + media ingestion
├── pages/       custom pages
├── plugins/     plugin system
├── previews/    draft preview flow
├── reader/      RSS reader (subscribe, poll, store, notify, recap)
├── styles/      CSS preset system
└── index.ts     CLI entrypoint

Design principles

  • Files as the API — git is the database, the filesystem is the state
  • Zero JavaScript in output — generated sites work without JS
  • AI classifies, never rewrites — your words are yours
  • RSS in, RSS out — the same protocol for publishing and reading
  • Composition over abstraction — functions that take data and return data
  • Concurrency-safe — per-feed and per-index locks, atomic writes, no corrupt reads

Internals worth knowing

Dedup strategy: Items are deduplicated by id (guid/atom id) > link > SHA-256 hash of title+content. Three layers, zero duplicates.
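The three-layer fallback can be sketched as a single key-derivation function; this is illustrative of the strategy described above, not the actual code:

```typescript
import { createHash } from "node:crypto";

// Dedup key per the three-layer strategy: prefer the feed-provided id
// (guid / atom id), then the link, then a content hash. Illustrative sketch.
interface FeedItem {
  id?: string;
  link?: string;
  title?: string;
  content?: string;
}

function dedupKey(item: FeedItem): string {
  if (item.id) return `id:${item.id}`;
  if (item.link) return `link:${item.link}`;
  const hash = createHash("sha256")
    .update((item.title ?? "") + (item.content ?? ""))
    .digest("hex");
  return `hash:${hash}`;
}
```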

Unread index: A lightweight cache of unread item references. Fast path for listItems({read: false}) loads only the feeds that contain unread items. Self-heals on corruption via background rebuild.

Polling: Bounded concurrency (5 feeds at a time). Conditional GET with ETag/If-Modified-Since. Exponential backoff for failing feeds (caps at ~24h). Feeds that go permanently offline don't waste your bandwidth.
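The backoff curve is simple to state: double the delay per consecutive failure, capped at a day. The base interval and growth factor below are assumptions for illustration, not rsslobster's exact values:

```typescript
// Exponential backoff for a failing feed, capped at ~24 hours.
// Base interval and factor are assumptions, not the tool's exact values.
function nextPollDelayMinutes(failures: number, baseMinutes = 15): number {
  const capMinutes = 24 * 60;
  return Math.min(baseMinutes * 2 ** failures, capMinutes);
}
```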

Notifications: Evaluated at ingestion time for immediate delivery, queued for batched schedules. Quiet hours only suppress immediate notifications — batched digests still include all items.

Multi-site

Register multiple sites and publish across them from one CLI.

rsslobster sites add blog ~/workspace/myblog
rsslobster sites add photo ~/workspace/myphotos
rsslobster sites

Reblog an item from your reader to a specific site:

rsslobster feed reblog 1 --to blog -m "This is worth reading"

Publish directly to a registered site:

rsslobster publish --to photo --type image "Austin sunset"

Cross-site operations are local-only by default. Add --deploy to git commit and push the target site. The site registry lives at ~/.rsslobster/sites.json.
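The registry is plausibly a simple name-to-path map; the exact schema isn't documented here, so this shape is a guess for illustration only:

```json
{
  "blog":  { "path": "/home/you/workspace/myblog" },
  "photo": { "path": "/home/you/workspace/myphotos" }
}
```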

CLI reference

| Command | Description |
|---------|-------------|
| rsslobster | Status dashboard (in a configured directory) |
| rsslobster onboard | Interactive setup (domain + style) |
| rsslobster enable <cap> | Enable a capability: telegram, model, deploy |
| rsslobster enable --list | Show what's configured |
| rsslobster start | Start the daemon |
| rsslobster publish <text> | Publish from CLI (requires --type without a model) |
| rsslobster dev | Local preview server (localhost:4321) |
| rsslobster regenerate | Rebuild all pages |
| rsslobster drafts | Manage drafts |
| rsslobster feed | RSS reader (see Reading) |
| rsslobster sites | List, add, remove registered sites |
| rsslobster publish --to <site> | Publish to a different registered site |

Docker

docker build -t rsslobster .
docker run -v /path/to/site:/site rsslobster

Development

git clone https://github.com/HectorZarate/rsslobster.git
cd rsslobster && pnpm install
pnpm check          # lint + typecheck + 953 tests
pnpm lint           # oxlint
pnpm typecheck      # tsc --noEmit
pnpm test           # vitest
pnpm test:watch     # vitest watch mode
pnpm build          # tsdown → dist/

Requires Node.js >= 22 and pnpm >= 10. Pre-commit hook runs pnpm check.

Troubleshooting

sharp install failures: RSS Lobster uses sharp for image processing. It downloads prebuilt native binaries during install. If this fails (corporate proxy, unusual architecture):

npm install --ignore-scripts rsslobster
npx sharp install

Node version: Requires Node.js >= 22. Check with node --version.

Contributing

See CONTRIBUTING.md.

License

MIT — Hector Zarate