
kafkaish

v0.0.7

A publish-subscribe messaging system with durable topics on MongoDB. A poor man's Apache Kafka, if you like.

Kafkaish

Stability: very unstable / experimental.

Durable message queues are handy when you only have a few consumers, but they're a bit unwieldy and wasteful when you have many consumers or consumers come and go over time.

Message topics are handy when you have large or varying numbers of subscribers that come and go, but sometimes you need each consumer to receive all messages, regardless of whether it was connected when the message was dispatched.

Kafka topics have a nice mix of the properties of topics and queues - publish/subscribe plus guaranteed delivery.

This lib is an experiment with an Apache-Kafka-like publish/subscribe mechanism based on MongoDB.

Q. Why not just use Kafka?

A. Actually I really want to, and of course you should, if you have the resources. That said, in a resource-constrained environment (little money, few people) where MongoDB is already part of the infrastructure, sometimes the only choice is to use what you've already got. I need a moderately reliable Kafka-like mechanism without the additional resource requirements and overhead of deploying and managing Kafka.

Usage

Install:

npm install kafkaish

Bring kafkaish into scope:

const kafkaish = require('kafkaish')

Connect and create a topic:

kafkaish('mongodb://localhost:27017/kafkaish').connect()
  .then(conn => {
    conn.prepareTopic('my_topic')
      .then(topic => {
        // use your topic
      })
  })
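If you prefer async/await, the same connect-and-prepare sequence can be written without the nested promise chain (a sketch, equivalent to the example above):

```javascript
const kafkaish = require('kafkaish')

async function main() {
  // connect() and prepareTopic() both return promises, so they await cleanly
  const conn = await kafkaish('mongodb://localhost:27017/kafkaish').connect()
  const topic = await conn.prepareTopic('my_topic')
  // use your topic
}

main().catch(console.error)
```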

Publish messages to your topic fire-and-forget style:

topic.publish('event-name',{foo:'bar'})

Publish messages and receive a callback for confirmation of publishing:

topic.publish('event-name',{foo:'bar'},function(err){
  if (err) {
    // not published! try again or whatever
  } else {
    // yay, subscribers will receive the message
  }
})

Subscribe to receive specific events published from now on until you unsubscribe or otherwise disconnect:

topic.subscribe('event-name',{},function(ev,msg){
  // handle event
})

Use the returned subscription object to unsubscribe when you've had enough:

let count = 0
const subscription = topic.subscribe('event-name',{},function(ev,msg){
  count++
  if (count === 5) { // disconnect after 5 events
    subscription.unsubscribe()
  }
})

Subscribe to receive ALL events published from now on until you unsubscribe or otherwise disconnect:

topic.subscribe(null,{},function(ev,msg){
  // handle event
})

Create a durable subscription to receive specific events. You can unsubscribe or disconnect and come back later to collect events that occurred while you were away:

topic.subscribe('some-event',{name:'durable-subscriber-1'},function(ev,msg,ack){
  // here we'll see any events published from now on.
  // if we disconnect and re-connect with replay:true, any messages
  // we didn't ack() will be delivered again.
  ack() // acknowledge this message so we don't replay it if we re-connect
})

Create a durable subscription that also replays the backlog of events that occurred before you subscribed. Again, you can unsubscribe or disconnect and come back later to collect events that arrived while you were away:

topic.subscribe('some-event',{name:'durable-subscriber-1',replay:true},function(ev,msg,ack){
  // here we'll see all events that already occurred before settling in to receive
  // events in real-time as they are published. We need to ack() each event.
  ack()
})
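
Putting the pieces together, here is a minimal end-to-end sketch using only the calls shown above. It assumes a MongoDB instance at localhost:27017; the topic name `orders`, the event name `order-created`, and the subscriber name `billing-1` are illustrative, not part of the library:

```javascript
const kafkaish = require('kafkaish')

kafkaish('mongodb://localhost:27017/kafkaish').connect()
  .then(conn => conn.prepareTopic('orders'))
  .then(topic => {
    // durable subscriber: replays anything published while it was away
    topic.subscribe('order-created', {name: 'billing-1', replay: true},
      function (ev, msg, ack) {
        console.log('billing saw', ev, msg)
        ack() // acknowledge so the message isn't replayed on reconnect
      })

    // publish with a confirmation callback
    topic.publish('order-created', {id: 42}, function (err) {
      if (err) {
        console.error('publish failed', err)
      }
    })
  })
  .catch(err => console.error('connection failed', err))
```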