
# @edular/queue

![current npm version](https://img.shields.io/npm/v/@edular/queue?label=%40edular%2Fqueue)

Requires Node.js `>=18.0.6`.

## Installation

```shell
yarn add @edular/queue
```

## Connect

```typescript
import { queue, QueueList, EventType } from "@edular/queue"

await queue.connect({
  microserviceName: "tasks-api",
  hostname: "localhost",
  port: 5672,
  username: "guest",
  password: "guest",
  queues: [
    { name: QueueList.MicroserviceTasks },
    { name: QueueList.UpdateUserInTasks, enableDeadLetter: true, maxRetries: 5 },
    { name: QueueList.TasksApproved,     enableDeadLetter: true, maxRetries: 20 },
    { name: QueueList.PostTaskApproval },
  ],
})
```

### QueueOptions

| Option              | Type            | Default     | Description                                                          |
|---------------------|-----------------|-------------|----------------------------------------------------------------------|
| `microserviceName`  | `string`        | required    | Name of the microservice, included in dead letter payloads           |
| `queues`            | `QueueConfig[]` | required    | List of queues to configure                                          |
| `hostname`          | `string`        | `localhost` | RabbitMQ host                                                        |
| `port`              | `number`        | `5672`      | RabbitMQ port                                                        |
| `username`          | `string`        | `guest`     | RabbitMQ username                                                    |
| `password`          | `string`        | `guest`     | RabbitMQ password                                                    |
| `maxRetries`        | `number`        | `10`        | Default max retries before dead lettering (per-queue config overrides this) |
| `maxReconnectDelay` | `number`        | `900`       | Max delay in seconds between reconnection attempts to RabbitMQ       |
| `prefetch`          | `number`        | `1`         | Default prefetch count for all consumers                             |

### QueueConfig

| Option             | Type        | Default   | Description                                                  |
|--------------------|-------------|-----------|--------------------------------------------------------------|
| `name`             | `QueueList` | required  | Queue name                                                   |
| `enableDeadLetter` | `boolean`   | `false`   | Whether to send to the dead letter queue after max retries   |
| `maxRetries`       | `number`    | inherited | Max retries before dead lettering (overrides the global default) |


## Send a message

```typescript
queue.send({
  queue: QueueList.MicroserviceTasks,
  message: JSON.stringify({ type: "CREATE_TASK", payload: { ... } }),
})
```

## Send to multiple queues

```typescript
queue.sendMultiple(
  [QueueList.Email, QueueList.Sms],
  JSON.stringify({ type: "NOTIFY", payload: { ... } })
)
```

## Consume messages

The callback must return `true` on success, and `false` or an `Error` on failure.

```typescript
queue.on(EventType.Connected, () => {
  queue.consume(QueueList.MicroserviceTasks, async (msg) => {
    try {
      const { type, payload } = JSON.parse(msg.message)
      await handleMessage(type, payload)
      return true
    } catch (e) {
      logger.error(e)
      return e // the actual error is forwarded to the dead letter queue if enabled
    }
  })
})
```
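The `handleMessage` call above is left to the application. A minimal sketch of what it could look like (purely hypothetical — the library does not prescribe a handler shape, it only inspects the consume callback's return value) dispatches on the message `type`:

```typescript
// Hypothetical application-level dispatcher: one handler per message type,
// keyed by the `type` field of the parsed message.
const handlers: Record<string, (payload: any) => any> = {
  CREATE_TASK: (payload) => ({ created: payload.title }),
}

function handleMessage(type: string, payload: any) {
  const handler = handlers[type]
  // Throwing here makes the consume callback's catch branch return the
  // Error, which triggers a retry (and eventually the dead letter queue).
  if (!handler) throw new Error(`Unknown message type: ${type}`)
  return handler(payload)
}
```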

## Retry behavior

When the callback returns `false` or an `Error`, the message is re-queued with exponential backoff:

| Attempt | Delay      |
|---------|------------|
| 1       | ~2s        |
| 2       | ~4s        |
| 3       | ~8s        |
| 4       | ~16s       |
| 5+      | ~30s (cap) |

A random jitter of up to 1s is added to each delay to avoid a thundering herd.

If `enableDeadLetter: true` is set for the queue and `maxRetries` is exceeded, the message is sent to the `dead_letter` queue instead of being retried.
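The schedule above can be sketched as follows (a hypothetical reconstruction of the documented behavior, not the library's actual timer code):

```typescript
// Delay doubles per attempt (2s, 4s, 8s, 16s, ...) and is capped at 30s,
// with up to 1s of random jitter added to spread retries out.
function backoffDelaySeconds(attempt: number): number {
  const base = Math.min(2 ** attempt, 30)
  const jitter = Math.random() // up to 1s, avoids thundering herd
  return base + jitter
}
```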


## Dead Letter Queue

When a message exceeds `maxRetries`, it is sent to `QueueList.DeadLetter` with the following payload:

```json
{
  "microserviceName": "tasks-api",
  "originalQueue": "microservice_tasks",
  "originalMessageId": "uuid",
  "payload": "{...original message...}",
  "retryCount": 10,
  "failedAt": "2026-04-10T15:30:00.000Z",
  "error": {
    "message": "Something went wrong",
    "stack": "Error: Something went wrong\n    at ..."
  },
  "originalHeaders": {}
}
```

To consume dead letter messages:

```typescript
queue.consume(QueueList.DeadLetter, async (msg) => {
  const data = JSON.parse(msg.message)
  logger.error(`Dead letter from ${data.microserviceName}/${data.originalQueue}`, data)
  return true
})
```
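When reprocessing dead letters, it can help to filter on the payload fields shown above before re-sending anything. A hypothetical helper (the retry and age cutoffs are arbitrary assumptions, not library behavior):

```typescript
// Subset of the dead letter payload fields used for the decision.
interface DeadLetterMessage {
  retryCount: number
  failedAt: string // ISO timestamp, as in the payload above
}

// Skip messages that already burned through a high retry count, or that
// failed too long ago to be worth replaying.
function shouldRedrive(
  msg: DeadLetterMessage,
  maxRetryCount = 20,
  maxAgeMs = 24 * 60 * 60 * 1000
): boolean {
  const age = Date.now() - new Date(msg.failedAt).getTime()
  return msg.retryCount <= maxRetryCount && age <= maxAgeMs
}
```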

## Other methods

### `pause(queue)` / `resume(queue)`

Temporarily stop and restart a consumer without losing its configuration.

```typescript
await queue.pause(QueueList.MicroserviceTasks)
await queue.resume(QueueList.MicroserviceTasks)
```

### `setPrefetch(queue, count)`

Dynamically change the prefetch count for a queue. Restarts the consumer if one is active.

```typescript
await queue.setPrefetch(QueueList.MicroserviceTasks, 5)
```

### `getQueueStats(queue)`

Get message counts for a queue.

```typescript
const stats = await queue.getQueueStats(QueueList.MicroserviceTasks)
// { ready: 10, unacked: 2, total: 12 }
```
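The stats shape lends itself to simple health checks. A hypothetical example (the threshold is an arbitrary choice, not something the library defines):

```typescript
// Flag a queue whose ready (unconsumed) backlog exceeds a threshold,
// based on the { ready, unacked, total } shape returned by getQueueStats.
function isBacklogged(
  stats: { ready: number; unacked: number; total: number },
  maxReady = 100
): boolean {
  return stats.ready > maxReady
}
```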

### `disconnect()`

Gracefully close the connection.

```typescript
await queue.disconnect()
```

## How to publish a new version

1. Commit your changes
2. `yarn version major|minor|patch -m "bump %s"`
3. `npm publish`