
@prsm/queue

Redis-backed distributed task queue with grouped concurrency, retries, and rate limiting.

Installation

npm install @prsm/queue

Quick Start

import Queue from '@prsm/queue'

const queue = new Queue({
  concurrency: 2,
  maxRetries: 3
})

queue.process(async (payload) => {
  return await doWork(payload)
})

queue.on('complete', ({ task, result }) => {
  console.log('Done:', task.uuid, result)
})

queue.on('failed', ({ task, error }) => {
  console.log('Failed after retries:', task.uuid, error.message)
})

await queue.ready()
await queue.push({ userId: 123, action: 'sync' })

Options

const queue = new Queue({
  concurrency: 2,           // worker count
  delay: '100ms',           // pause between tasks (string or ms)
  timeout: '30s',           // max task duration
  maxRetries: 3,            // attempts before failing

  groups: {
    concurrency: 1,         // workers per group
    delay: '50ms',
    timeout: '10s',
    maxRetries: 3
  },

  redisOptions: {
    host: 'localhost',
    port: 6379
  }
})

Process Handler

queue.process(async (payload, task) => {
  console.log('Task:', task.uuid, 'Attempt:', task.attempts)
  return await someWork(payload)
})

Throw an error from the handler to trigger a retry. After maxRetries attempts, the task fails permanently and the failed event is emitted.
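The retry contract can be sketched with a small self-contained mock. This is not the library's implementation, just an illustration of the semantics described above: a thrown error triggers another attempt, and the task fails once maxRetries attempts are exhausted.

```javascript
// Illustrative sketch of the retry contract (NOT @prsm/queue internals):
// a thrown error triggers a retry; after maxRetries attempts the task fails.
async function runWithRetries(handler, payload, maxRetries) {
  let attempts = 0
  for (;;) {
    attempts++
    try {
      return { status: 'complete', result: await handler(payload), attempts }
    } catch (error) {
      if (attempts >= maxRetries) {
        return { status: 'failed', error, attempts }
      }
      // otherwise fall through and try again
    }
  }
}

// A handler that fails twice, then succeeds on the third attempt:
let calls = 0
const flaky = async () => {
  calls++
  if (calls < 3) throw new Error('transient')
  return 'ok'
}

const outcome = await runWithRetries(flaky, {}, 3)
// outcome.status === 'complete', outcome.attempts === 3
```

With maxRetries of 2 the same handler would exhaust its attempts before the third call and end up in the failed state instead.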

Grouped Queues

Isolated concurrency per key, perfect for per-tenant rate limiting.

const queue = new Queue({
  groups: { concurrency: 1, delay: '50ms' }
})

queue.process(async (payload) => {
  return await callExternalAPI(payload)
})

await queue.ready()

await queue.group('tenant-123').push({ action: 'sync' })
await queue.group('tenant-456').push({ action: 'sync' })

Each tenant processes independently. One slow tenant won't block others.

Events

queue.on('new', ({ task }) => {})
queue.on('complete', ({ task, result }) => {})
queue.on('retry', ({ task, error, attempt }) => {})
queue.on('failed', ({ task, error }) => {})
queue.on('drain', () => {})

Task Object

{
  uuid: string,
  payload: any,
  createdAt: number,
  groupKey?: string,  // present when pushed via group()
  attempts: number
}

Rate Limiting Example

20 LLM calls/sec per tenant:

const queue = new Queue({
  groups: { concurrency: 20, delay: '50ms' },
  maxRetries: 3
})

queue.process(async ({ prompt }) => {
  return await llm.complete(prompt)
})

app.post('/api/generate', async (req, res) => {
  const { tenantId, prompt } = req.body
  const taskId = await queue.group(tenantId).push({ prompt })
  res.json({ queued: true, taskId })
})

WebSocket Integration with mesh

Queue events are local-only: only the server that processes a task emits complete/failed. Use mesh to push results to connected clients in real time.

Send results to a specific client:

import Queue from '@prsm/queue'
import { MeshServer } from '@mesh-kit/server'

const mesh = new MeshServer({ redis: { host: 'localhost', port: 6379 } })
const queue = new Queue({ groups: { concurrency: 1 } })

queue.process(async (payload) => {
  return await generateReport(payload)
})

queue.on('complete', ({ task, result }) => {
  mesh.sendTo(task.payload.connectionId, 'job:complete', result)
})

queue.on('failed', ({ task, error }) => {
  mesh.sendTo(task.payload.connectionId, 'job:failed', { error: error.message })
})

mesh.exposeCommand('generate-report', async (ctx) => {
  const taskId = await queue.group(ctx.connection.id).push({
    connectionId: ctx.connection.id,
    ...ctx.payload,
  })
  return { queued: true, taskId }
})

await queue.ready()
await mesh.listen(8080)

Both queue and mesh can share the same Redis instance; their keys are namespaced (queue:* vs mesh:*), so there are no conflicts.

Horizontal Scaling

Multiple servers can push to and pull from the same queue. Redis coordinates them via atomic operations, so no task is processed twice.
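The "no duplicate processing" guarantee rests on atomic claiming. The sketch below uses an in-memory Map as a stand-in for Redis (purely illustrative, not the library's code): each worker tries to atomically move a task from pending to claimed, and only the winner processes it.

```javascript
// Illustrative sketch of atomic task claiming (NOT @prsm/queue internals).
// In the real library, an atomic Redis operation plays the role of this
// check-and-move: a task leaves "pending" exactly once.
const pending = new Map([['task-1', { userId: 123 }]])
const claimed = new Map()

function claim(taskId) {
  // Single-threaded stand-in for an atomic Redis command: the existence
  // check and the move happen as one indivisible step.
  if (!pending.has(taskId)) return null
  const payload = pending.get(taskId)
  pending.delete(taskId)
  claimed.set(taskId, payload)
  return payload
}

const workerA = claim('task-1') // wins the claim, gets the payload
const workerB = claim('task-1') // task already gone, gets null
```

Because the claim is atomic, two workers racing for the same task can never both receive it, which is what makes it safe to run the same worker code on many servers.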

Cleanup

await queue.close()

License

MIT