Lambda Invoke Queue

Execute background jobs in AWS Lambda after sending the response using Lambda Extensions. Perfect for flushing telemetry without impacting response time.

Installation

npm install lambda-invoke-queue

Quick Start

import BackgroundJobs from 'lambda-invoke-queue';

export const handler = async (event, context) => {
  const jobs = new BackgroundJobs(context);
  
  // Your business logic
  const result = await processRequest(event);
  
  // Add jobs to run after response
  jobs
    .add(async () => {
      await langfuse.flush();
      console.log('Langfuse flushed');
    })
    .add(async () => {
      await otel.flush();
      console.log('OpenTelemetry flushed');
    });
  
  // Signal that handler has ended
  jobs.end();
  
  // Return immediately - jobs will run in background
  return {
    statusCode: 200,
    body: JSON.stringify(result)
  };
};

Docker Setup

Add the following to your Dockerfile after the npm install step:

COPY node_modules/lambda-invoke-queue/extension/lambda-invoke-queue /opt/extensions/
RUN chmod +x /opt/extensions/lambda-invoke-queue

Complete Dockerfile Example

FROM public.ecr.aws/lambda/nodejs:20

WORKDIR ${LAMBDA_TASK_ROOT}
COPY package*.json ./
RUN npm ci --omit=dev

COPY . .

# Set up the lambda-invoke-queue Lambda Extension
COPY node_modules/lambda-invoke-queue/extension/lambda-invoke-queue /opt/extensions/
RUN chmod +x /opt/extensions/lambda-invoke-queue

CMD ["index.handler"]

How it Works

  1. The handler creates a BackgroundJobs instance and adds jobs with .add()
  2. The handler calls .end() to signal completion (required)
  3. Your response is sent immediately (non-blocking)
  4. The extension detects the ready signal and triggers job execution (see the sketch after this list)
  5. Jobs execute after the response, in the same container
  6. The execution environment is frozen once the jobs complete
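For orientation, the sketch below shows the general shape of an external Lambda Extension: a separate process that registers with the Lambda Extensions API and then blocks on the event poll. This is illustrative only and is not the extension shipped with this package; in particular, how the handler's ready signal reaches the extension is internal to lambda-invoke-queue.

// extension-sketch.mjs - illustrative only, not the shipped extension
const API = `http://${process.env.AWS_LAMBDA_RUNTIME_API}/2020-01-01/extension`;

// Register as an external extension interested in INVOKE and SHUTDOWN events.
// The extension name must match the filename under /opt/extensions/.
const registerResponse = await fetch(`${API}/register`, {
  method: 'POST',
  headers: {
    'Lambda-Extension-Name': 'lambda-invoke-queue',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({ events: ['INVOKE', 'SHUTDOWN'] })
});
const extensionId = registerResponse.headers.get('lambda-extension-identifier');

while (true) {
  // Blocks until the next INVOKE or SHUTDOWN event for this environment
  const event = await fetch(`${API}/event/next`, {
    headers: { 'Lambda-Extension-Identifier': extensionId }
  }).then((res) => res.json());

  if (event.eventType === 'SHUTDOWN') break;

  // The runtime is now handling the request. Once the handler signals that
  // it has ended (step 4 above), the queued jobs run in the handler process
  // before the environment is frozen again.
}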

Security Features

  • Path injection protection: Request IDs are validated as UUIDs only (see the sketch after this list)
  • Memory leak prevention: Automatic job cleanup after 5 minutes
  • Input validation: All file operations use sanitized paths
  • Production-ready: Enterprise-grade security built-in
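For reference, the UUID check described above is roughly this shape; the snippet below is a minimal sketch of that kind of validation, not the package's internal code:

// Accept only canonical UUID-formatted request IDs before building file paths
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function assertRequestId(requestId) {
  if (!UUID_RE.test(requestId)) {
    throw new Error('Invalid request ID');
  }
  return requestId;
}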

Common Use Cases

Telemetry Flushing

const jobs = new BackgroundJobs(context);
jobs.add(async () => {
  await Promise.all([
    langfuse.flush(),
    posthog.flush(), 
    datadog.flush()
  ]);
  console.log('All telemetry flushed');
});
jobs.end();

Cleanup Tasks

const jobs = new BackgroundJobs(context);
jobs
  .add(async () => await cleanupTempFiles())
  .add(async () => await closeConnections());
jobs.end();

Error Handling

const jobs = new BackgroundJobs(context);
jobs.add(async () => {
  try {
    await riskyOperation();
  } catch (error) {
    console.error('Background job failed:', error);
    // Job failures don't affect the response
  }
});
jobs.end();

Streaming Response

import { streamifyResponse } from 'lambda-stream';
import BackgroundJobs from 'lambda-invoke-queue';

export const handler = streamifyResponse(async (event, responseStream, context) => {
  const jobs = new BackgroundJobs(context);
  
  // Stream response chunks
  responseStream.write('Starting processing...\n');
  await processFirstBatch();
  responseStream.write('Batch 1 complete...\n');
  await processSecondBatch();
  responseStream.write('Batch 2 complete...\n');
  
  // Add background jobs
  jobs.add(async () => {
    await telemetry.flush();
    console.log('Telemetry flushed after streaming');
  });
  
  // End streaming
  responseStream.end('Processing complete!\n');
  
  // Signal handler completion - jobs will run after stream ends
  jobs.end();
});

API Reference

BackgroundJobs

Class for managing background jobs using Lambda Extensions.

Constructor

new BackgroundJobs(context)
  • context: AWS Lambda context object (required)

Methods

  • add(fn): Add an async function to execute in the background. Returns this for chaining.
  • end(): Signal that the handler has ended. Must be called to trigger job execution.

Performance

  • Response Time: only your handler logic (jobs don't block the response)
  • Total Duration: handler time + job execution time (see the sketch below)
  • CloudWatch Metrics:
    • Duration: total time, including jobs
    • PostRuntimeExtensionsDuration: time spent on jobs
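To see this split for yourself, a minimal sketch (using the same placeholder telemetry client as the other examples in this README) is to log timestamps from the handler and from a job, then compare them with the Duration and PostRuntimeExtensionsDuration values Lambda reports:

const jobs = new BackgroundJobs(context);
const handlerFinishedAt = Date.now();

jobs.add(async () => {
  // Runs after the response; this delay shows up in
  // PostRuntimeExtensionsDuration, not in client-perceived latency.
  await telemetry.flush();
  console.log(`Job ran roughly ${Date.now() - handlerFinishedAt} ms after the handler finished`);
});

jobs.end();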

Debug Logging

Enable debug logging with the DEBUG environment variable:

DEBUG=true

This prints internal processing, job execution, and timing information.
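For the container image setup above, one place to set it (a minimal sketch; you can equally set it in the function's environment configuration) is the Dockerfile:

# Enable verbose logging from lambda-invoke-queue
ENV DEBUG=true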

TypeScript Support

Full TypeScript definitions included:

import BackgroundJobs from 'lambda-invoke-queue';
import type { LambdaContext } from 'lambda-invoke-queue';

export const handler = async (event: any, context: LambdaContext) => {
  const jobs = new BackgroundJobs(context);
  
  // Business logic
  const result = await processEvent(event);
  
  // Add background jobs
  jobs.add(async () => {
    await telemetry.flush();
    console.log('Telemetry flushed successfully');
  });
  
  // Signal handler completion
  jobs.end();
  
  // Return immediately - jobs run in background
  return { statusCode: 200, body: JSON.stringify(result) };
};

Important Notes

  • Non-blocking: Jobs never impact response time to the client
  • Same process: Jobs run in the handler process, with access to the same in-memory objects
  • Total duration: Job execution time is added to Lambda duration metrics
  • Request isolation: Each request's jobs are isolated by requestId
  • Context required: Always pass the Lambda context for proper isolation
  • Auto-cleanup: Memory leaks are prevented by a 5-minute job timeout
  • Zero dependencies: Ultra-lightweight and secure

Testing

npm test

License

MIT