Promise Performance Monitoring

The Promise Perf library allows you to instrument a Promise chain - any sequence of calls that starts with an await or new Promise - to get a report of where time was spent in that chain.

The library is especially useful when tracing HTTP calls to determine which operations are the most expensive and the best targets for caching, or for any complicated long-duration operation that is not computationally bound (where traditional performance analysis tools shine) but instead spends most of its time waiting on external services.

Features

Promise Perf was written to satisfy a variety of needs when developing async-based systems bound more by network I/O or remote-service performance than by computational workloads. As such, it supports:

  • Capturing traces from Promise hierarchies via PromisePerf.
  • Capturing traces from async-based Express handler chains via ExpressPerf.
  • Leveraging source-map, if installed, to translate compiled JavaScript with .js.map files to their original .ts files.
  • Pretty-printing annotated source files when tracing specific requirements.

How to use it

Start by installing the promise_perf package:

npm install promise_perf

Then, import the package into the module containing the top-level Promise or async function you wish to measure:

import { IHookRecords, PromisePerf, ExpressPerf, annotateSource, fileToString } from 'promise_perf';

Wrap the invocation of an asynchronous function in a PromisePerf.trace call, and parse the resulting records as desired.

await PromisePerf.trace(yourAsyncFunction, (rec: IHookRecords) => (records = rec));

Or add it as a middleware to your Express router:

app.use(
  ExpressPerf.middleware(
    (req: express.Request) => true,
    (req: express.Request, res: express.Response, rec: IHookRecords) => (records = rec)
  )
);

All Promises created during the invocation of that function will be tracked in the rec object, passed to the supplied callback on completion of the function. This can then be passed to several utility functions to provide a pretty annotated output of one or more desired source files:

console.log(
  fileToString(
    Object.values(
      await annotateSource(records, { pathFilter: (path) => path.endsWith('basic.js') })
    )[0]
  )
);

How it works

Promise Perf uses both async_hooks and AsyncLocalStorage to track the lifespan of individual Promise objects within a particular Promise chain.
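
In broad strokes, the technique looks something like the sketch below. This is a simplified illustration, not the library's actual implementation: the names store, Timing, and traceChain are hypothetical, and the real records capture much more than start and end times. An AsyncLocalStorage store scopes a trace to a single chain, while async_hooks callbacks record when each PROMISE resource is created and resolved.

import { createHook, AsyncLocalStorage } from 'async_hooks';

type Timing = { start: number; end?: number };

// Holds the timing map for whichever chain is currently being traced.
const store = new AsyncLocalStorage<Map<number, Timing>>();

createHook({
  init(asyncId, type) {
    // Only record Promises created while a trace is active.
    const timings = store.getStore();
    if (timings && type === 'PROMISE') {
      timings.set(asyncId, { start: Date.now() });
    }
  },
  promiseResolve(asyncId) {
    const timings = store.getStore();
    const entry = timings?.get(asyncId);
    if (entry) entry.end = Date.now();
  },
}).enable();

// Running the traced function inside store.run() ensures that only the
// Promises belonging to this chain end up in the map.
async function traceChain(fn: () => Promise<unknown>): Promise<Map<number, Timing>> {
  const timings = new Map<number, Timing>();
  await store.run(timings, fn);
  return timings;
}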

Additionally, source-map is used to optionally support pretty-printing output based on the .js.map files emitted by the TypeScript compiler.
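
For reference, mapping a position in compiled JavaScript back to its original TypeScript location with source-map looks roughly like this. The helper toOriginalPosition is hypothetical and independent of promise_perf's internals; it only demonstrates the underlying package.

import { readFileSync } from 'fs';
import { SourceMapConsumer } from 'source-map';

// Translate a line/column in compiled JS back to the original .ts source.
async function toOriginalPosition(mapPath: string, line: number, column: number) {
  const rawMap = JSON.parse(readFileSync(mapPath, 'utf8'));
  const consumer = await new SourceMapConsumer(rawMap);
  const original = consumer.originalPositionFor({ line, column });
  consumer.destroy();
  return original; // e.g. { source: 'basic.ts', line: 3, column: 2, name: null }
}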

As a package, Promise Perf has no required dependencies.

Examples

Simple Timeout

Let's start with a simple test in basic.js that includes a setTimeout as a proxy for some other piece of work:

const testFunc = async () => {
  await new Promise((resolve) =>
    setTimeout(async () => {
      resolve(0);
    }, 10)
  );
};

exports.testFunc = testFunc;

We can instrument the invocation of testFunc by wrapping it in a PromisePerf.trace call:

import { IHookRecords, PromisePerf, annotateSource, fileToString } from 'promise_perf';

let records;
await PromisePerf.trace(testFunc, (rec: IHookRecords) => (records = rec));

When the function completes and the testFunc promise resolves, the supplied callback will be invoked with the performance records as the parameter. We can save those to a local variable records and then print out the results for just the basic.js file:

console.log(
  fileToString(
    Object.values(
      await annotateSource(records, { pathFilter: (path) => path.endsWith('basic.js') })
    )[0]
  )
);

This produces the following annotated output, where the number to the left is the duration in ms, and the line of text corresponds to the line of code:

 11 |const testFunc = async () => {
 11 |  await new Promise((resolve) =>
    |    setTimeout(async () => {
    |      resolve(0);
    |    }, 10)
    |  );
    |};
    |
    |exports.testFunc = testFunc;
    |

Express Endpoint

Measuring an Express endpoint and identifying hot spots or opportunities for caching was one of the key motivations for building this library: where is most of the time being spent while servicing a request?

Using a simple express router, let's see what Promise Perf can show us:

const router = express.Router();

const work = async () => {
  await superagent.get('https://www.google.com');
  await superagent.get('https://www.amazon.com');
  await superagent.get('https://www.apple.com');
};

router.get('/', async (req, res) => {
  await new Promise((resolve) =>
    setTimeout(async () => {
      resolve(0);
    }, 100)
  );
  await work();
  await new Promise((resolve) =>
    setTimeout(async () => {
      resolve(0);
    }, 100)
  );
  res.send('All done');
});

exports.router = router;

Let's then add the Express middleware to the handling chain. The first parameter is a function that indicates whether a given request should be traced (allowing tracing to be enabled or disabled based on request parameters, for example), while the second handles the created records.

import { IHookRecords, ExpressPerf, annotateSource, fileToString } from 'promise_perf';

app.use(
  ExpressPerf.middleware(
    (req: express.Request) => true,
    (req: express.Request, res: express.Response, rec: IHookRecords) => (records = rec)
  )
);
app.use(router);
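
If you only want to trace some requests, the first parameter makes that straightforward. As a hypothetical variation, tracing could be gated on a trace query parameter:

app.use(
  ExpressPerf.middleware(
    // Only trace requests that opt in with ?trace=1.
    (req: express.Request) => req.query.trace === '1',
    (req: express.Request, res: express.Response, rec: IHookRecords) => (records = rec)
  )
);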

Then, using the same code as above, we can output the results of the operation when records is set:

     |const express = require('express');
     |const superagent = require('superagent');
     |
     |const router = express.Router();
     |
 646 |const work = async () => {
 157 |  await superagent.get('https://www.google.com');
 156 |  await superagent.get('https://www.amazon.com');
 343 |  await superagent.get('https://www.apple.com');
     |};
     |
     |router.get('/', async (req, res) => {
 202 |  await new Promise((resolve) =>
 101 |    setTimeout(async () => {
     |      resolve(0);
     |    }, 100)
     |  );
 647 |  await work();
 101 |  await new Promise((resolve) =>
 100 |    setTimeout(async () => {
     |      resolve(0);
     |    }, 100)
     |  );
     |  res.send('All done');
     |});
     |
     |exports.router = router;
     |

This allows us to easily trace out where the hotspots are, and see what the relative costs of the various pieces of work involved might be.