
auto-builder-nodes

v1.0.0


Building nodes using Auto-Builder SDK

Lightweight TypeScript toolkit for authoring custom Auto-Builder workflow nodes (plugins).

Features

  • 🔒 Sandbox-first execution – nodes run inside a VM2 sandbox by default.
  • 🧩 Tiny surface: just extend BaseNodeExecutor and export it.
  • 📊 Built-in telemetry & test/coverage enforcement.
  • 🛠️ npx auto-builder-sdk init <my-plugin> scaffolds a ready-to-run plugin.

Quick start

# create plugin folder & scaffold files
npx auto-builder-sdk init my-awesome-plugin
cd my-awesome-plugin

# develop
npm run dev        # watch mode (tsc)

# run unit tests (≥ 80 % coverage required)
npm test

# publish to npm (prepublishOnly → tests+coverage)
npm publish --access public

Scaffold output (SDK ≥ 1.0.9)

The generated package.json now includes:

{
  "engines": { "auto-builder": "^1.0.0", "node": ">=18" },
  "dependencies": { "auto-builder-sdk": "^1.0.9" },
  "devDependencies": {
    "@types/node": "^24.0.7",
    "typescript": "^5.8.3",
    "vitest": "^1.6.1",
    "@vitest/coverage-v8": "^1.6.1"
  }
}

Other template tweaks:

  • vitest.config.ts marks the SDK as external so esbuild keeps it as ESM.
  • tsconfig.json includes "types": ["node"] so the CommonJS stub compiles cleanly.
  • Sample src/index.ts uses definePlugin() and a full static readonly definition block – it compiles and runs out-of-the-box.

Upgrade by simply reinstalling the latest SDK:

npm i auto-builder-sdk@latest -g   # or use npx for each run

Anatomy of a plugin

my-awesome-plugin/
 ├─ src/
 │   └─ index.ts          # exports your node classes
 ├─ package.json          # flagged with "auto-builder-plugin": true
 └─ tsconfig.json

Plugin manifest (definePlugin) & node metadata

Each SDK project exports one default plugin manifest created with definePlugin(). The object is validated at build-time by Zod so you can't accidentally publish an invalid plugin.

export default definePlugin({
  name: 'acme-erp',              // npm name or folder name
  version: '1.2.3',
  main: './dist/index.js',       // compiled entry file
  nodes: ['CreateOrder', 'Ping'],
  engines: { 'auto-builder': '^1.0.0' },
  sandbox: { timeoutMs: 20_000 }, // optional overrides
});

Inside a node you can expose extra metadata used by the builder UI:

static readonly definition = {
  displayName: 'Create Order',          // shown in node palette
  icon: 'shopping-cart',                // lucide icon name
  group: ['action'],
  version: 1,
  description: 'Create a new order in Acme ERP',
  category: 'action',
  inputs: ['main'], outputs: ['main'],
  properties: [ /* … */ ],
};

Only properties is mandatory — everything else falls back to reasonable defaults if omitted.
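
For instance, a sketch of the smallest legal definition block – only properties is set, everything else is left to the engine (the exact fallback values are an assumption here):

```typescript
// Smallest valid definition (sketch): only `properties` is present.
// displayName, icon, group, inputs/outputs, etc. fall back to engine defaults.
const minimalDefinition = {
  properties: [
    { displayName: 'Message', name: 'message', type: 'string', default: '' },
  ],
} as const;
```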

Minimal node executor

import { BaseNodeExecutor, definePlugin } from 'auto-builder-sdk';

export class HelloNode extends BaseNodeExecutor {
  static type = 'hello.world';   // machine id (shown in UI)
  readonly nodeType = HelloNode.type;

  async execute(node, input, ctx) {
    return [
      { json: { message: `Hello ${ctx.workflowId}!` }, binary: {} },
    ];
  }
}

export default definePlugin({
  name: 'my-awesome-plugin',
  version: '0.0.1',
  main: './dist/index.js',
  nodes: ['HelloNode'],
  engines: { 'auto-builder': '^1.0.0' },
});

Dynamic parameter options (dropdown loaders)

Auto-Builder supports dynamic dropdowns in the property inspector.
Define them by adding typeOptions.loadOptionsMethod to a property and implementing a static (or instance) method on your executor class.

export class HelloNode extends BaseNodeExecutor {
  static readonly type = 'hello.world';
  readonly nodeType = HelloNode.type;

  // 1️⃣ Loader method – can be async and may use credentials/params
  static async listGreetings(): Promise<Array<{ name: string; value: string }>> {
    return [
      { name: 'Hi',     value: 'hi'     },
      { name: 'Hello',  value: 'hello'  },
      { name: 'Howdy',  value: 'howdy'  },
    ];
  }

  // 2️⃣ Reference the method in the property definition
  static readonly definition = {
    inputs: ['main'], outputs: ['main'],
    properties: [
      {
        displayName: 'Greeting',
        name: 'greeting',
        type: 'options',
        default: 'hello',
        typeOptions: {
          loadOptionsMethod: 'listGreetings',
        },
      },
    ],
  } as const;
}

The engine discovers the method automatically – no server-side changes required.


Debugging a plugin node

  1. Compile with source maps ("sourceMap": true, "inlineSources": true in tsconfig.json).

  2. Start the backend in debug/watch mode:

    NODE_OPTIONS="--enable-source-maps --inspect=9229" \
    PLUGINS_ENABLED=true \
    PLUGIN_WATCH=true \
    npm run dev
  3. Attach VS Code (Run → "Node.js attach" → localhost:9229). Set breakpoints in your TypeScript sources – thanks to source maps they will hit inside the sandbox.

  4. Disable the sandbox temporarily (faster debugging):

    • global: PLUGIN_SAFE_MODE=false npm run dev
    • per plugin: add { "sandbox": { "enabled": false } } in definePlugin().

Logging & error handling

import { log, NodeOperationError, NodeApiError } from 'auto-builder-sdk';

log.info('Fetching data', { url });

if (!apiKey) {
  throw new NodeOperationError(node, 'Missing API key');
}
  • When the plugin runs inside Auto-Builder the SDK logger proxies to the main Winston logger so your messages appear in the service log files and monitoring dashboards.
  • In stand-alone tests the logger falls back to console.*.
  • Throwing either of the SDK error classes lets the engine classify the failure (operation vs API) but is not mandatory – any Error works.

Importing engine types

All public interfaces are bundled with the SDK – no need to install the auto-builder package:

import type { INode, IExecutionContext, INodeExecutionData } from 'auto-builder-sdk';

The file lives at auto-builder-sdk/dist/auto-builder-sdk/src/ab-types.d.ts and is kept in sync with the backend on every release.


Credential definitions (authentication)

Added in SDK 0.1.x – no engine changes required.

Nodes can declare the kind of credential(s) they need via the credential registry. A credential definition is just a JSON-ish object that describes the fields users must enter and an optional validate() function that performs a live API check.

import { registerCredential, type CredentialDefinition } from 'auto-builder-sdk';

const jiraApiToken: CredentialDefinition = {
  name: 'jira',                    // identifier referenced by nodes
  displayName: 'Jira Cloud',
  properties: [
    { name: 'domain',   displayName: 'Domain',   type: 'string',   required: true },
    { name: 'email',    displayName: 'Email',    type: 'string',   required: true },
    { name: 'apiToken', displayName: 'API Token',type: 'password', required: true },
  ],
  validate: async (data) => {
    const { domain, email, apiToken } = data as Record<string,string>;
    const auth = 'Basic ' + Buffer.from(`${email}:${apiToken}`).toString('base64');
    const res  = await fetch(`${domain}/rest/api/3/myself`, { headers: { Authorization: auth } });
    if (!res.ok) throw new Error(`Auth failed (${res.status})`);
  },
};

registerCredential(jiraApiToken);

Expose it on your node:

static readonly definition = {
  // …
  credentials: [ { name: 'jira', required: true } ],
};

Multiple schemes? Just register multiple definitions (e.g. jiraOAuth2, jiraBasic) and list them all in credentials; the builder UI will let users pick the type they want.

When the node runs you retrieve whatever credential the user selected:

const creds = await this.getCredentials(node.credentials.jira);
console.log(creds.type);  // "jira", "jiraOAuth2", …

No core- or UI-level changes are needed – adding a new credential definition is as simple as shipping the file with your plugin.

Unit test example (Vitest)

import { expect, it } from 'vitest';
import { HelloNode, makeStubContext, makeStubNode } from '../src';

it('returns greeting', async () => {
  const nodeImpl = new HelloNode();
  const ctx = makeStubContext({ workflowId: 'wf-123' });
  const nodeDef = makeStubNode(HelloNode.type);

  const res = await nodeImpl.execute(nodeDef, [], ctx);

  expect(res[0].json.message).toBe('Hello wf-123!');
});

Testing helpers

The SDK exposes two utilities to remove boilerplate when writing tests:

import { makeStubContext, makeStubNode } from 'auto-builder-sdk';

const ctx = makeStubContext();
const nodeDef = makeStubNode('my.node');

Both helpers accept a partial override so you can customise only the fields you care about.
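
The override mechanics can be sketched in a few lines (StubContext here is an illustrative stand-in, not the SDK's real interface):

```typescript
// Sketch of the partial-override pattern the stub helpers follow:
// sensible defaults, shallow-merged with whatever the caller supplies.
interface StubContext {
  workflowId: string;
  runId: string;
}

function makeStub(overrides: Partial<StubContext> = {}): StubContext {
  return { workflowId: 'wf-test', runId: 'run-1', ...overrides };
}

const custom = makeStub({ workflowId: 'wf-123' });
// custom.workflowId is overridden; custom.runId keeps its default
```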

Coverage & CI script

Every scaffold includes a vitest.config.ts and the matching @vitest/coverage-v8 dev-dependency. Two npm scripts are generated:

"test":   "vitest",                // watch mode – fast dev loop
"verify": "vitest run --coverage"  // used by prepublishOnly & CI
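
A vitest.config.ts enforcing the 80 % gate might look like the sketch below (option names follow Vitest 1.x; treat them as an assumption if your scaffold pins a different version):

```typescript
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    coverage: {
      provider: 'v8', // matches the @vitest/coverage-v8 dev-dependency
      thresholds: { lines: 80, functions: 80, branches: 80, statements: 80 },
    },
  },
});
```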

Security & sandboxing

  • The Auto-Builder engine executes each plugin in a VM2 sandbox, limited by:
    • Time-out (default 30 000 ms)
    • Memory (default 64 MB)
  • Per-plugin overrides via sandbox field:
"sandbox": { "timeoutMs": 10000, "memoryMb": 32 }
  • Global flag PLUGIN_SAFE_MODE=false (engine env) disables sandbox (dev only).

Shared database client (Prisma)

The SDK exposes getDb() which returns the same PrismaClient instance the Auto-Builder backend already uses. That means plugin nodes can run SQL without opening their own connection pools or adding the @prisma/client dependency.

import { getDb, BaseNodeExecutor } from 'auto-builder-sdk';

export class ListUsersNode extends BaseNodeExecutor {
  static readonly type = 'db.users.list';
  readonly nodeType   = ListUsersNode.type;

  async execute() {
    const db = getDb();                   // shared PrismaClient
    const rows = await db.user.findMany();
    return [{ json: rows, binary: {} }];
  }
}

Below are minimal, full-node examples for several popular databases. They all follow the same pattern:

  1. Register a credential definition for the engine.
  2. List that credential type in static definition.credentials.
  3. Inside execute() pick the right driver, feed it the decrypted data and run the query.

PostgreSQL / MySQL (using pg or mysql2)

import { BaseNodeExecutor, registerCredential } from 'auto-builder-sdk';
import { createPool } from 'mysql2/promise';   // or `pg` / `knex`

registerCredential({
  name: 'mysql', displayName: 'MySQL',
  properties: [
    { name: 'host',     displayName: 'Host',     type: 'string',  required: true },
    { name: 'port',     displayName: 'Port',     type: 'number',  required: true, default: 3306 },
    { name: 'database', displayName: 'Database', type: 'string',  required: true },
    { name: 'username', displayName: 'Username', type: 'string',  required: true },
    { name: 'password', displayName: 'Password', type: 'password', required: true },
  ],
});

export class MySqlQueryNode extends BaseNodeExecutor {
  static readonly type = 'mysql.query';  readonly nodeType = MySqlQueryNode.type;

  static readonly definition = {
    credentials: [{ name: 'mysql', required: true }],
    properties: [{ name: 'sql', displayName: 'SQL', type: 'string', required: true }],
  } as const;

  async execute(node) {
    const { sql } = node.parameters as { sql: string };
    const creds = await this.getCredentials(node.credentials.mysql);

    const pool = createPool({
      host: creds.data.host,
      port: creds.data.port,
      user: creds.data.username,
      password: creds.data.password,
      database: creds.data.database,
    });

    const [rows] = await pool.query(sql);
    await pool.end();

    return [{ json: rows, binary: {} }];
  }
}

MongoDB (using mongodb)

import { MongoClient } from 'mongodb';
import { BaseNodeExecutor, registerCredential } from 'auto-builder-sdk';

registerCredential({
  name: 'mongo', displayName: 'MongoDB',
  properties: [
    { name: 'uri',      displayName: 'Connection URI', type: 'string', required: true },
    { name: 'database', displayName: 'Database',       type: 'string', required: true },
  ],
});

export class MongoFindNode extends BaseNodeExecutor {
  static readonly type = 'mongo.find';  readonly nodeType = MongoFindNode.type;
  static readonly definition = {
    credentials: [{ name: 'mongo', required: true }],
    properties: [
      { name: 'collection', displayName: 'Collection', type: 'string', required: true },
      { name: 'query',      displayName: 'Query (JSON)', type: 'string', required: true },
    ],
  } as const;

  async execute(node) {
    const { collection, query } = node.parameters as any;
    const creds = await this.getCredentials(node.credentials.mongo);

    const client = await MongoClient.connect(creds.data.uri);
    const docs = await client.db(creds.data.database)
                          .collection(collection)
                          .find(JSON.parse(query)).toArray();
    await client.close();

    return [{ json: docs, binary: {} }];
  }
}

Microsoft SQL Server (using mssql)

import sql from 'mssql';
import { BaseNodeExecutor, registerCredential } from 'auto-builder-sdk';

registerCredential({
  name: 'mssql', displayName: 'MS SQL Server',
  properties: [
    { name: 'server',   displayName: 'Server',   type: 'string',   required: true },
    { name: 'user',     displayName: 'User',     type: 'string',   required: true },
    { name: 'password', displayName: 'Password', type: 'password', required: true },
    { name: 'database', displayName: 'Database', type: 'string',   required: true },
  ],
});

export class MsSqlQueryNode extends BaseNodeExecutor {
  static readonly type = 'mssql.query'; readonly nodeType = MsSqlQueryNode.type;
  static readonly definition = {
    credentials: [{ name: 'mssql', required: true }],
    properties: [{ name: 'sql', displayName: 'SQL', type: 'string', required: true }],
  } as const;

  async execute(node) {
    const { sql: statement } = node.parameters as { sql: string };
    const creds = await this.getCredentials(node.credentials.mssql);

    await sql.connect({
      server: creds.data.server,
      user: creds.data.user,
      password: creds.data.password,
      database: creds.data.database,
      options: { encrypt: true, trustServerCertificate: true },
    });

    const result = await sql.query(statement);
    await sql.close();

    return [{ json: result.recordset, binary: {} }];
  }
}

Other engines / cloud warehouses

The pattern is identical for any future database. Two quick sketches:

ClickHouse (columnar OLAP)

import { createClient } from '@clickhouse/client';
registerCredential({
  name: 'clickhouse', displayName: 'ClickHouse',
  properties: [
    { name: 'url',  displayName: 'HTTP URL', type: 'string',  required: true },
    { name: 'user', displayName: 'User',     type: 'string' },
    { name: 'pass', displayName: 'Password', type: 'password' },
  ],
});

// inside execute()
const ch = createClient({
  host: creds.data.url,
  username: creds.data.user,
  password: creds.data.pass,
});
const rows = await ch.query({ query: sql, format: 'JSONEachRow' });

BigQuery (Google Cloud)

import { BigQuery } from '@google-cloud/bigquery';
registerCredential({
  name: 'bigquery', displayName: 'BigQuery',
  properties: [
    { name: 'projectId', displayName: 'Project ID', type: 'string', required: true },
    { name: 'jsonKey',   displayName: 'Service Account JSON', type: 'string', required: true, typeOptions:{rows:8} },
  ],
});

// inside execute()
const client = new BigQuery({
  projectId: creds.data.projectId,
  credentials: JSON.parse(creds.data.jsonKey),
});
const [rows] = await client.query(sql);

Any engine that reuses the Postgres/MySQL wire-protocol (CockroachDB, TimescaleDB, Aurora-PG/MySQL) can simply adopt the existing credential & node without code changes.
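
As an illustration, the credential-to-connection mapping is pure and identical across wire-compatible engines. A hypothetical helper (not part of the SDK) that any such node could share:

```typescript
// Hypothetical helper: turn the shared credential fields into a standard
// Postgres connection string. Works unchanged for CockroachDB, TimescaleDB,
// Aurora-PG, … – only host/port values differ.
function toConnectionString(c: {
  host: string;
  port: number;
  database: string;
  username: string;
  password: string;
}): string {
  const user = encodeURIComponent(c.username);
  const pass = encodeURIComponent(c.password);
  return `postgresql://${user}:${pass}@${c.host}:${c.port}/${c.database}`;
}

// e.g. feed the result to `new Pool({ connectionString })` from `pg`
```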

Tip For databases with native Prisma support (PostgreSQL, MySQL, SQLite, SQL Server, MongoDB) you can also generate a separate client in your plugin and skip the manual driver code. Use getDb() when you want to run queries against the engine's primary database.


Sandboxing options per plugin

VM2 limits have sensible defaults (30 s / 64 MB). Override them globally via env-vars or per plugin in the manifest:

{
  "sandbox": {
    "enabled": true,       // false disables VM2 (DEV *only*)
    "timeoutMs": 10000,    // 10 seconds
    "memoryMb": 32         // 32 MB
  }
}

Developers can temporarily set PLUGIN_SAFE_MODE=false on the backend to turn all sandboxes off while debugging.


Telemetry hooks

You can subscribe to runtime metrics:

import { pluginTelemetry } from 'auto-builder/telemetry/plugin-telemetry';

pluginTelemetry.on('metric', (m) => console.log(m));

Publishing & versioning workflow

  1. Bump version (npm version patch|minor|major). Follow semver:
     • Patch – docs/typos & additive helpers.
     • Minor – new capabilities (dynamic loaders, logger).
     • Major – breaking API changes.
  2. npm publish --access public (the prepublish script runs tests & coverage).
  3. In your Auto-Builder deployment update the dependency:
npm i auto-builder-sdk@latest   # service repo

Tip: Use Renovate or Dependabot so services stay in sync automatically.


Best practices

  • Always write unit tests with ≥80 % coverage – enforced.
  • Validate external inputs with zod inside execute().
  • Keep network/FS access minimal; prefer the SDK's helpers.
  • Publish with semver and respect the peer range in engines.
  • Document node properties with JSDoc – used by the marketplace doc generator.

Resolving parameters – quick recipes

There are two ways to evaluate {{ }} expressions.

  1. Inside your node executor – simply call this.resolveParameters(...).

    export class MyNode extends BaseNodeExecutor {
      readonly nodeType = 'demo.my';
    
      async execute(node, inputs, ctx) {
        const opts = this.resolveParameters(node.parameters, ctx);
        // use opts ...
        return inputs;
      }
    }
  2. Outside executors (tests/helpers) – use ParameterResolver directly.

    import { ParameterResolver } from 'auto-builder-sdk';
    
    const ctx = { /* IExecutionContext stub */ } as any;
    const params = { msg: 'Hello {{ $json.name }}' };
    const out = ParameterResolver.resolve(params, ctx);

Same resolver, two access paths.
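
For intuition only, the substitution the resolver performs on simple $json references can be approximated in a few lines (a toy re-implementation, far less capable than the real ParameterResolver):

```typescript
// Toy sketch of {{ }} resolution: replace {{ $json.<key> }} with the
// matching field of the current item. Illustrative only.
function resolveToy(template: string, json: Record<string, unknown>): string {
  return template.replace(/\{\{\s*\$json\.(\w+)\s*\}\}/g, (_m, key: string) =>
    String(json[key] ?? ''),
  );
}

// resolveToy('Hello {{ $json.name }}', { name: 'Ada' }) → 'Hello Ada'
```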

License

MIT