forgeflow

v0.6.0

TypeScript-first CI/CD pipeline compiler for multiple providers.

Forgeflow

Forgeflow is a TypeScript-first CI/CD pipeline compiler. You author pipelines as plain objects, validate them against Zod-backed contracts, and render them to multiple CI providers.

Forgeflow includes renderers for:

  • GitHub Actions
  • Bitbucket Pipelines
  • GitLab CI

Forgeflow is intentionally compiler-shaped: author -> normalize -> analyze capabilities -> lower to provider YAML.
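The compiler shape can be pictured with a small self-contained sketch. All names below (normalize, analyze, lower, the toy types) are illustrative stand-ins, not Forgeflow's internal API:

```typescript
// Illustrative sketch of the author -> normalize -> analyze -> lower shape.
// Every name here is hypothetical; Forgeflow's real passes and types differ.
type Step = { run: string };
type Pipeline = { name: string; jobs: Record<string, { steps: Step[] }> };
type Diagnostic = { level: "error" | "warning"; message: string };

function normalize(p: Pipeline): Pipeline {
  // Example normalization: drop jobs that have no steps.
  const jobs = Object.fromEntries(
    Object.entries(p.jobs).filter(([, job]) => job.steps.length > 0)
  );
  return { ...p, jobs };
}

function analyze(p: Pipeline): Diagnostic[] {
  // Example capability check: an empty pipeline is an error diagnostic.
  return Object.keys(p.jobs).length === 0
    ? [{ level: "error", message: "pipeline has no jobs" }]
    : [];
}

function lower(p: Pipeline): string {
  // Toy YAML emitter standing in for a provider renderer.
  const lines = [`name: ${p.name}`, "jobs:"];
  for (const [id, job] of Object.entries(p.jobs)) {
    lines.push(`  ${id}:`);
    for (const s of job.steps) lines.push(`    - run: ${s.run}`);
  }
  return lines.join("\n");
}

const authored: Pipeline = {
  name: "ci",
  jobs: { test: { steps: [{ run: "npm test" }] }, empty: { steps: [] } }
};
const normalized = normalize(authored);
const diagnostics = analyze(normalized);
if (diagnostics.length === 0) console.log(lower(normalized));
```

Each stage is a pure function over plain data, which is what makes the same authored pipeline renderable against multiple providers.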

Why Forgeflow

  • Type-safe pipeline authoring with a plain object DSL
  • Mandatory provider selection with provider-scoped autocomplete for extensions and raw escape hatches
  • Zod-backed env and secret contracts
  • Typed schema refs and job dependency autocomplete via defineVariables() and defineJobs()
  • Portable expressions for env, secrets, matrix values, branches, tags, and conditions
  • First-class cache, artifact, and job output support in the IR
  • Provider capability analysis with explicit diagnostics instead of silent degradation
  • Structured error classes for render, validation, module loading, and CLI failures
  • Bundled examples and a consumer CLI smoke test

Source modules

  • forgeflow/core - pipeline IR, object DSL, expressions, diagnostics, and rendering APIs
  • forgeflow/renderers/github - GitHub Actions renderer
  • forgeflow/renderers/bitbucket - Bitbucket Pipelines renderer
  • forgeflow/renderers/gitlab - GitLab CI renderer
  • forgeflow/presets - portable helpers like checkout(), setupNode(), cache helpers, Node and Python pipeline presets, and Bitbucket helpers like bitbucketClone(), bitbucketCacheDefinition(), bitbucketOnFail(), and bitbucketAfterScript()
  • forgeflow/examples - packaged pipeline examples
  • forgeflow/cli - forgeflow render, check, explain, portability, init, and migrate

The project now lives in a single TypeScript repo under src/:

  • src/core
  • src/renderers/github
  • src/renderers/bitbucket
  • src/renderers/gitlab
  • src/presets
  • src/examples
  • src/cli

Quick start

pnpm install
pnpm build
pnpm test
pnpm test:e2e

Forgeflow now uses a single-package TypeScript layout with subpath exports and TS path aliases for clean imports.

Author a pipeline

import {
  cmd,
  ForgeflowAggregateError,
  defineJobs,
  defineVariables,
  expr,
  manual,
  pipeline,
  pullRequest,
  push,
  render,
  uses
} from "forgeflow/core"
import {
  checkout,
  restoreNodeCache,
  saveNodeCache,
  setupNode,
  uploadArtifact
} from "forgeflow/presets"
import { z } from "zod"

const variables = defineVariables({
  env: z.object({
    CI: z.boolean(),
    NPM_CACHE_DIR: z.string(),
    RELEASE_CHANNEL: z.enum(["stable", "canary"]),
    RELEASE_VERSION: z.string(),
    NODE_AUTH_TOKEN: z.string()
  }),
  secrets: z.object({
    NPM_TOKEN: z.string()
  })
})

const ci = pipeline({
  provider: "github",
  name: "node-ci",
  triggers: [push(["main"]), pullRequest(["main"]), manual()],
  variables,
  env: {
    CI: true,
    NPM_CACHE_DIR: ".npm",
    RELEASE_CHANNEL: "stable"
  },
  jobs: defineJobs("github", variables, {
    build: ({ vars, secrets }) => ({
      runsOn: "ubuntu-latest",
      if: expr.eq(vars.RELEASE_CHANNEL, "stable"),
      strategy: {
        matrix: {
          values: {
            node: [18, 20, 22]
          }
        }
      },
      env: {
        NPM_CACHE_DIR: vars.NPM_CACHE_DIR,
        NODE_AUTH_TOKEN: secrets.NPM_TOKEN
      },
      outputs: {
        node_version: { run: 'node -p "process.version"' }
      },
      steps: [
        uses(checkout()),
        uses(setupNode({ version: expr.matrix("node") })),
        restoreNodeCache({
          key: "npm-cache",
          packageManager: "npm",
          path: vars.NPM_CACHE_DIR
        }),
        {
          kind: "run",
          name: "Install",
          run: [cmd`npm ci --cache "${vars.NPM_CACHE_DIR}"`]
        },
        { kind: "run", name: "Test", run: ["npm test"] },
        saveNodeCache({
          key: "npm-cache",
          packageManager: "npm",
          path: vars.NPM_CACHE_DIR
        }),
        uploadArtifact({ name: "coverage", path: "coverage" })
      ]
    }),
    publish: ({ needs, vars }) => ({
      runsOn: "ubuntu-latest",
      needs: [needs.build],
      env: {
        RELEASE_VERSION: expr.jobOutput(needs.build, "node_version")
      },
      steps: [
        uses(checkout()),
        {
          kind: "run",
          name: "Publish summary",
          run: [
            cmd`echo "publishing ${vars.RELEASE_VERSION} from ${vars.RELEASE_CHANNEL}"`
          ]
        }
      ]
    })
  })
})

try {
  const result = render(ci)
  console.log(result.output)
  console.log(result.diagnostics)
} catch (error) {
  if (error instanceof ForgeflowAggregateError) {
    console.error(error.message)
    console.error(error.diagnostics)
  } else {
    throw error
  }
}

Variable contracts

Use defineVariables() with pipeline({ provider, variables }) to declare env and secret contracts.

  • use variables.refs.env.NAME for typed env references
  • use variables.refs.secrets.NAME for typed secret references
  • use defineJobs("github", variables, { publish: ({ needs }) => ... }) to scope typed jobs to a provider while keeping typed needs.* autocomplete
  • use expr.env("NAME") and expr.secret("NAME") when the name is dynamic or not schema-backed
  • use expr.matrix("axis") for matrix references
  • use expr.jobOutput("job", "output") for upstream outputs
  • use cmd`...` tagged templates inside shell commands so refs stay typed instead of being hardcoded as raw $NAME strings

import { cmd, defineJobs, defineVariables, pipeline } from "forgeflow/core"
import { z } from "zod"

const variables = defineVariables({
  env: z.object({
    REGISTRY_IMAGE: z.string(),
    RELEASE_CHANNEL: z.enum(["stable", "canary"]),
    NODE_AUTH_TOKEN: z.string()
  }),
  secrets: z.object({
    NPM_TOKEN: z.string()
  })
})

const release = pipeline({
  provider: "github",
  name: "release",
  variables,
  env: {
    REGISTRY_IMAGE: "ghcr.io/acme/forgeflow",
    RELEASE_CHANNEL: "stable",
    NODE_AUTH_TOKEN: variables.refs.secrets.NPM_TOKEN
  },
  jobs: defineJobs("github", variables, {
    build: {
      runsOn: "ubuntu-latest",
      steps: [{ kind: "run", run: ["npm run build"] }]
    },
    publish: ({ needs, vars, secrets }) => ({
      runsOn: "ubuntu-latest",
      needs: [needs.build],
      steps: [
        {
          kind: "run",
          name: "Publish",
          run: [
            cmd`pnpm publish --tag "${vars.RELEASE_CHANNEL}" && echo "${vars.REGISTRY_IMAGE}"`
          ],
          env: {
            NODE_AUTH_TOKEN: secrets.NPM_TOKEN
          }
        }
      ]
    })
  })
})

Object DSL

Forgeflow standardizes on the object DSL. Pipelines are plain data plus helper constructors like job(...), run(...), cmd`...`, uses(...), step.*, and defineJobs(provider, ...) when you want typed needs refs plus provider-scoped autocomplete.

import { defineJobs, pipeline, push, run } from "forgeflow/core"

const ci = pipeline({
  provider: "github",
  name: "node-ci",
  triggers: [push(["main"])],
  jobs: defineJobs("github", {
    test: {
      runsOn: "ubuntu-latest",
      steps: [run("npm ci"), run("npm test")]
    },
    package: ({ needs }) => ({
      runsOn: "ubuntu-latest",
      needs: [needs.test],
      steps: [run("npm pack")]
    })
  })
})

Bitbucket helpers

Bitbucket-specific step and pipeline fields can stay typed too via forgeflow/presets helpers instead of ad hoc raw payloads. These helpers compile to Bitbucket-only extensions, so they are ignored when you explicitly render the same pipeline to another provider. Use step.raw(...) only when you want a hard provider-specific escape hatch. Available helpers include bitbucketClone(), bitbucketOptions(), bitbucketCacheDefinition(), bitbucketCaches(), bitbucketDeployment(), bitbucketSize(), bitbucketChangesetsCondition(), bitbucketOnFail(), bitbucketFailFast(), bitbucketManualTrigger(), bitbucketImage(), and bitbucketAfterScript().

import {
  cmd,
  defineJobs,
  defineVariables,
  pipeline,
  push,
  run
} from "forgeflow/core"
import {
  bitbucketAfterScript,
  bitbucketCacheDefinition,
  bitbucketCaches,
  bitbucketClone,
  bitbucketDeployment,
  bitbucketOnFail,
  bitbucketSize
} from "forgeflow/presets"
import { z } from "zod"

const variables = defineVariables({
  env: z.object({
    DEPLOY_ENV: z.enum(["staging", "production"])
  })
})

const deploy = pipeline({
  provider: "bitbucket",
  name: "deploy",
  triggers: [push(["main"])],
  variables,
  env: {
    DEPLOY_ENV: "staging"
  },
  extensions: [
    bitbucketClone({ depth: "full", lfs: true }),
    bitbucketCacheDefinition("pnpm-store", {
      path: ".pnpm-store",
      keyFiles: ["pnpm-lock.yaml"]
    })
  ],
  jobs: defineJobs("bitbucket", variables, {
    release: ({ vars }) => ({
      env: {
        DEPLOY_ENV: vars.DEPLOY_ENV
      },
      extensions: [
        bitbucketCaches(["pnpm-store"]),
        bitbucketDeployment("staging"),
        bitbucketSize("2x"),
        bitbucketOnFail({ strategy: "retry", maxRetryCount: 2 }),
        bitbucketAfterScript(cmd`echo "cleanup for ${vars.DEPLOY_ENV}"`)
      ],
      steps: [run(cmd`pnpm deploy -- --env "${vars.DEPLOY_ENV}"`)]
    })
  })
})

Advanced examples

For fuller end-to-end examples, see src/examples. Two representative patterns are below.

Multi-job release with artifacts and job outputs

import {
  cmd,
  defineJobs,
  defineVariables,
  expr,
  manual,
  pipeline,
  push,
  run,
  step,
  uses
} from "forgeflow/core"
import { checkout, setupNode } from "forgeflow/presets"
import { z } from "zod"

const variables = defineVariables({
  env: z.object({
    PACKAGE_VERSION: z.string(),
    RELEASE_CHANNEL: z.enum(["stable", "canary"]),
    NODE_AUTH_TOKEN: z.string()
  }),
  secrets: z.object({
    NPM_TOKEN: z.string()
  })
})

const release = pipeline({
  provider: "github",
  name: "package-release",
  triggers: [push(["main"]), manual()],
  variables,
  env: {
    RELEASE_CHANNEL: "stable",
    NODE_AUTH_TOKEN: variables.refs.secrets.NPM_TOKEN
  },
  jobs: defineJobs("github", variables, {
    build: {
      runsOn: "ubuntu-latest",
      outputs: {
        version: { run: 'node -p "require(\'./package.json\').version"' }
      },
      steps: [
        uses(checkout()),
        uses(setupNode({ version: 20, cache: "npm" })),
        run("Install", "npm ci"),
        run("Build", "npm run build"),
        step.uploadArtifact("package-dist", "dist")
      ]
    },
    publish: ({ needs, secrets, vars }) => ({
      runsOn: "ubuntu-latest",
      needs: [needs.build],
      if: expr.eq(expr.branch(), "main"),
      env: {
        NODE_AUTH_TOKEN: secrets.NPM_TOKEN,
        PACKAGE_VERSION: expr.jobOutput(needs.build, "version")
      },
      steps: [
        step.downloadArtifact("package-dist", { fromJob: needs.build }),
        run(
          "Summary",
          cmd`echo "publishing ${vars.PACKAGE_VERSION} to ${vars.RELEASE_CHANNEL}"`
        ),
        run("Publish", cmd`pnpm publish --tag "${vars.RELEASE_CHANNEL}"`)
      ]
    })
  })
})

Scheduled integration pipeline with services and provider extensions

import {
  cmd,
  defineJobs,
  defineVariables,
  githubExt,
  manual,
  pipeline,
  pullRequest,
  push,
  run,
  schedule,
  uses
} from "forgeflow/core"
import { checkout, setupNode } from "forgeflow/presets"
import { z } from "zod"

const variables = defineVariables({
  env: z.object({
    DATABASE_URL: z.string(),
    INTEGRATION_REPORTER: z.string()
  })
})

const integration = pipeline({
  provider: "github",
  name: "integration-suite",
  triggers: [
    push(["main"]),
    pullRequest(["main"]),
    schedule("0 3 * * *"),
    manual()
  ],
  variables,
  env: {
    INTEGRATION_REPORTER: "dot"
  },
  jobs: defineJobs("github", variables, {
    integration: ({ vars }) => ({
      runsOn: "ubuntu-latest",
      env: {
        DATABASE_URL: "postgresql://postgres:postgres@localhost:5432/app",
        INTEGRATION_REPORTER: vars.INTEGRATION_REPORTER
      },
      services: [
        {
          name: "postgres",
          image: "postgres:16",
          env: {
            POSTGRES_DB: "app",
            POSTGRES_USER: "postgres",
            POSTGRES_PASSWORD: "postgres"
          },
          ports: [5432]
        }
      ],
      steps: [
        uses(checkout()),
        uses(setupNode({ version: 20, cache: "pnpm" })),
        run("Install", "pnpm install --frozen-lockfile"),
        run(
          "Integration tests",
          cmd`pnpm test:integration -- --reporter="${vars.INTEGRATION_REPORTER}"`
        )
      ],
      extensions: [
        githubExt({
          concurrency: {
            group: "integration-${{ github.ref_name }}",
            "cancel-in-progress": true
          }
        })
      ]
    })
  })
})

Multiple pipelines

A repo can define multiple pipeline(...) objects and export them as a manifest.

The CLI now batches render, check, explain, and portability across every exported pipeline in a file. A common pattern is one pipeline for pull requests and one for main:

// forgeflow.ts
import {
  cmd,
  defineJobs,
  defineVariables,
  manual,
  pipeline,
  pullRequest,
  push,
  run,
  uses
} from "forgeflow/core"
import { checkout, setupNode } from "forgeflow/presets"
import { z } from "zod"

const variables = defineVariables({
  env: z.object({
    RELEASE_CHANNEL: z.enum(["stable", "canary"]),
    TEST_REPORTER: z.string()
  })
})

export const pullRequests = pipeline({
  provider: "github",
  name: "pull-request-checks",
  triggers: [pullRequest(["main"])],
  variables,
  env: {
    RELEASE_CHANNEL: "stable",
    TEST_REPORTER: "dot"
  },
  jobs: defineJobs("github", variables, {
    validate: ({ vars }) => ({
      runsOn: "ubuntu-latest",
      steps: [
        uses(checkout()),
        uses(setupNode({ version: 20, cache: "npm" })),
        run("npm ci"),
        run("Test", cmd`npm test -- --reporter "${vars.TEST_REPORTER}"`)
      ]
    })
  })
})

export const main = pipeline({
  provider: "github",
  name: "main-branch-release",
  triggers: [push(["main"]), manual()],
  variables,
  env: {
    RELEASE_CHANNEL: "stable",
    TEST_REPORTER: "dot"
  },
  jobs: defineJobs("github", variables, {
    release: ({ vars }) => ({
      runsOn: "ubuntu-latest",
      steps: [
        uses(checkout()),
        uses(setupNode({ version: 20, cache: "npm" })),
        run("npm ci"),
        run(
          "Publish dry run",
          cmd`pnpm publish --tag "${vars.RELEASE_CHANNEL}" --dry-run`
        )
      ]
    })
  })
})

export const pipelines = { pullRequests, main }
export default pipelines

Then run the CLI against the manifest:

forgeflow render ./forgeflow.ts --out ./.forgeflow-out
forgeflow check ./forgeflow.ts
forgeflow portability ./forgeflow.ts

Packaged split-pipeline examples are available in:

  • src/examples/multi-pipeline/pull-requests.ts
  • src/examples/multi-pipeline/main.ts

Rendering and validation

render() and validate() now throw structured errors when they encounter error diagnostics.

Because pipeline.provider is required, it scopes provider-specific authoring and lets render(pipeline) and CLI commands like render and explain infer a default renderer automatically. You can still pass an explicit renderer when you want to check or render the same pipeline against another target.

  • ForgeflowAggregateError - one or more compiler or lowering failures
  • PipelineValidationError - invalid pipeline shape or contracts
  • RendererCapabilityError - target cannot support a requested feature
  • ExpressionRenderError - expression cannot be lowered for a provider or context
  • RendererLoweringError - renderer cannot lower a specific construct

Rendering is fail-closed: if any error-level diagnostics are produced, Forgeflow throws instead of returning YAML.

Warnings and info diagnostics are still returned on successful renders.
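The fail-closed contract can be sketched in a few lines. The class and function below are hypothetical stand-ins that mirror the described behavior, not Forgeflow's actual implementation:

```typescript
// Hypothetical sketch of fail-closed rendering: error diagnostics throw,
// warnings and info ride along with the successful output.
type Diagnostic = { level: "error" | "warning" | "info"; message: string };

class AggregateDiagnosticError extends Error {
  constructor(public diagnostics: Diagnostic[]) {
    super(`${diagnostics.length} error diagnostic(s)`);
  }
}

function failClosed(yaml: string, diagnostics: Diagnostic[]) {
  const errors = diagnostics.filter((d) => d.level === "error");
  if (errors.length > 0) throw new AggregateDiagnosticError(errors);
  // Non-error diagnostics are still returned on success.
  return { output: yaml, diagnostics };
}

const ok = failClosed("name: ci", [
  { level: "warning", message: "cache is best effort on this provider" }
]);
```

The key property: a caller can never receive YAML alongside an error-level diagnostic, so downstream tooling does not have to re-check severity.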

Renderer capability overview

| Capability       | GitHub | Bitbucket                  | GitLab |
| ---------------- | ------ | -------------------------- | ------ |
| Matrix           | Yes    | No                         | Yes    |
| Services         | Yes    | Yes                        | Yes    |
| Cache steps      | Yes    | Best effort                | Yes    |
| Artifact steps   | Yes    | Partial                    | Yes    |
| Job dependencies | Yes    | No                         | Yes    |
| Job outputs      | Yes    | No                         | Yes    |
| Manual triggers  | Yes    | Yes                        | Yes    |
| Schedules        | Yes    | UI-backed custom pipelines | Yes    |
| Job conditions   | Yes    | No                         | Yes    |
| Step conditions  | Yes    | No                         | No     |

See the renderer implementations in src/renderers/github, src/renderers/bitbucket, and src/renderers/gitlab for provider-specific details.
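Capability analysis of this kind can be modeled as a lookup over a support matrix that produces explicit diagnostics. This is a self-contained sketch using a subset of the table above; the function name and shapes are illustrative, not Forgeflow's API:

```typescript
// Hypothetical portability check over a subset of the capability table.
type Provider = "github" | "bitbucket" | "gitlab";
type Feature = "matrix" | "jobOutputs" | "stepConditions";

const SUPPORT: Record<Feature, Record<Provider, boolean>> = {
  matrix: { github: true, bitbucket: false, gitlab: true },
  jobOutputs: { github: true, bitbucket: false, gitlab: true },
  stepConditions: { github: true, bitbucket: false, gitlab: false }
};

function portability(used: Feature[], target: Provider): string[] {
  // Explicit diagnostics instead of silent degradation.
  return used
    .filter((f) => !SUPPORT[f][target])
    .map((f) => `${target} cannot express ${f}`);
}

const issues = portability(["matrix", "jobOutputs"], "bitbucket");
// issues lists one diagnostic per unsupported feature
```

A pipeline that uses only supported features yields an empty list, which is what lets the same IR render cleanly to several targets.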

CLI

forgeflow --help
forgeflow render --help
forgeflow help render
forgeflow init node-ci --provider github --out ./forgeflow.ts
forgeflow init --template python-ci --provider gitlab --out ./forgeflow.ts
forgeflow migrate .github/workflows/ci.yml --from github --out ./forgeflow.ts --report ./migration-report.txt
forgeflow render ./src/examples/github-release/automation.ts
forgeflow render ./src/examples/multi-pipeline/main.ts --json
forgeflow explain ./src/examples/multi-pipeline/pull-requests.ts
forgeflow portability ./src/examples/node-ci/matrix.ts --json
forgeflow render ./src/examples/node-ci/portable.ts --target github
forgeflow render ./src/examples/node-ci/portable.ts --target gitlab --out ./.gitlab-ci.yml
forgeflow check ./src/examples/monorepo/affected.ts
forgeflow explain ./src/examples/node-ci/matrix.ts --target bitbucket

CLI failures are also structured in --json mode, while human mode shows Commander-powered command help:

  • invalid invocation -> CliUsageError
  • pipeline import/export failures -> PipelineModuleLoadError
  • render and validation failures -> ForgeflowAggregateError

--json is available on render, check, explain, portability, init, and migrate so diagnostics and reports can feed CI, bots, and editor tooling.

Every command now exposes built-in help through forgeflow <command> --help, the root entrypoint exposes forgeflow --help plus forgeflow --version, and Commander also provides forgeflow help <command>.

In human-readable mode:

  • Forgeflow uses semantic terminal colors for headings, statuses, success messages, and diagnostics when writing to a TTY; --json output and redirected output stay plain
  • Help output uses a restrained command/argument/option hierarchy inspired by modern developer CLIs; set NO_COLOR=1 to force plain text
  • check, explain, and portability print compact summaries to stdout and send detailed diagnostics to stderr when present
  • migrate prints a compact success summary and a structured migration report on stderr when the importer records follow-up issues
  • render prints YAML to stdout; when rendering multiple pipelines without --out, each document is prefixed with YAML comment headers for the pipeline and target and separated with ---
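The multi-pipeline stdout format described above can be approximated with a small sketch; the exact header wording Forgeflow emits may differ:

```typescript
// Illustrative: join rendered pipelines into one stream, each document
// prefixed with YAML comment headers and separated by "---".
// Header text is an assumption, not Forgeflow's exact output.
function joinDocuments(
  docs: { pipeline: string; target: string; yaml: string }[]
): string {
  return docs
    .map((d) => `# pipeline: ${d.pipeline}\n# target: ${d.target}\n${d.yaml}`)
    .join("\n---\n");
}

const stream = joinDocuments([
  { pipeline: "pull-request-checks", target: "github", yaml: "name: pr" },
  { pipeline: "main-branch-release", target: "github", yaml: "name: main" }
]);
```

Because `---` is YAML's document separator, the combined stream stays parseable by standard multi-document YAML readers.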

Scaffolding and migration

  • forgeflow init creates a provider-aware starter file from templates like node-ci, python-ci, package-release, service-deploy, monorepo, and multi-pipeline; you can pass the template positionally or with --template <name>
  • forgeflow migrate converts provider YAML into a best-effort Forgeflow module and emits a migration report for anything preserved raw or not mapped cleanly yet
  • current migration sources: GitHub Actions, Bitbucket Pipelines, and GitLab CI
  • forgeflow portability runs capability analysis against GitHub, Bitbucket, and GitLab in one command

Bundled examples

Examples live under src/examples and are also importable from forgeflow/examples.

  • src/examples/node-ci/portable.ts - minimal portable Node.js CI
  • src/examples/node-ci/matrix.ts - matrix-based Node.js CI
  • src/examples/python-ci/portable.ts - minimal portable Python CI
  • src/examples/pnpm-package/ci.ts - preset-first pnpm package CI
  • src/examples/bitbucket/service-deploy.ts - Bitbucket-focused deployment pipeline using helper presets for clone, custom caches, deployment, retry policy, changesets conditions, and after-script
  • src/examples/object-dsl/portable.ts - portable object-DSL pipeline with caches, artifacts, and provider extensions
  • src/examples/docker-release/release.ts - multi-job Docker release with caches, artifacts, and outputs
  • src/examples/monorepo/affected.ts - monorepo pipeline with schedules, services, artifacts, outputs, and provider extensions
  • src/examples/release/conditional.ts - conditional release pipeline with job outputs and step conditions
  • src/examples/github-release/automation.ts - GitHub-focused release automation using raw escape hatches
  • src/examples/multi-pipeline/pull-requests.ts - pull-request-only pipeline for multi-pipeline repos
  • src/examples/multi-pipeline/main.ts - main-branch pipeline for multi-pipeline repos

The packaged examples can also be imported directly:

import {
  bitbucketServiceDeployExample,
  conditionalReleaseExample,
  mainBranchReleaseExample,
  monorepoAffectedExample,
  objectDslPortable,
  portableNodeCi,
  pullRequestChecksExample
} from "forgeflow/examples"

Development and release flow

  • pnpm build - builds the single-package source tree into dist
  • pnpm test - runs the fast unit and contract suites
  • pnpm test:e2e - runs the heavier example-corpus and published-consumer end-to-end suites
  • pnpm typecheck - runs full TypeScript checking across src and tests
  • pnpm pack / pnpm release - automatically rebuild dist before packaging
  • pnpm changeset - creates a release changeset
  • pnpm version-packages - applies Changesets versions
  • pnpm release - publishes the root forgeflow package through Changesets

CI automation lives in:

  • .github/workflows/ci.yml

Releases are manual. A typical release flow is:

  1. run pnpm build && pnpm test && pnpm test:e2e
  2. create or review a changeset with pnpm changeset
  3. version packages with pnpm version-packages
  4. publish with pnpm release

The project includes broader end-to-end coverage:

  • tests/e2e/providers/*.test.ts render the packaged example corpus with separate provider-focused suites for GitHub, Bitbucket, and GitLab
  • tests/e2e/cli/smoke.test.ts packs the root package, installs it into a temporary consumer project, and verifies forgeflow init, migrate, render, check, explain, and portability