Forgeflow (v0.6.0)
Forgeflow is a TypeScript-first CI/CD pipeline compiler. You author pipelines as plain objects, validate them against Zod-backed contracts, and render them to multiple CI providers.
Forgeflow includes renderers for:
- GitHub Actions
- Bitbucket Pipelines
- GitLab CI
Forgeflow is intentionally compiler-shaped: author -> normalize -> analyze capabilities -> lower to provider YAML.
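The compiler shape can be pictured as a tiny pipeline of pure functions. This is an illustrative sketch only — the stage names, types, and logic below are assumptions for exposition, not Forgeflow's real internals:

```typescript
// Hypothetical model of the compile flow: author -> normalize ->
// analyze capabilities -> lower to provider output.
type AuthoredJob = { steps: string[]; matrix?: Record<string, unknown[]> };
type Authored = {
  provider: "github" | "bitbucket" | "gitlab";
  jobs: Record<string, AuthoredJob>;
};
type Diagnostic = { level: "error" | "warning"; message: string };

// normalize: fill defaults so later stages see a uniform job shape
const normalize = (p: Authored): Authored => ({
  ...p,
  jobs: Object.fromEntries(
    Object.entries(p.jobs).map(([name, job]) => [name, { matrix: undefined, ...job }])
  )
});

// analyze: compare requested features against provider capabilities
const analyze = (p: Authored): Diagnostic[] => {
  const diags: Diagnostic[] = [];
  for (const [name, job] of Object.entries(p.jobs)) {
    if (job.matrix && p.provider === "bitbucket") {
      diags.push({ level: "error", message: `job "${name}": matrix is unsupported on bitbucket` });
    }
  }
  return diags;
};

// lower: emit output only when no error diagnostics remain (fail-closed)
const lower = (p: Authored): string => {
  if (analyze(p).some(d => d.level === "error")) throw new Error("cannot lower: error diagnostics");
  return Object.keys(p.jobs)
    .map(j => `${j}:\n  steps: [${p.jobs[j].steps.join(", ")}]`)
    .join("\n");
};
```

The point of the shape is that capability analysis runs before any provider YAML is produced, so unsupported features surface as diagnostics rather than degraded output.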
Why Forgeflow
- Type-safe pipeline authoring with a plain object DSL
- Mandatory `provider` selection with provider-scoped autocomplete for extensions and raw escape hatches
- Zod-backed env and secret contracts
- Typed schema refs and job dependency autocomplete via `defineVariables()` and `defineJobs()`
- Portable expressions for env, secrets, matrix values, branches, tags, and conditions
- First-class cache, artifact, and job output support in the IR
- Provider capability analysis with explicit diagnostics instead of silent degradation
- Structured error classes for render, validation, module loading, and CLI failures
- Bundled examples and a consumer CLI smoke test
Source modules
- `forgeflow/core` - pipeline IR, object DSL, expressions, diagnostics, and rendering APIs
- `forgeflow/renderers/github` - GitHub Actions renderer
- `forgeflow/renderers/bitbucket` - Bitbucket Pipelines renderer
- `forgeflow/renderers/gitlab` - GitLab CI renderer
- `forgeflow/presets` - portable helpers like `checkout()`, `setupNode()`, cache helpers, Node and Python pipeline presets, and Bitbucket helpers like `bitbucketClone()`, `bitbucketCacheDefinition()`, `bitbucketOnFail()`, and `bitbucketAfterScript()`
- `forgeflow/examples` - packaged pipeline examples
- `forgeflow/cli` - `forgeflow render`, `check`, `explain`, `portability`, `init`, and `migrate`
The project now lives in a single TypeScript repo under src/:
- `src/core`
- `src/renderers/github`
- `src/renderers/bitbucket`
- `src/renderers/gitlab`
- `src/presets`
- `src/examples`
- `src/cli`
Quick start
pnpm install
pnpm build
pnpm test
pnpm test:e2e

Forgeflow uses a single-package TypeScript layout with subpath exports and TS path aliases for clean imports.
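Subpath exports like `forgeflow/core` and `forgeflow/presets` are typically wired through the package manifest's `exports` map. The fragment below is illustrative only — it is not Forgeflow's actual manifest, just the general shape such a layout uses:

```json
{
  "name": "forgeflow",
  "exports": {
    "./core": {
      "types": "./dist/core/index.d.ts",
      "default": "./dist/core/index.js"
    },
    "./presets": {
      "types": "./dist/presets/index.d.ts",
      "default": "./dist/presets/index.js"
    },
    "./renderers/github": {
      "types": "./dist/renderers/github/index.d.ts",
      "default": "./dist/renderers/github/index.js"
    }
  }
}
```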
Author a pipeline
import {
cmd,
ForgeflowAggregateError,
defineJobs,
defineVariables,
expr,
manual,
pipeline,
pullRequest,
push,
render,
uses
} from "forgeflow/core"
import {
checkout,
restoreNodeCache,
saveNodeCache,
setupNode,
uploadArtifact
} from "forgeflow/presets"
import { z } from "zod"
const variables = defineVariables({
env: z.object({
CI: z.boolean(),
NPM_CACHE_DIR: z.string(),
RELEASE_CHANNEL: z.enum(["stable", "canary"]),
RELEASE_VERSION: z.string(),
NODE_AUTH_TOKEN: z.string()
}),
secrets: z.object({
NPM_TOKEN: z.string()
})
})
const ci = pipeline({
provider: "github",
name: "node-ci",
triggers: [push(["main"]), pullRequest(["main"]), manual()],
variables,
env: {
CI: true,
NPM_CACHE_DIR: ".npm",
RELEASE_CHANNEL: "stable"
},
jobs: defineJobs("github", variables, {
build: ({ vars, secrets }) => ({
runsOn: "ubuntu-latest",
if: expr.eq(vars.RELEASE_CHANNEL, "stable"),
strategy: {
matrix: {
values: {
node: [18, 20, 22]
}
}
},
env: {
NPM_CACHE_DIR: vars.NPM_CACHE_DIR,
NODE_AUTH_TOKEN: secrets.NPM_TOKEN
},
outputs: {
node_version: { run: 'node -p "process.version"' }
},
steps: [
uses(checkout()),
uses(setupNode({ version: expr.matrix("node") })),
restoreNodeCache({
key: "npm-cache",
packageManager: "npm",
path: vars.NPM_CACHE_DIR
}),
{
kind: "run",
name: "Install",
run: [cmd`npm ci --cache "${vars.NPM_CACHE_DIR}"`]
},
{ kind: "run", name: "Test", run: ["npm test"] },
saveNodeCache({
key: "npm-cache",
packageManager: "npm",
path: vars.NPM_CACHE_DIR
}),
uploadArtifact({ name: "coverage", path: "coverage" })
]
}),
publish: ({ needs, vars }) => ({
runsOn: "ubuntu-latest",
needs: [needs.build],
env: {
RELEASE_VERSION: expr.jobOutput(needs.build, "node_version")
},
steps: [
uses(checkout()),
{
kind: "run",
name: "Publish summary",
run: [
cmd`echo "publishing ${vars.RELEASE_VERSION} from ${vars.RELEASE_CHANNEL}"`
]
}
]
})
})
})
try {
const result = render(ci)
console.log(result.output)
console.log(result.diagnostics)
} catch (error) {
if (error instanceof ForgeflowAggregateError) {
console.error(error.message)
console.error(error.diagnostics)
} else {
throw error
}
}

Variable contracts
Use defineVariables() with pipeline({ provider, variables }) to declare env and secret contracts.
- use `variables.refs.env.NAME` for typed env references
- use `variables.refs.secrets.NAME` for typed secret references
- use `defineJobs("github", variables, { publish: ({ needs }) => ... })` to scope typed jobs to a provider while keeping typed `needs.*` autocomplete
- use `expr.env("NAME")` and `expr.secret("NAME")` when the name is dynamic or not schema-backed
- use `expr.matrix("axis")` for matrix references
- use `expr.jobOutput("job", "output")` for upstream outputs
- use `` cmd`...` `` inside shell commands so refs stay typed instead of being hardcoded as raw `$NAME` strings
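The last point can be modeled in a few lines: a tagged template can keep interpolated refs as structured placeholders instead of flattening them into the string, so each renderer decides how to print them. This is a simplified sketch, not Forgeflow's actual implementation:

```typescript
// Hypothetical model of a cmd-style tagged template: env refs stay
// structured inside the command instead of being stringified early.
type EnvRef = { kind: "env"; name: string };
type Command = { parts: (string | EnvRef)[] };

const envRef = (name: string): EnvRef => ({ kind: "env", name });

const cmd = (strings: TemplateStringsArray, ...refs: EnvRef[]): Command => {
  const parts: (string | EnvRef)[] = [];
  strings.forEach((s, i) => {
    if (s) parts.push(s);          // literal text between interpolations
    if (i < refs.length) parts.push(refs[i]); // keep the ref structured
  });
  return { parts };
};

// One possible lowering prints refs as $NAME; a different provider's
// renderer could emit its own expression syntax from the same Command.
const toShell = (c: Command): string =>
  c.parts.map(p => (typeof p === "string" ? p : `$${p.name}`)).join("");
```

For example, ``toShell(cmd`npm ci --cache "${envRef("NPM_CACHE_DIR")}"`)`` yields `npm ci --cache "$NPM_CACHE_DIR"`, while the structured ref remains available for validation against the env contract.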
import { cmd, defineJobs, defineVariables, pipeline } from "forgeflow/core"
import { z } from "zod"
const variables = defineVariables({
env: z.object({
REGISTRY_IMAGE: z.string(),
RELEASE_CHANNEL: z.enum(["stable", "canary"]),
NODE_AUTH_TOKEN: z.string()
}),
secrets: z.object({
NPM_TOKEN: z.string()
})
})
const release = pipeline({
provider: "github",
name: "release",
variables,
env: {
REGISTRY_IMAGE: "ghcr.io/acme/forgeflow",
RELEASE_CHANNEL: "stable",
NODE_AUTH_TOKEN: variables.refs.secrets.NPM_TOKEN
},
jobs: defineJobs("github", variables, {
build: {
runsOn: "ubuntu-latest",
steps: [{ kind: "run", run: ["npm run build"] }]
},
publish: ({ needs, vars, secrets }) => ({
runsOn: "ubuntu-latest",
needs: [needs.build],
steps: [
{
kind: "run",
name: "Publish",
run: [
cmd`pnpm publish --tag "${vars.RELEASE_CHANNEL}" && echo "${vars.REGISTRY_IMAGE}"`
],
env: {
NODE_AUTH_TOKEN: secrets.NPM_TOKEN
}
}
]
})
})
})

Object DSL
Forgeflow standardizes on the object DSL. Pipelines are plain data plus helper constructors like `job(...)`, `run(...)`, `` cmd`...` ``, `uses(...)`, `step.*`, and `defineJobs(provider, ...)` when you want typed `needs` refs plus provider-scoped autocomplete.
import { defineJobs, pipeline, push, run } from "forgeflow/core"
const ci = pipeline({
provider: "github",
name: "node-ci",
triggers: [push(["main"])],
jobs: defineJobs("github", {
test: {
runsOn: "ubuntu-latest",
steps: [run("npm ci"), run("npm test")]
},
package: ({ needs }) => ({
runsOn: "ubuntu-latest",
needs: [needs.test],
steps: [run("npm pack")]
})
})
})

Bitbucket helpers
Bitbucket-specific step and pipeline fields can stay typed too via forgeflow/presets helpers instead of ad hoc raw payloads.
These helpers compile to Bitbucket-only extensions, so they are ignored when you explicitly render the same pipeline to another provider. Use step.raw(...) only when you want a hard provider-specific escape hatch.
Available helpers include bitbucketClone(), bitbucketOptions(), bitbucketCacheDefinition(), bitbucketCaches(), bitbucketDeployment(), bitbucketSize(), bitbucketChangesetsCondition(), bitbucketOnFail(), bitbucketFailFast(), bitbucketManualTrigger(), bitbucketImage(), and bitbucketAfterScript().
import {
cmd,
defineJobs,
defineVariables,
pipeline,
push,
run
} from "forgeflow/core"
import {
bitbucketAfterScript,
bitbucketCacheDefinition,
bitbucketCaches,
bitbucketClone,
bitbucketDeployment,
bitbucketOnFail,
bitbucketSize
} from "forgeflow/presets"
import { z } from "zod"
const variables = defineVariables({
env: z.object({
DEPLOY_ENV: z.enum(["staging", "production"])
})
})
const deploy = pipeline({
provider: "bitbucket",
name: "deploy",
triggers: [push(["main"])],
variables,
env: {
DEPLOY_ENV: "staging"
},
extensions: [
bitbucketClone({ depth: "full", lfs: true }),
bitbucketCacheDefinition("pnpm-store", {
path: ".pnpm-store",
keyFiles: ["pnpm-lock.yaml"]
})
],
jobs: defineJobs("bitbucket", variables, {
release: ({ vars }) => ({
env: {
DEPLOY_ENV: vars.DEPLOY_ENV
},
extensions: [
bitbucketCaches(["pnpm-store"]),
bitbucketDeployment("staging"),
bitbucketSize("2x"),
bitbucketOnFail({ strategy: "retry", maxRetryCount: 2 }),
bitbucketAfterScript(cmd`echo "cleanup for ${vars.DEPLOY_ENV}"`)
],
steps: [run(cmd`pnpm deploy -- --env "${vars.DEPLOY_ENV}"`)]
})
})
})

Advanced examples
For fuller end-to-end examples, see src/examples. Two representative patterns are below.
Multi-job release with artifacts and job outputs
import {
cmd,
defineJobs,
defineVariables,
expr,
manual,
pipeline,
push,
run,
step,
uses
} from "forgeflow/core"
import { checkout, setupNode } from "forgeflow/presets"
import { z } from "zod"
const variables = defineVariables({
env: z.object({
PACKAGE_VERSION: z.string(),
RELEASE_CHANNEL: z.enum(["stable", "canary"]),
NODE_AUTH_TOKEN: z.string()
}),
secrets: z.object({
NPM_TOKEN: z.string()
})
})
const release = pipeline({
provider: "github",
name: "package-release",
triggers: [push(["main"]), manual()],
variables,
env: {
RELEASE_CHANNEL: "stable",
NODE_AUTH_TOKEN: variables.refs.secrets.NPM_TOKEN
},
jobs: defineJobs("github", variables, {
build: {
runsOn: "ubuntu-latest",
outputs: {
      version: { run: 'node -p "require(\'./package.json\').version"' }
},
steps: [
uses(checkout()),
uses(setupNode({ version: 20, cache: "npm" })),
run("Install", "npm ci"),
run("Build", "npm run build"),
step.uploadArtifact("package-dist", "dist")
]
},
publish: ({ needs, secrets, vars }) => ({
runsOn: "ubuntu-latest",
needs: [needs.build],
if: expr.eq(expr.branch(), "main"),
env: {
NODE_AUTH_TOKEN: secrets.NPM_TOKEN,
PACKAGE_VERSION: expr.jobOutput(needs.build, "version")
},
steps: [
step.downloadArtifact("package-dist", { fromJob: needs.build }),
run(
"Summary",
cmd`echo "publishing ${vars.PACKAGE_VERSION} to ${vars.RELEASE_CHANNEL}"`
),
run("Publish", cmd`pnpm publish --tag "${vars.RELEASE_CHANNEL}"`)
]
})
})
})

Scheduled integration pipeline with services and provider extensions
import {
cmd,
defineJobs,
defineVariables,
githubExt,
manual,
pipeline,
pullRequest,
push,
run,
schedule,
uses
} from "forgeflow/core"
import { checkout, setupNode } from "forgeflow/presets"
import { z } from "zod"
const variables = defineVariables({
env: z.object({
DATABASE_URL: z.string(),
INTEGRATION_REPORTER: z.string()
})
})
const integration = pipeline({
provider: "github",
name: "integration-suite",
triggers: [
push(["main"]),
pullRequest(["main"]),
schedule("0 3 * * *"),
manual()
],
variables,
env: {
INTEGRATION_REPORTER: "dot"
},
jobs: defineJobs("github", variables, {
integration: ({ vars }) => ({
runsOn: "ubuntu-latest",
env: {
DATABASE_URL: "postgresql://postgres:postgres@localhost:5432/app",
INTEGRATION_REPORTER: vars.INTEGRATION_REPORTER
},
services: [
{
name: "postgres",
image: "postgres:16",
env: {
POSTGRES_DB: "app",
POSTGRES_USER: "postgres",
POSTGRES_PASSWORD: "postgres"
},
ports: [5432]
}
],
steps: [
uses(checkout()),
uses(setupNode({ version: 20, cache: "pnpm" })),
run("Install", "pnpm install --frozen-lockfile"),
run(
"Integration tests",
cmd`pnpm test:integration -- --reporter="${vars.INTEGRATION_REPORTER}"`
)
],
extensions: [
githubExt({
concurrency: {
group: "integration-${{ github.ref_name }}",
"cancel-in-progress": true
}
})
]
})
})
})

Multiple pipelines
A repo can define multiple `pipeline(...)` objects and export them as a manifest.
The CLI now batches render, check, explain, and portability across every exported pipeline in a file. A common pattern is one pipeline for pull requests and one for main:
// forgeflow.ts
import {
cmd,
defineJobs,
defineVariables,
manual,
pipeline,
pullRequest,
push,
run,
uses
} from "forgeflow/core"
import { checkout, setupNode } from "forgeflow/presets"
import { z } from "zod"
const variables = defineVariables({
env: z.object({
RELEASE_CHANNEL: z.enum(["stable", "canary"]),
TEST_REPORTER: z.string()
})
})
export const pullRequests = pipeline({
provider: "github",
name: "pull-request-checks",
triggers: [pullRequest(["main"])],
variables,
env: {
RELEASE_CHANNEL: "stable",
TEST_REPORTER: "dot"
},
jobs: defineJobs("github", variables, {
validate: ({ vars }) => ({
runsOn: "ubuntu-latest",
steps: [
uses(checkout()),
uses(setupNode({ version: 20, cache: "npm" })),
run("npm ci"),
run("Test", cmd`npm test -- --reporter "${vars.TEST_REPORTER}"`)
]
})
})
})
export const main = pipeline({
provider: "github",
name: "main-branch-release",
triggers: [push(["main"]), manual()],
variables,
env: {
RELEASE_CHANNEL: "stable",
TEST_REPORTER: "dot"
},
jobs: defineJobs("github", variables, {
release: ({ vars }) => ({
runsOn: "ubuntu-latest",
steps: [
uses(checkout()),
uses(setupNode({ version: 20, cache: "npm" })),
run("npm ci"),
run(
"Publish dry run",
cmd`pnpm publish --tag "${vars.RELEASE_CHANNEL}" --dry-run`
)
]
})
})
})
export const pipelines = { pullRequests, main }
export default pipelines

forgeflow render ./forgeflow.ts --out ./.forgeflow-out
forgeflow check ./forgeflow.ts
forgeflow portability ./forgeflow.ts

Packaged split-pipeline examples are available in:
- `src/examples/multi-pipeline/pull-requests.ts`
- `src/examples/multi-pipeline/main.ts`
Rendering and validation
render() and validate() now throw structured errors when they encounter error diagnostics.
Because pipeline.provider is required, it scopes provider-specific authoring and lets render(pipeline) and CLI commands like render and explain infer a default renderer automatically. You can still pass an explicit renderer when you want to check or render the same pipeline against another target.
- `ForgeflowAggregateError` - one or more compiler or lowering failures
- `PipelineValidationError` - invalid pipeline shape or contracts
- `RendererCapabilityError` - target cannot support a requested feature
- `ExpressionRenderError` - expression cannot be lowered for a provider or context
- `RendererLoweringError` - renderer cannot lower a specific construct
Rendering is fail-closed: if any error-level diagnostics are produced, Forgeflow throws instead of returning YAML.
Warnings and info diagnostics are still returned on successful renders.
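The fail-closed gate can be modeled in a few lines. This is a sketch under stated assumptions — the class shape and field names are illustrative, not Forgeflow's exact API:

```typescript
// Sketch: an aggregate error carrying structured diagnostics, plus a
// render gate that throws on errors but lets warnings/info through.
type Diagnostic = { level: "error" | "warning" | "info"; message: string };

class ForgeflowAggregateError extends Error {
  constructor(readonly diagnostics: Diagnostic[]) {
    super(`${diagnostics.length} diagnostic(s) blocked rendering`);
    this.name = "ForgeflowAggregateError";
  }
}

// Returns the render result only when no error-level diagnostics exist;
// otherwise throws with the blocking diagnostics attached.
function gate(
  output: string,
  diagnostics: Diagnostic[]
): { output: string; diagnostics: Diagnostic[] } {
  const errors = diagnostics.filter(d => d.level === "error");
  if (errors.length > 0) throw new ForgeflowAggregateError(errors);
  return { output, diagnostics }; // warnings and info survive a successful render
}
```

The design choice is that callers never have to inspect a result for hidden failures: if `gate` returns, the output is usable, and anything non-fatal is still visible in `diagnostics`.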
Renderer capability overview
| Capability       | GitHub | Bitbucket                  | GitLab |
| ---------------- | ------ | -------------------------- | ------ |
| Matrix           | Yes    | No                         | Yes    |
| Services         | Yes    | Yes                        | Yes    |
| Cache steps      | Yes    | Best effort                | Yes    |
| Artifact steps   | Yes    | Partial                    | Yes    |
| Job dependencies | Yes    | No                         | Yes    |
| Job outputs      | Yes    | No                         | Yes    |
| Manual triggers  | Yes    | Yes                        | Yes    |
| Schedules        | Yes    | UI-backed custom pipelines | Yes    |
| Job conditions   | Yes    | No                         | Yes    |
| Step conditions  | Yes    | No                         | No     |
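The table reads as a lookup: given a target provider and the features a pipeline uses, the analyzer either passes or emits a diagnostic. A minimal data-driven sketch (matrix values transcribed from the table; the lookup function itself is hypothetical):

```typescript
// Capability matrix transcribed from the table above. Support levels
// beyond yes/no are kept as strings so diagnostics can explain nuance.
type Provider = "github" | "bitbucket" | "gitlab";
type Support = "yes" | "no" | string;

const capabilities: Record<string, Record<Provider, Support>> = {
  matrix:          { github: "yes", bitbucket: "no",                         gitlab: "yes" },
  services:        { github: "yes", bitbucket: "yes",                        gitlab: "yes" },
  cacheSteps:      { github: "yes", bitbucket: "best effort",                gitlab: "yes" },
  artifactSteps:   { github: "yes", bitbucket: "partial",                    gitlab: "yes" },
  jobDependencies: { github: "yes", bitbucket: "no",                         gitlab: "yes" },
  jobOutputs:      { github: "yes", bitbucket: "no",                         gitlab: "yes" },
  manualTriggers:  { github: "yes", bitbucket: "yes",                        gitlab: "yes" },
  schedules:       { github: "yes", bitbucket: "UI-backed custom pipelines", gitlab: "yes" },
  jobConditions:   { github: "yes", bitbucket: "no",                         gitlab: "yes" },
  stepConditions:  { github: "yes", bitbucket: "no",                         gitlab: "no" }
};

// Returns the used features a provider cannot fully support.
const unsupported = (provider: Provider, used: string[]): string[] =>
  used.filter(f => capabilities[f]?.[provider] !== "yes");
```

For example, a pipeline using matrix builds and job outputs would be flagged twice when targeting Bitbucket but not at all when targeting GitHub.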
See the renderer implementations in src/renderers/github, src/renderers/bitbucket, and src/renderers/gitlab for provider-specific details.
CLI
forgeflow --help
forgeflow render --help
forgeflow help render
forgeflow init node-ci --provider github --out ./forgeflow.ts
forgeflow init --template python-ci --provider gitlab --out ./forgeflow.ts
forgeflow migrate .github/workflows/ci.yml --from github --out ./forgeflow.ts --report ./migration-report.txt
forgeflow render ./src/examples/github-release/automation.ts
forgeflow render ./src/examples/multi-pipeline/main.ts --json
forgeflow explain ./src/examples/multi-pipeline/pull-requests.ts
forgeflow portability ./src/examples/node-ci/matrix.ts --json
forgeflow render ./src/examples/node-ci/portable.ts --target github
forgeflow render ./src/examples/node-ci/portable.ts --target gitlab --out ./.gitlab-ci.yml
forgeflow check ./src/examples/monorepo/affected.ts
forgeflow explain ./src/examples/node-ci/matrix.ts --target bitbucket

CLI failures are also structured in --json mode, while human mode shows Commander-powered command help:
- invalid invocation -> `CliUsageError`
- pipeline import/export failures -> `PipelineModuleLoadError`
- render and validation failures -> `ForgeflowAggregateError`
--json is available on render, check, explain, portability, init, and migrate so diagnostics and reports can feed CI, bots, and editor tooling.
Every command now exposes built-in help through forgeflow <command> --help, the root entrypoint exposes forgeflow --help plus forgeflow --version, and Commander also provides forgeflow help <command>.
In human-readable mode:
- Forgeflow uses semantic terminal colors for headings, statuses, success messages, and diagnostics when writing to a TTY; `--json` output and redirected output stay plain
- Help output uses a restrained command/argument/option hierarchy inspired by modern developer CLIs; set `NO_COLOR=1` to force plain text
- `check`, `explain`, and `portability` print compact summaries to stdout and send detailed diagnostics to stderr when present
- `migrate` prints a compact success summary and a structured migration report on stderr when the importer records follow-up issues
- `render` prints YAML to stdout; when rendering multiple pipelines without `--out`, each document is prefixed with YAML comment headers for the pipeline and target and separated with `---`
Scaffolding and migration
- `forgeflow init` creates a provider-aware starter file from templates like `node-ci`, `python-ci`, `package-release`, `service-deploy`, `monorepo`, and `multi-pipeline`; you can pass the template positionally or with `--template <name>`
- `forgeflow migrate` converts provider YAML into a best-effort Forgeflow module and emits a migration report for anything preserved raw or not yet mapped cleanly
- Current migration sources: GitHub Actions, Bitbucket Pipelines, and GitLab CI
- `forgeflow portability` runs capability analysis against GitHub, Bitbucket, and GitLab in one command
Bundled examples
Examples live under src/examples and are also importable from forgeflow/examples.
- `src/examples/node-ci/portable.ts` - minimal portable Node.js CI
- `src/examples/node-ci/matrix.ts` - matrix-based Node.js CI
- `src/examples/python-ci/portable.ts` - minimal portable Python CI
- `src/examples/pnpm-package/ci.ts` - preset-first pnpm package CI
- `src/examples/bitbucket/service-deploy.ts` - Bitbucket-focused deployment pipeline using helper presets for clone, custom caches, deployment, retry policy, changesets conditions, and after-script
- `src/examples/object-dsl/portable.ts` - portable object-DSL pipeline with caches, artifacts, and provider extensions
- `src/examples/docker-release/release.ts` - multi-job Docker release with caches, artifacts, and outputs
- `src/examples/monorepo/affected.ts` - monorepo pipeline with schedules, services, artifacts, outputs, and provider extensions
- `src/examples/release/conditional.ts` - conditional release pipeline with job outputs and step conditions
- `src/examples/github-release/automation.ts` - GitHub-focused release automation using raw escape hatches
- `src/examples/multi-pipeline/pull-requests.ts` - pull-request-only pipeline for multi-pipeline repos
- `src/examples/multi-pipeline/main.ts` - main-branch pipeline for multi-pipeline repos
import {
bitbucketServiceDeployExample,
conditionalReleaseExample,
mainBranchReleaseExample,
monorepoAffectedExample,
objectDslPortable,
portableNodeCi,
pullRequestChecksExample
} from "forgeflow/examples"

Development and release flow
- `pnpm build` - builds the single-package source tree into `dist`
- `pnpm test` - runs the fast unit and contract suites
- `pnpm test:e2e` - runs the heavier example-corpus and published-consumer end-to-end suites
- `pnpm typecheck` - runs full TypeScript checking across `src` and `tests`
- `pnpm pack` / `pnpm release` - automatically rebuild `dist` before packaging
- `pnpm changeset` - creates a release changeset
- `pnpm version-packages` - applies Changesets versions
- `pnpm release` - publishes the root `forgeflow` package through Changesets
CI automation lives in:
.github/workflows/ci.yml
Releases are manual. A typical release flow is:
- run `pnpm build && pnpm test && pnpm test:e2e`
- create or review a changeset with `pnpm changeset`
- version packages with `pnpm version-packages`
- publish with `pnpm release`
The project includes broader end-to-end coverage:
- `tests/e2e/providers/*.test.ts` render the packaged example corpus with separate provider-focused suites for GitHub, Bitbucket, and GitLab
- `tests/e2e/cli/smoke.test.ts` packs the root package, installs it into a temporary consumer project, and verifies `forgeflow init`, `migrate`, `render`, `check`, `explain`, and `portability`
