qualink
v0.6.0
Collect, normalize, and relay code quality metrics from CI.
qualink standardizes code quality telemetry across repos and languages, then ships it to a sink (Elastic by default; Loki and stdout are also supported). If you need another sink, open a PR or an issue.
Install
npm install -g qualink
pnpm add -g qualink
bun add -g qualink
Or run directly:
npx qualink collect eslint --input report.json --sink stdout
CI Examples
Repo, branch, commit SHA, pipeline run ID, and provider are auto-detected from CI environment variables; there is no need to pass them manually.
See the examples/ folder for copy-paste snippets for Azure DevOps and GitHub Actions.
Pipeline Tracking
Track pipeline execution metrics — which pipelines run, when, for how long, and their outcome.
Pipelines self-report by calling qualink pipeline --status <status> at the end of a run.
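Both CI examples below follow the same pattern: capture an epoch-milliseconds timestamp at the start of the run, then subtract it at the end. The arithmetic in isolation, as a plain bash sketch with illustrative variable names:

```shell
# Record start as epoch milliseconds (GNU date's %3N gives millisecond precision)
PIPELINE_START=$(date +%s%3N)

# ... pipeline work happens here ...
sleep 0.1

# Duration in milliseconds, suitable for qualink's --duration flag
END_TIME=$(date +%s%3N)
DURATION=$(( END_TIME - PIPELINE_START ))
echo "$DURATION"
```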
Azure DevOps
steps:
  - script: echo "##vso[task.setvariable variable=PIPELINE_START]$(date +%s%3N)"
    displayName: Record start time

  # ... existing build/test steps ...

  - script: |
      END_TIME=$(date +%s%3N)
      DURATION=$(( END_TIME - $(PIPELINE_START) ))
      npx qualink pipeline \
        --status "$(Agent.JobStatus)" \
        --duration "$DURATION" \
        --sink elastic
    displayName: Report pipeline metrics
    condition: always()
    env:
      ELASTIC_URL: $(ELASTIC_URL)
      ELASTIC_API_KEY: $(ELASTIC_API_KEY)

Auto-detected from Azure DevOps env: pipeline name (BUILD_DEFINITIONNAME), trigger (BUILD_REASON), repo, branch, commit, run ID, provider.
GitHub Actions
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Record start time
        run: echo "PIPELINE_START=$(date +%s%3N)" >> "$GITHUB_ENV"

      # ... existing build/test steps ...

      - name: Report pipeline metrics
        if: always()
        run: |
          END_TIME=$(date +%s%3N)
          DURATION=$(( END_TIME - PIPELINE_START ))
          npx qualink pipeline \
            --status "${{ job.status }}" \
            --duration "$DURATION" \
            --sink elastic
        env:
          ELASTIC_URL: ${{ secrets.ELASTIC_URL }}
          ELASTIC_API_KEY: ${{ secrets.ELASTIC_API_KEY }}

Auto-detected from GitHub env: pipeline name (GITHUB_WORKFLOW), trigger (GITHUB_EVENT_NAME), repo, branch, commit, run ID, provider.
Per-stage reporting
For pipelines with distinct stages, call qualink once per stage with --stage-name:
# Azure DevOps example
- script: |
    npx qualink pipeline --status "$(Agent.JobStatus)" --stage-name build --duration "$BUILD_DURATION"
  condition: always()
- script: |
    npx qualink pipeline --status "$(Agent.JobStatus)" --stage-name deploy --duration "$DEPLOY_DURATION"
  condition: always()

CLI usage
Single collector
qualink collect <collector> --input <path> --sink elastic [flags]

qualink collect eslint --input eslint-report.json --sink elastic --repo frontend-mono --category frontend --tags frontend,web
qualink collect sarif --input analyzers.sarif --sink elastic --repo backend-api --category backend --tags backend,api
qualink collect coverage-dotnet --input coverage.cobertura.xml --sink elastic --repo backend-api

Multi-collect
Auto-discover report files in a directory tree:
qualink collect --dir=./output --repo myapp --sink elastic

Or use a config file for explicit control:

qualink collect --config=qualink.json --repo myapp --sink elastic

Config file example (qualink.json):
[
  { "type": "eslint", "input": "packages/*/eslint-report.json" },
  { "type": "coverage-js", "input": "packages/*/coverage-summary.json" },
  { "type": "sarif", "input": "**/*.sarif" }
]

Each entry supports optional overrides: tags, category, project, solution, url.
See qualink-config.schema.json for the full schema.
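For instance, a sketch of an entry using the optional override fields. The values are illustrative, and whether tags takes a comma-separated string (as the CLI flag does) or an array is an assumption — consult the schema file for the authoritative shape:

```json
[
  {
    "type": "eslint",
    "input": "packages/*/eslint-report.json",
    "category": "frontend",
    "tags": "frontend,web",
    "project": "web-app"
  }
]
```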
Auto-discovery recognizes: eslint-report.json, biome-report.json, coverage-summary.json, coverage.cobertura.xml, *.sarif/*.sarif.json, lhr-*.json inside .lighthouseci/, junit.xml, and TEST-*.xml.
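As a sketch, a directory layout that --dir auto-discovery would pick up, using file names from the list above (paths and contents are illustrative placeholders):

```shell
# Two packages, each producing a recognized report file name
mkdir -p output/web output/api
echo '[]' > output/web/eslint-report.json
echo '{}' > output/api/coverage-summary.json

# Then collect everything in one pass:
# npx qualink collect --dir=./output --repo myapp --sink elastic
ls -R output
```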
Pipeline tracking
Top-level command, not under collect:
qualink pipeline --status succeeded --sink elastic
qualink pipeline --status succeeded --duration 125000 --pipeline-name "Build and Deploy"
qualink pipeline --status succeeded --stage-name build --duration 45000
qualink pipeline --status failed --dry-run

Collectors:
biome (Biome JSON)
eslint (ESLint JSON)
lighthouse (Lighthouse JSON)
coverage-js (Istanbul/Vitest JSON)
sarif (Roslyn or generic SARIF JSON)
coverage-dotnet (Cobertura/OpenCover XML)
junit (JUnit XML)
ESLint file-level options (optional):
--top-files <n> adds top_files with the top offending files (default: 0, disabled)
--include-all-files adds all_files with every file that has lint issues
Classification metadata (optional):
--category for a single broad bucket
--tags for flexible multi-label filtering (comma-separated)
Project hierarchy (auto-detected or explicit):
--solution groups related projects (auto-detected from .sln or workspace root package.json)
--project identifies the individual project (auto-detected from nearest .csproj or package.json)
Metadata auto-detection:
repo: from flag/env, then git origin, then current folder name
branch: from flag/env, then git branch
commit_sha: from flag/env, then git commit
pipeline_run_id: from flag/env, fallback local-<timestamp>
project: from flag/env, then nearest .csproj/package.json
solution: from flag/env, then nearest .sln/workspace root package.json
If needed, you can still pass explicit values with --repo, --branch, --commit-sha, and --pipeline-run-id.
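The repo fallback chain above can be mimicked in plain shell. A sketch for the repo field only — qualink does this internally, so this is illustrative, not its actual code:

```shell
# Fallback chain for repo: flag/env, then git origin, then current folder name.
repo="${QUALINK_REPO:-}"
if [ -z "$repo" ]; then
  # Strip everything up to the last slash and a trailing .git from the remote URL
  repo=$(git remote get-url origin 2>/dev/null | sed -E 's#.*/##; s#\.git$##')
fi
if [ -z "$repo" ]; then
  repo=$(basename "$PWD")
fi
echo "$repo"
```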
Sink configuration:
--sink elastic (default) requires ELASTIC_URL and ELASTIC_API_KEY
--sink loki requires LOKI_URL. Optional: LOKI_USERNAME, LOKI_PASSWORD (basic auth), LOKI_TENANT_ID (X-Scope-OrgID header for multi-tenant setups)
--sink stdout prints normalized documents for debugging
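A sketch of wiring up the Loki sink in a CI job shell; every value here is a placeholder for your own endpoint and credentials:

```shell
# Placeholder endpoint; point LOKI_URL at your Loki push API
export LOKI_URL="https://loki.example.com"
# Optional basic auth and multi-tenant header (sent as X-Scope-OrgID)
export LOKI_USERNAME="ci-bot"
export LOKI_PASSWORD="<secret>"
export LOKI_TENANT_ID="team-a"
# Then: npx qualink collect sarif --input analyzers.sarif --sink loki
```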
Dry run mode:
--dry-run validates and prints normalized payloads without sending to any sink
Useful env fallbacks:
QUALINK_REPO, QUALINK_CATEGORY, QUALINK_TAGS, QUALINK_BRANCH, QUALINK_COMMIT_SHA, QUALINK_PIPELINE_RUN_ID
QUALINK_PROJECT (auto-detected from nearest .csproj/package.json or PNPM_PACKAGE_NAME)
QUALINK_SOLUTION (auto-detected from .sln or workspace root package.json)
QUALINK_PIPELINE_PROVIDER (auto-detected, fallback: local)
QUALINK_ENVIRONMENT (default: ci)
QUALINK_SINK (default: elastic)
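Setting a few of these once at the top of a CI job lets every subsequent qualink call inherit them instead of repeating flags. The values below are illustrative:

```shell
# Each QUALINK_* variable mirrors the corresponding CLI flag
export QUALINK_REPO="frontend-mono"
export QUALINK_CATEGORY="frontend"
export QUALINK_TAGS="frontend,web"
export QUALINK_SINK="stdout"
echo "$QUALINK_REPO ($QUALINK_CATEGORY)"
```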
