
sdlc-ai-oe v3.0.3 — Downloads: 422

SDLC gate checker — CLI & Web UI — using local SLM (Ollama) or cloud AI. Weighted eval, severity, consensus, code eval, auto-fix, pipeline gen.

sdlc-gate

SDLC Gate Checker — validate specs and code against SDLC phase checklists using AI (local Ollama SLM or Azure OpenAI).

Available as both a CLI tool and a web UI.

Install

# Global install (makes `sdlc-gate` available everywhere)
npm install -g sdlc-gate

# Or run without installing
npx sdlc-gate <spec-file> --cloud

# Or clone and run locally
git clone <repo-url> && cd slm_sdlc
npm install

CLI Usage

sdlc-gate <spec-file> [options]

Quick Start

# Check a markdown spec against the spec-to-dev phase
sdlc-gate spec.md --cloud

# Check a PDF spec (with embedded image/diagram detection)
sdlc-gate design.pdf --cloud --phase=spec-to-dev

# Check a Word document
sdlc-gate requirements.docx --cloud

# Use local Ollama SLM instead of cloud
sdlc-gate spec.md --slm

Commands

| Command | Description |
|---------|-------------|
| sdlc-gate <file> | Single spec gate check (md/pdf/docx) |
| sdlc-gate <dir/> --batch | Batch check all spec files in directory |
| sdlc-gate <file> --eval | Weighted evaluation across all phases |
| sdlc-gate <file> --severity | Issue severity classification |
| sdlc-gate <file> --consensus | Multi-model consensus (SLM + Cloud) |
| sdlc-gate <file> --code | Code evaluation mode |
| sdlc-gate <file> --auto-fix | Auto-fix failing spec |
| sdlc-gate --template | Generate spec template |
| sdlc-gate --phases | List available gate phases |
| sdlc-gate --configs | List available team configs |
| sdlc-gate --history | Show gate check history |
| sdlc-gate --gen-pipeline | Generate CI/CD pipeline YAML |
| sdlc-gate --pr=<url> | Run gate check on a GitHub/ADO PR |
| sdlc-gate --serve | Start the web UI server |
| sdlc-gate --version | Show version |
| sdlc-gate --help | Show help |

Options

| Option | Description |
|--------|-------------|
| --slm | Use local SLM via Ollama (default) |
| --cloud | Use cloud AI endpoint (Azure OpenAI) |
| --phase=<id> | Gate phase (default: spec-to-dev) |
| --config=<path> | Use a custom gate config JSON |
| --diff | Diff-aware re-check vs last run |
| --severity | Add severity classification |
| --lang=<lang> | Language hint for --code (default: auto) |
| --platform=<p> | Pipeline platform: github or azure |
| --no-comment | Skip posting PR comment |
| --no-status | Skip setting commit status |
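Flags compose, so a single invocation can pick a backend, a phase, and a team config at once. A sketch of one such combination (the command is echoed rather than executed, since sdlc-gate may not be on your PATH; the phase and config path are illustrative):

```shell
# Hypothetical combined invocation: cloud backend, a later gate phase,
# severity tagging, and a team-specific config.
cmd="sdlc-gate spec.md --cloud --phase=dev-to-testing --severity --config=configs/my-team.json"
echo "$cmd"
```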

Supported File Types

  • .md — Markdown (plain text)
  • .pdf — PDF with embedded image/diagram detection
  • .docx — Microsoft Word documents
  • .txt — Plain text

Exit Codes

| Code | Meaning |
|------|---------|
| 0 | PASS — spec meets all gate checks |
| 1 | Error (config, network, etc.) |
| 2 | FAIL — spec has missing items |
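Because FAIL (2) and hard errors (1) are distinct codes, a CI step can treat them differently. A sketch of that branching, with run_gate standing in for a real sdlc-gate invocation so the snippet runs anywhere:

```shell
# run_gate stands in for: sdlc-gate spec.md --cloud
run_gate() { return 2; }   # simulate a FAIL (spec has missing items)

run_gate && code=0 || code=$?
case "$code" in
  0) echo "gate passed" ;;
  2) echo "gate failed: spec has missing items" ;;
  *) echo "gate error (exit $code)" ;;
esac
```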

Web UI

# Start the web server (port 3000)
sdlc-gate --serve

# Or directly
node server.mjs

# Or via npm
npm start

Then open http://localhost:3000 in your browser.

Configuration

Runtime Config (priority order)

  1. Environment variables
  2. runtime-config.local.json (recommended for secrets)
  3. runtime-config.json
  4. .env (legacy fallback)
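The practical upshot of this order: an environment variable set for a single invocation beats anything in the JSON files. A minimal sketch of that rule (OLLAMA_MODEL is one of the documented config keys, and phi3:mini its documented default):

```shell
# Env wins over file config; fall back to the documented default otherwise.
OLLAMA_MODEL="${OLLAMA_MODEL:-phi3:mini}"
echo "resolved model: $OLLAMA_MODEL"

# One-off override, no file edits needed:
#   OLLAMA_MODEL=llama3 sdlc-gate spec.md --slm
```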

Config Keys

| Key | Description |
|-----|-------------|
| OLLAMA_URL | Ollama endpoint (default: http://localhost:11434) |
| OLLAMA_MODEL | Ollama model (default: phi3:mini) |
| CLOUD_PROVIDER | Cloud AI provider |
| CLOUD_API_KEY | Cloud API key |
| CLOUD_API_URL | Cloud API endpoint URL |
| CLOUD_MODEL | Cloud model name |
| GITHUB_TOKEN | GitHub token for PR integration |
| ADO_TOKEN | Azure DevOps token |
| ADO_ORG | Azure DevOps organization |

Example runtime-config.local.json

{
  "CLOUD_API_KEY": "your-api-key",
  "CLOUD_API_URL": "https://your-endpoint.openai.azure.com/openai/deployments/gpt-5-mini/chat/completions?api-version=2024-12-01-preview",
  "CLOUD_MODEL": "gpt-5-mini",
  "CLOUD_PROVIDER": "azure"
}

Custom Gate Configs

Place team-specific JSON configs in the configs/ directory:

sdlc-gate spec.md --config=configs/my-team.json --cloud

Available Phases

| Phase | Description |
|-------|-------------|
| spec-to-dev | Specification readiness for development |
| dev-to-testing | Development readiness for testing |
| testing-to-staging | Testing readiness for staging |
| staging-to-production | Staging readiness for production |
| security-compliance | Security and compliance checks |
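For a quick audit, one spec can be checked against every phase in turn. A sketch of that loop (commands are echoed rather than executed, so it runs without sdlc-gate installed):

```shell
# The five phases documented above.
phases="spec-to-dev dev-to-testing testing-to-staging staging-to-production security-compliance"

for phase in $phases; do
  echo "would run: sdlc-gate spec.md --cloud --phase=$phase"
done
```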

Requirements

  • Node.js 20+
  • Ollama (for --slm mode) or Azure OpenAI API key (for --cloud mode)

License

ISC