
ado-sync v0.1.68

ado-sync

Bidirectional sync between local test specs and Azure DevOps Test Cases.

Supports Gherkin .feature, Markdown .md, C#, Java, Python, JavaScript/TypeScript (Jest, Playwright, Cypress, Puppeteer, TestCafe, Detox), Swift (XCUITest), Kotlin/Java (Espresso), Dart (Flutter), Robot Framework, Go, RSpec, PHPUnit, Rust, CSV, and Excel.

Also available as a VS Code Extension — CodeLens, sidebar tree, and all commands without leaving the editor.


Quick start

npm install -g ado-sync
ado-sync init            # interactive wizard → creates ado-sync.json
export AZURE_DEVOPS_TOKEN=your_pat
ado-sync push --dry-run  # preview
ado-sync push            # create / update Test Cases

Minimum config:

{
  "orgUrl": "https://dev.azure.com/YOUR-ORG",
  "project": "YOUR-PROJECT",
  "auth": { "type": "pat", "token": "$AZURE_DEVOPS_TOKEN" },
  "testPlan": { "id": 12345 },
  "local": { "type": "gherkin", "include": "specs/**/*.feature" }
}
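
For non-Gherkin specs, only the local block should need to change. The "markdown" type value and glob below are assumptions inferred from the supported-formats list above, not copied from the docs, so check the configuration reference for the exact values:

```json
{
  "orgUrl": "https://dev.azure.com/YOUR-ORG",
  "project": "YOUR-PROJECT",
  "auth": { "type": "pat", "token": "$AZURE_DEVOPS_TOKEN" },
  "testPlan": { "id": 12345 },
  "local": { "type": "markdown", "include": "specs/**/*.md" }
}
```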

Commands

| Command | Purpose |
|---------|---------|
| init | Interactive config wizard |
| validate | Check config and Azure connectivity |
| push | Local specs → Azure Test Cases |
| pull | Azure Test Cases → local files |
| status | Show pending changes without modifying anything |
| diff | Field-level drift between local and Azure |
| generate | Scaffold spec files from ADO User Stories (template-first, optionally AI-enhanced) |
| publish-test-results | Publish TRX / JUnit / Playwright / Cucumber results to a Test Run |
| story-context | Show AC, suggested tags, and linked TCs for a User Story |
| coverage | Spec link rate and story coverage report |
| stale | List (and optionally retire) Azure TCs with no local spec |
| ac-gate | Validate stories have AC and linked Test Cases — CI quality gate |
| find-tagged | Find work items where a specific tag was added in the last N hours/days |
| trend | Flaky test detection and pass-rate trends over historical runs |
| watch | Auto-push on file save |
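
In CI, the read-only commands above compose into a simple quality gate. A minimal sketch using only commands documented in the table (the surrounding pipeline wiring is up to you):

```shell
# Fail the build early if config or Azure connectivity is broken
ado-sync validate

# Enforce that stories have acceptance criteria and linked Test Cases
ado-sync ac-gate

# Preview what a push would change, without touching Azure DevOps
ado-sync push --dry-run
```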


Output symbols

+  created    — new Test Case created in Azure DevOps
~  updated    — existing Test Case updated
↓  pulled     — local file updated from Azure DevOps
=  skipped    — no changes detected
!  conflict   — both sides changed (see conflictAction)
−  removed    — local scenario deleted; tagged ado-sync:removed in Azure
↓  retired    — stale TC closed (stale --retire)
✗  error      — something went wrong
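
The conflict symbol refers to the conflictAction config key mentioned above. The key name comes from this README, but the nesting under sync and the "preferLocal" value shown here are hypothetical illustrations, so confirm the allowed values in the configuration reference:

```json
{ "sync": { "conflictAction": "preferLocal" } }
```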

Documentation

| Topic | Link |
|-------|------|
| CLI reference | docs-site/docs/cli.md |
| Configuration reference | docs-site/docs/configuration.md |
| Capability roadmap | docs-site/docs/capability-roadmap.md |
| Spec file formats | docs-site/docs/spec-formats.md |
| Workflow examples (per framework + CI) | docs-site/docs/workflows.md |
| Work item links | docs-site/docs/work-item-links.md |
| Publishing test results | docs-site/docs/publish-test-results.md |
| Advanced features | docs-site/docs/advanced.md |
| AI agent setup | docs-site/docs/agent-setup.md |
| MCP Server | docs-site/docs/mcp-server.md |
| VS Code Extension | docs-site/docs/vscode-extension.md |
| Troubleshooting | docs-site/docs/troubleshooting.md |


AI providers

ado-sync supports multiple AI providers for test-step summarisation (push/pull/status), spec generation (generate), and failure analysis (publish-test-results). All provider SDKs are optional — install only what you need.

generate does not crawl the whole repo by default. It uses ADO story data first, and you can add targeted app/test context explicitly when you want richer AI-generated specs.

| Provider | Commands | SDK (install separately) | Auth |
|---|---|---|---|
| heuristic | push / pull / status | none — built-in | none |
| local | push / pull / status | node-llama-cpp (included) | GGUF model file path |
| ollama | all AI commands | npm i ollama | local Ollama server |
| docker | all AI commands | npm i openai | Docker Desktop with Model Runner — no API key |
| openai | all AI commands | npm i openai | $OPENAI_API_KEY |
| anthropic | all AI commands | npm i @anthropic-ai/sdk | $ANTHROPIC_API_KEY |
| huggingface | all AI commands | npm i openai | $HF_TOKEN |
| bedrock | all AI commands | npm i @aws-sdk/client-bedrock-runtime | AWS credential chain |
| azureai | all AI commands | npm i openai | $AZURE_OPENAI_KEY + --ai-url |
| github | all AI commands | npm i openai | $GITHUB_TOKEN (auto-detected) |
| azureinference | all AI commands | npm i @azure-rest/ai-inference @azure/core-auth | $AZURE_AI_KEY + --ai-url |

AI quick guide

  • Use heuristic for zero-setup summaries.
  • Use local for GGUF models through node-llama-cpp.
  • Use ollama if you already manage models with Ollama.
  • Use docker for Docker Model Runner — local inference via Docker Desktop, no API key required.
  • Use openai, anthropic, github, bedrock, azureai, or azureinference for hosted/provider-backed generation.
  • For generate, pass a small set of relevant files with --ai-context instead of the whole repo.

Examples:

# Fastest setup
ado-sync push --ai-provider heuristic

# Local inference via Docker Desktop (no API key)
ado-sync push --ai-provider docker --ai-model ai/llama3.2

# Hosted model
ado-sync push --ai-provider github --ai-model gpt-4o

# Repo-aware spec generation with targeted context
ado-sync generate --story-ids 1234 \
  --ai-provider openai --ai-key $OPENAI_API_KEY \
  --ai-context src/orders/** \
  --ai-context tests/orders/** \
  --ai-context docs/orders.md

Set once in ado-sync.json to apply to all commands:

{ "sync": { "ai": { "provider": "github", "model": "gpt-4o" } } }

Detailed AI setup, local-model guidance, provider-specific examples, and generate-context recommendations are covered in the docs listed under Documentation above.

LLM / AI crawlers: llms.txt contains a single-file summary of the entire project — config schema, CLI flags, ID writeback formats, and the full doc index.