Stably CLI

AI-powered E2E Playwright testing CLI. Stably can understand your codebase, edit/run tests, and handle complex test scenarios for you.
This package extends Playwright with new AI functionality. To get started quickly, see the AI-assisted setup guide; otherwise, read on below.
Installation
npm i -g stably

🎭 Note

We let you bring your own Playwright version. This means Playwright must first be set up (our CLI can help you do that).
Usage
Below is a short list of common commands. For the complete list, run stably --help.

- npx stably: starts our REPL, which helps with test creation or modification
- npx stably test: runs tests locally or in CI
- npx stably --help: prints the full list of commands
Authentication
Stably CLI supports two auth modes:
- API key via env vars (highest priority): set STABLY_API_KEY and STABLY_PROJECT_ID
- Browser login (OAuth): run npx stably login (credentials are stored locally)
If both are present, the CLI will honor the environment variables and warn that stored OAuth credentials are being ignored. To switch back to browser login, unset the env vars.
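To make the precedence rule concrete, here is a minimal shell sketch of switching between the two modes (the key and project id values are placeholders, not real credentials):

```shell
# Mode 1: API key via environment variables (highest priority).
# Placeholder values -- substitute your real key and project id.
export STABLY_API_KEY="your-api-key"
export STABLY_PROJECT_ID="your-project-id"

# While these are set, the CLI ignores any stored OAuth credentials.

# Mode 2: switch back to browser (OAuth) login by unsetting them first:
unset STABLY_API_KEY STABLY_PROJECT_ID
# npx stably login   # opens a browser; credentials are stored locally
```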
Cheap local/CI agent testing (Ollama)
You can run the agent against Ollama via its Anthropic-compatible API (cheap, local, and deterministic enough for smoke tests).
This is dev-only behavior (it is ignored in NODE_ENV=production builds).
- STABLY_BYPASS_AI_PROXY=1 (dev-only; ensures we don't override ANTHROPIC_BASE_URL)
- STABLY_USE_OLLAMA=1 (dev-only; ignored when NODE_ENV=production)
- STABLY_AGENT_MODEL=qwen2.5-coder:0.5b (fastest/cheapest suggested default)
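Combining the three variables, an export-based dev-shell setup might look like this (the model name is just the suggested default above):

```shell
# Dev-only: route the agent to a local Ollama instead of the hosted proxy.
export STABLY_BYPASS_AI_PROXY=1                 # don't override ANTHROPIC_BASE_URL
export STABLY_USE_OLLAMA=1                      # ignored when NODE_ENV=production
export STABLY_AGENT_MODEL="qwen2.5-coder:0.5b"  # suggested cheap default
```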
Ollama: fastest local model (recommended)
If you want a convenience helper that starts Ollama (if needed), pulls the model, and prints the exact env vars to use:
pnpm ollama:dev-env

Notes:
- On Linux, this script will try to install Ollama automatically (requires sudo or root).
- On macOS, it will try Homebrew if available (brew install ollama / brew install --cask ollama); otherwise it will prompt you to install Ollama manually.
- Start Ollama and pull the tiny model:

ollama serve
ollama pull qwen2.5-coder:0.5b

- Run the CLI against Ollama (Anthropic-compatible API):

STABLY_BYPASS_AI_PROXY=1 STABLY_USE_OLLAMA=1 STABLY_AGENT_MODEL=qwen2.5-coder:0.5b pnpm dev
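Before running the CLI, you can sanity-check that the model was actually pulled by querying Ollama's standard /api/tags endpoint on its default port; the has_model helper below is our own illustration, not part of the Stably CLI:

```shell
# Rough check: grep Ollama's model-list JSON for a model name.
# (Not a full JSON parse -- good enough for a smoke test.)
has_model() {
  printf '%s' "$1" | grep -q "\"name\":\"$2\""
}

# Ollama listens on localhost:11434 by default; '|| true' keeps the
# script alive if the server isn't running yet.
tags_json="$(curl -s http://localhost:11434/api/tags || true)"
if has_model "$tags_json" "qwen2.5-coder:0.5b"; then
  echo "qwen2.5-coder:0.5b is pulled and ready"
else
  echo "model not found -- run: ollama pull qwen2.5-coder:0.5b"
fi
```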