Monitor CLI
A powerful CLI tool to monitor React/Vite + Firebase Auth sites with LLM-powered baseline generation and automated content verification.
Features
- 🔍 Automated Page Discovery - Automatically crawl and discover all pages on your site
- 🤖 LLM-Powered Baselines - Use AI to generate intelligent baselines for page content
- 🔐 Secure Authentication - Persistent browser sessions with Firebase Auth support
- 🎯 Flexible Scan Modes - Monitor auth-only, public, or both page types
- 📊 Rich Reporting - HTML dashboards with similarity scores and detailed comparisons
- 🚀 Fast Verification - Deterministic checks (no LLM calls after baseline generation)
- 💾 SQLite Database - Track history, baselines, and check results
- 🔒 Secure Storage - All credentials stored in system keychain
Installation
Install from npm (Recommended)
# Install globally
npm install -g @mrzacsmith/gai-up-cli
# Install Playwright browsers (required)
# Use -p flag to explicitly use the playwright package from the global installation
npx -p playwright playwright install chromium
# Verify installation
monitor --help
Install from Source
# Clone the repository
git clone <repository-url>
cd gai-up-cli
# Install dependencies
npm install
# Build the project
npm run build
# Install globally (optional)
npm install -g .
Quick Start
1. Configure LLM Provider
First, set up your LLM API key. OpenRouter is recommended for cost-effective models:
# Set OpenRouter API key (recommended)
monitor config llm --provider openrouter --api-key <your-key>
# Or use OpenAI
monitor config llm --provider openai --api-key <your-key>
# Or use Anthropic
monitor config llm --provider anthropic --api-key <your-key>
# View current configuration
monitor config llm show
2. Discover Pages
The discover command automatically finds all pages on your site and detects login URLs:
# Discover all pages (auto-adds site if it doesn't exist)
monitor discover scan gauntletai.com
# With custom depth
monitor discover scan gauntletai.com --max-depth 5
# Update existing pages
monitor discover scan gauntletai.com --update
What it does:
- Automatically adds the site if it doesn't exist
- Crawls the site to discover all pages
- Detects login URLs automatically
- Flags pages as requiring authentication or public
3. Authenticate (if needed)
If your site requires authentication, log in once. The session will persist automatically:
# Interactive login (opens browser)
monitor auth login gauntletai.com
Note: You only need to log in once! The CLI saves your session and uses it automatically for all future runs.
4. Generate Baselines
Use the LLM to analyze pages and create baselines:
# Learn all pages
monitor learn gauntletai.com
# Learn specific page
monitor learn gauntletai.com --page /about
# Learn only auth pages
monitor learn gauntletai.com --auth-only
# Learn only public pages
monitor learn gauntletai.com --no-auth
# Update existing baseline
monitor learn gauntletai.com --page /about --update
5. Run Checks
Verify pages against their baselines:
# Check all sites
monitor check
# Check specific site
monitor check gauntletai.com
# Check specific page
monitor check gauntletai.com --page /about
# Check only auth pages
monitor check gauntletai.com --auth-only
# Check only public pages
monitor check gauntletai.com --no-auth
6. View Dashboard
View the HTML dashboard with detailed results:
# Open dashboard in browser
monitor report dashboard gauntletai.com
# View latest report status
monitor report status
# View check history
monitor report history gauntletai.com
Complete Workflow Example
# 1. Configure LLM
monitor config llm --provider openrouter --api-key sk-or-v1-xxx
# 2. Discover pages (auto-detects login URL)
monitor discover scan gauntletai.com
# 3. Authenticate (if needed)
monitor auth login gauntletai.com
# 4. Generate baselines
monitor learn gauntletai.com
# 5. Run checks
monitor check gauntletai.com
# 6. View dashboard
monitor report dashboard gauntletai.com
Commands Reference
Site Management
# Add a site manually
monitor site add <domain> --name "Site Name" --base-url <url> [--login-url <url>]
# List all sites
monitor site list
# Show site details
monitor site info <domain>
# Remove a site (requires --force)
monitor site remove <domain> --force
# Reset a site (clears all data, keeps site entry)
monitor site reset <domain> --force
# Manage pages
monitor site pages list <domain>
monitor site pages add <domain> --page <path>
monitor site pages remove <domain> --page <path>
Configuration
# LLM Configuration
monitor config llm --provider <openrouter|openai|anthropic> --api-key <key>
monitor config llm --model <model-name>
monitor config llm show
monitor config llm clear
# Reports Directory
monitor config reports set <path>
monitor config reports show
# Data Directory (database location)
monitor config data set <path>
monitor config data show
# Browser Profiles Directory
monitor config browser set <path>
monitor config browser show
Authentication
# Interactive login (opens browser)
monitor auth login <domain>
# Check auth status
monitor auth status
# Refresh authentication
monitor auth refresh
# Manage credentials (for non-Firebase auth)
monitor auth credentials set <domain> --username <user> --password <pass>
monitor auth credentials remove <domain>
Discovery
# Discover pages on a site
monitor discover scan <domain>
# Options:
# --max-depth <depth> Maximum crawl depth (default: 5)
# --update Update existing pages
Learning (Baseline Generation)
# Generate baselines for all pages
monitor learn <domain>
# Options:
# --page <path> Learn a specific page only
# --update Update existing baseline
# --auth-only Only learn pages requiring authentication
# --no-auth Only learn public/non-auth pages
# --both Learn both auth and non-auth pages (default)
# Manage baselines
monitor learn baseline show <domain> --page <path>
monitor learn baseline reset <domain> [--page <path>]
Checking
# Check all sites
monitor check
# Check specific site
monitor check <domain>
# Options:
# --page <path> Check a specific page only
# --auth-only Only check pages requiring authentication
# --no-auth Only check public/non-auth pages
# --both Check both (default)
# --watch Continuous monitoring mode
# --interval <interval> Check interval (e.g., 5m, 1h)
# --parallel Run checks in parallel
Reporting
# View HTML dashboard
monitor report dashboard <domain> [--report-file <file>]
# Show last check results
monitor report status
# View check history
monitor report history <domain>
# Export report
monitor report export <domain> --format <html|json> [--output <path>]
Update
# Check for updates and update to latest version
monitor update
# Or use the alias
monitor up
Help
Get help for any command:
# General help
monitor help
# Help for specific command
monitor help <command>
monitor <command> --help
Configuration File
Create a .monitorrc.yml file in your project root (optional):
# LLM Configuration
llm:
  provider: "openrouter" # openrouter, openai, or anthropic
  model: "gpt-4o-mini"   # Model name
  temperature: 0.3
  max_tokens: 2000

# Browser Settings
browser:
  headless: true
  timeout: 30000
  viewport:
    width: 1280
    height: 720
  directory: "./browser-profiles" # Browser profiles directory

# Reports Directory
reports:
  directory: "./monitor-reports"

# Data Directory (database location)
data:
  directory: "./data"
Note: API keys are NEVER stored in config files. They are stored securely in your system keychain.
How It Works
1. Discovery Phase
- Crawls your site to find all pages
- Automatically detects login URLs
- Flags pages as requiring authentication or public
- Stores page metadata in SQLite database
2. Learning Phase
- Uses LLM to analyze each page's content
- Generates structured baseline JSON with:
- Page title expectations
- Navigation items
- Critical sections
- Important links
- Footer content
- Stores baselines in database
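For illustration, a generated baseline for a page might look like the following. The field names here are hypothetical; the actual JSON schema produced by the LLM may differ:

```json
{
  "page": "/about",
  "title": "About Us",
  "navigation": ["Home", "About", "Blog", "Contact"],
  "critical_sections": ["Our Mission", "Team"],
  "important_links": ["/contact", "/careers"],
  "footer": "© 2024 Example, Inc."
}
```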
3. Checking Phase
- Verifies pages against baselines using deterministic code
- Calculates similarity scores (0-100%)
- Reports status: match (≥ 90%), partial (≥ 60%), changed (< 60%)
- Generates detailed comparison reports
- Saves scan reports as JSON and HTML
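The status thresholds above can be sketched in TypeScript. This is a simplified illustration of the deterministic classification step, not the CLI's actual implementation:

```typescript
// Map a page's similarity score (0-100) to a check status,
// using the thresholds documented above.
type CheckStatus = "match" | "partial" | "changed";

function classify(similarity: number): CheckStatus {
  if (similarity >= 90) return "match";   // match: >= 90%
  if (similarity >= 60) return "partial"; // partial: >= 60%
  return "changed";                       // changed: < 60%
}
```

For example, a page scoring 87% would be reported as partial: present but noticeably different from its baseline.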
4. Reporting Phase
- Generates HTML dashboard with:
- Summary statistics
- Per-page results with similarity scores
- Detailed comparisons showing what changed
- Historical scan selection
- Stores reports in monitor-reports/{domain}/
Authentication Persistence
The CLI uses persistent browser profiles to maintain authentication:
- First login: Run monitor auth login <domain> once
- Automatic reuse: All subsequent commands use the saved session
- Firebase Auth: Tokens stored in IndexedDB are automatically persisted
- Re-login needed: Only if tokens expire (usually 30-60 days) or the profile is deleted
Scan Report Format
Reports are saved with the format: {domain}-MM-DD-YYYY-hh-mm.json
Example: gauntletai-01-15-2024-14-30.json
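The filename format can be expressed as a small TypeScript sketch (an illustration of the documented naming scheme, with a hypothetical helper name):

```typescript
// Build a scan-report filename in the documented
// {domain}-MM-DD-YYYY-hh-mm.json format.
function reportFilename(domain: string, date: Date): string {
  const pad = (n: number): string => String(n).padStart(2, "0");
  const mm = pad(date.getMonth() + 1); // months are 0-based in JS
  const dd = pad(date.getDate());
  const yyyy = date.getFullYear();
  const hh = pad(date.getHours());     // 24-hour clock
  const min = pad(date.getMinutes());
  return `${domain}-${mm}-${dd}-${yyyy}-${hh}-${min}.json`;
}

// reportFilename("gauntletai", new Date(2024, 0, 15, 14, 30))
// → "gauntletai-01-15-2024-14-30.json"
```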
The dashboard allows you to:
- View latest scan results
- Select and view previous scans
- See similarity scores and detailed comparisons
- Export reports in HTML or JSON format
Troubleshooting
Login URL Not Detected
If the discover command doesn't detect your login URL, you can set it manually:
monitor site add <domain> --name "Site" --base-url <url> --login-url <login-url>
Authentication Not Persisting
If authentication isn't persisting:
- Check that the browser profile directory exists
- Try logging in again: monitor auth login <domain>
- Check the browser profile path: monitor site info <domain>
Pages Not Being Discovered
If pages aren't being discovered:
- Increase the max depth: monitor discover scan <domain> --max-depth 7
- Check whether pages require authentication (you may need to log in first)
- Verify the site is accessible
Baseline Generation Failing
If baseline generation fails:
- Check your LLM API key: monitor config llm show
- Verify you have API credits/quota
- Check network connectivity
- Try with a specific page: monitor learn <domain> --page <path>
Project Structure
gai-up-cli/
├── src/
│ ├── cli/ # CLI commands
│ ├── core/ # Core functionality (browser, crawler, verifier, llm)
│ ├── db/ # Database schema and queries
│ ├── utils/ # Utilities (config, reports, markdown)
│ └── schemas/ # Zod schemas (if needed)
├── bin/ # Executable entry point
├── data/ # SQLite database (monitor.db)
├── browser-profiles/ # Browser session data (per site)
├── monitor-reports/ # Scan reports and dashboards (per domain)
├── .monitorrc.yml # Configuration file (optional)
└── package.json
Development
# Install dependencies
npm install
# Build TypeScript
npm run build
# Development mode (watch)
npm run dev
# Run CLI locally
node bin/monitor.js <command>
Security
- Credentials: All API keys, usernames, and passwords stored in system keychain (keytar)
- Config Files: Never contain credentials, only settings
- Database: Contains only metadata, no sensitive data
- Browser Profiles: Contain session data (should be gitignored)
- Reports: May contain page content (should be reviewed before sharing)
License
ISC
