
vibelinter

v0.0.4

VibeLinter - SEO Compliance Tool

A powerful CLI tool for comprehensive SEO compliance checking of websites, supporting both production domains and local development environments.

Features

  • 🕷️ Smart Crawler: Traverses a site's internal links and checks every discovered page
  • 📊 Comprehensive Checks: Analyzes meta tags, heading hierarchy, image optimization, and other SEO factors
  • 🔗 Link Validation: Verifies that internal and external links are reachable
  • 📈 Multiple Report Formats: Console, JSON, and HTML output
  • 🌐 HTTP Server: Serves reports at the /check endpoint
  • ⚙️ Flexible Configuration: Custom rule configuration
  • ⏱️ Timeout Protection: Smart timeouts and error recovery keep the crawler from hanging
  • 📱 Interactive Reports: Collapsible interface with expandable detail views
  • 🤖 Agent Mode: Structured output for AI tools and automated workflows

Installation

# Use directly with npx (recommended)
npx vibelinter check https://example.com

# Or install globally
npm install -g vibelinter

For local development:

git clone <repository>
cd vibelinter
npm install
npm run build

Usage

Basic Usage

# Check website
vibelinter check https://example.com

# Check localhost
vibelinter check http://localhost:3000

# Specify crawler depth
vibelinter check https://example.com --depth 5

# Use custom configuration
vibelinter check https://example.com --config ./vibelinter.config.cjs

# Output HTML report
vibelinter check https://example.com --output html --out-file report.html

# Output JSON format
vibelinter check https://example.com --output json

# Agent mode for AI integration
vibelinter check https://example.com --mode=agent

# Start server after check
vibelinter check https://example.com --server

# Start report server separately
vibelinter server --port 3000

View All Rules

vibelinter rules

Configuration File

Create a vibelinter.config.cjs file in your project root:

module.exports = {
  rules: {
    'meta/title': {
      enabled: true,
      severity: 'error',
      minLength: 30,
      maxLength: 60
    },
    'images/alt-text': {
      enabled: true,
      severity: 'warning'
    }
    // ... more rule configurations
  },
  crawler: {
    maxDepth: 3,
    rateLimit: 100,
    userAgent: 'VibeLinter/0.0.2 (https://vibelinter.com)',
    timeout: 30000,
    followRedirects: true,
    respectRobotsTxt: true,
    concurrency: 5
  },
  exclude: ['/admin/*', '/api/*', '*.pdf'],
  include: ['/*'],
  server: {
    enabled: false,
    port: 3000,
    host: 'localhost'
  }
}

SEO Check Rules

Meta Tags

  • Title: Length and uniqueness validation
  • Description: Length and content quality checks
  • Canonical: URL canonicalization validation
  • Robots: Crawler directive validation
  • Viewport: Mobile optimization checks
  • Favicon: Favicon presence and format

Headings

  • H1: Uniqueness and length validation
  • Hierarchy: Heading structure validation
  • Keywords: Keyword usage analysis

Images

  • Alt Text: Alternative text validation
  • File Size: File size optimization checks
  • Format: Modern format usage validation
  • Lazy Loading: Lazy loading implementation checks
  • Accessibility: Image accessibility compliance

Social Media

  • Social Links: Social media presence validation
  • Open Graph: OG meta tags validation
  • OG Image: Social sharing image validation

Legal & Compliance

  • Privacy Policy: Privacy policy link validation
  • Terms of Service: Terms link validation

Performance

  • Core Web Vitals: Performance metrics validation

Accessibility

  • WCAG Compliance: Web accessibility guidelines

Structured Data

  • 🔄 Schema Markup: Structured data validation (in development)

Security

  • HTTPS Security: SSL/TLS security checks

Crawling

  • Robots.txt: Robots.txt file validation

HTTP Server

VibeLinter can start an HTTP server to view reports:

# Start server after check
vibelinter check https://example.com --server

# Or start server separately
vibelinter server

Then visit http://localhost:3000/check to view the latest SEO report.

API Endpoints

  • GET /check - View HTML format report
  • GET /check/json - View JSON format report
  • POST /check - Upload new report data
  • GET /health - Health check

Agent Mode

VibeLinter supports agent mode for AI tools and automated workflows:

vibelinter check https://example.com --mode=agent

Agent mode provides structured output with:

  • Clear error/warning categorization
  • Actionable fix suggestions
  • Machine-readable format
  • Detailed issue descriptions

Example Output

Console Output

🔍 Starting VibeLinter check...
Crawling https://example.com with depth 3...
✅ Crawl completed: 3 pages processed, 5 external links found

=== SEO Check Report ===

Summary:
  Total Pages: 3
  Total Issues: 5
  Errors: 1
  Warnings: 3
  Info: 1

📄 Homepage
  URL: https://example.com
  Load Time: 250ms
  
  Errors:
    ❌ [meta/title] Missing <title> tag
  
  Warnings:
    ⚠️ [images/alt-text] Missing alt attribute
       Current: /images/hero.jpg

Agent Mode Output

SITE: https://example.com
ERRORS: 1
WARNINGS: 3

## ERRORS TO FIX:
1. [meta/title] Missing <title> tag
   Page: /
   Fix: Add <title> tag to document head

## WARNINGS TO REVIEW:
1. [images/alt-text] Missing alt attribute (3 images)
   Page: /
   Suggestion: Add descriptive alt text to all images

JSON Output

{
  "pages": [...],
  "summary": {
    "totalPages": 3,
    "totalIssues": 5,
    "errorCount": 1,
    "warningCount": 3,
    "infoCount": 1
  },
  "externalLinks": [...]
}

Configuration Options

Rule Configuration

Each rule supports the following configuration:

  • enabled: Whether to enable the rule
  • severity: Issue severity level ('error' | 'warning' | 'info')
  • Rule-specific parameters (such as length limits)
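
For example, tightening one rule while downgrading another might look like this in vibelinter.config.cjs (rule names are taken from the configuration example above; treat the exact parameter values as illustrative):

```javascript
// Illustrative per-rule overrides; rule names follow the configuration example above.
module.exports = {
  rules: {
    'meta/title': {
      enabled: true,
      severity: 'error',   // fail the check on bad titles
      minLength: 40,       // stricter than the 30 shown in the example above
      maxLength: 60
    },
    'images/alt-text': {
      enabled: true,
      severity: 'info'     // report, but do not warn
    }
  }
}
```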

Crawler Configuration

  • maxDepth: Maximum crawl depth
  • rateLimit: Request interval (milliseconds)
  • userAgent: User agent string
  • timeout: Request timeout
  • followRedirects: Whether to follow redirects
  • respectRobotsTxt: Whether to respect robots.txt
  • concurrency: Number of concurrent requests

Filter Configuration

  • exclude: URL patterns to exclude
  • include: URL patterns to include
  • internalLinkWhitelist: Whitelisted internal link patterns
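
internalLinkWhitelist does not appear in the full configuration example above; a hypothetical filter section combining all three options might look like this (the pattern values are made up for illustration):

```javascript
// Illustrative filter section; the pattern values are made up for this example.
module.exports = {
  include: ['/*'],                      // URL patterns to crawl
  exclude: ['/admin/*', '*.pdf'],      // URL patterns to skip
  internalLinkWhitelist: ['/beta/*']   // internal link patterns to whitelist
}
```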

Development

# Install dependencies
npm install

# Development mode
npm run dev

# Build
npm run build

# Run tests
npm test

# Code linting
npm run lint

# Create release
npm run release:patch

Error Handling and Timeout Mechanism

VibeLinter includes comprehensive error handling and timeout protection to keep the crawler from hanging or waiting indefinitely:

Timeout Protection

  • Per-page timeout: 30 seconds per page request
  • Total crawl timeout: 5 minutes for the entire crawl
  • External link timeout: 8 seconds per external link check
  • Page limit: At most 100 pages per crawl

Error Recovery

  • Network errors: Identifies and handles connection timeouts, unresolved domains, and similar failures
  • Content validation: Checks the response content type and filters out non-HTML content
  • File size limits: Caps a single page at 10 MB
  • Rate limiting: Spaces out requests to avoid overloading the target site

User-Friendly Feedback

  • Real-time progress: Shows the page currently being processed and overall progress
  • Interrupt handling: Stops gracefully on Ctrl+C
  • Detailed error information: Reports the specific cause of a failure and suggests fixes

Usage Recommendations

If you encounter crawling issues, try:

# Reduce crawl depth
vibelinter check https://example.com --depth 1

# Increase request interval
vibelinter check https://example.com --config custom-config.cjs

# Adjust concurrency
vibelinter check https://example.com --concurrency 3

In custom-config.cjs:

module.exports = {
  crawler: {
    rateLimit: 1000, // 1 second interval
    timeout: 10000,   // 10 second timeout
    concurrency: 3   // 3 concurrent connections
  }
}

License

MIT

Contributing

Issues and Pull Requests are welcome!

Changelog

v0.0.2

  • Enhanced error handling and timeout mechanisms
  • Added agent mode for AI integration
  • Improved cross-platform compatibility
  • Added comprehensive test suite
  • Enhanced social media validation
  • Added accessibility checks
  • Improved mobile SEO validation

v0.0.1

  • Initial release
  • Basic SEO checking functionality
  • Multiple output formats support
  • HTTP server mode support