@jaak.ai/visage

A Spanish version of this README is also available (Versión en Español).

Description

Jaak Visage is a web component for face identification and biometric verification. It provides real-time face detection, optimal positioning guidance, and automatic video capture when facial biometrics meet specified quality criteria.

Key Features

  • Real-time face detection and tracking
  • Automatic camera access and management
  • Multi-camera support with automatic switching
  • Optimal positioning guidance with visual feedback
  • Quality-based video capture (area, rotation, stability)
  • Base64 video output for easy integration
  • Responsive design with mobile support

Typical Use Cases

  • Identity verification systems
  • Biometric authentication workflows
  • Document verification processes
  • User onboarding with identity validation
  • Security access control systems
  • Remote identity confirmation

Installation

NPM/Yarn

npm install @jaak.ai/visage
yarn add @jaak.ai/visage
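
When installing from npm for use with a bundler, the custom element also needs to be registered once at application startup. The framework examples below use the package's loader entry point for this; the same call works in plain JavaScript:

// Register the <jaak-visage> custom element (same loader used in the framework examples below)
import { defineCustomElements } from '@jaak.ai/visage/loader';

defineCustomElements();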

CDN

<script type="module" src="https://unpkg.com/@jaak.ai/visage@latest/dist/jaak-visage-webcomponent/jaak-visage-webcomponent.esm.js"></script>

API Reference

Props/Attributes

License Properties (Required)

| Property | Type | Default | Description |
|----------|------|---------|-------------|
| license | string | undefined | [REQUIRED] License key for component authentication |
| licenseEnvironment | 'dev' \| 'qa' \| 'sandbox' \| 'prod' | 'prod' | API environment for license validation |
| appId | string | 'jaak-visage-web' | Application identifier for analytics and tracking |
| traceId | string | undefined | Optional trace ID for distributed tracing (auto-generated if not provided) |

Base Configuration Properties

| Property | Type | Default | Description |
|----------|------|---------|-------------|
| debug | boolean | false | Enables debug logging and visual debugging information |
| camera | 'front' \| 'back' \| 'auto' | 'auto' | Camera preference for initial selection |
| detectionMode | 'full' \| 'light' \| 'auto' | 'auto' | Face detection performance mode for different device capabilities |
| captureMode | 'auto' \| 'manual' | 'auto' | Capture mode: auto for automatic detection, manual for user-triggered capture |
| language | 'es' \| 'en' | 'es' | Language for UI messages and instructions (Spanish or English) |
| showHelpOnStart | boolean | false | Automatically shows help instructions when detection starts |
| texts | VisageTexts | See below | Custom texts for UI messages and instructions |
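
A minimal usage sketch of these options. The kebab-case attribute names capture-mode and show-help-on-start are assumed by analogy with the documented detection-mode and app-id attributes, not confirmed; if in doubt, set the camelCase properties from JavaScript instead.

<!-- Sketch only: capture-mode and show-help-on-start attribute names are assumed -->
<jaak-visage
    license="your-license-key-here"
    camera="front"
    detection-mode="light"
    capture-mode="manual"
    language="en"
    show-help-on-start="true"
    debug="true">
</jaak-visage>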

Telemetry and OpenTelemetry Properties

| Property | Type | Default | Description |
|----------|------|---------|-------------|
| enableTelemetry | boolean | true | Enables distributed tracing with OpenTelemetry |
| telemetryCollectorUrl | string | 'https://collector.jaak.ai/v1/traces' | OTLP collector URL for distributed traces |
| enableMetrics | boolean | true | Enables metrics export to OpenTelemetry |
| metricsCollectorUrl | string | 'https://collector.jaak.ai/v1/metrics' | OTLP collector URL for metrics |
| metricsExportIntervalMillis | number | 60000 | Metrics export interval in milliseconds |
| propagateTraceHeaderCorsUrls | string | undefined | Comma-separated URLs for W3C Trace Context header propagation |

Context Properties

| Property | Type | Default | Description |
|----------|------|---------|-------------|
| customerId | string | undefined | Customer ID for telemetry and analytics |
| tenantId | string | undefined | Tenant ID for multi-tenancy support |
| environment | string | 'production' | Execution environment (development/staging/production) |
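
Because the telemetry and context options are exposed as component properties, a safe way to configure them is to assign the properties directly from JavaScript once the element exists. In this sketch the element id and the identifier values are hypothetical; the property names are the ones listed in the tables above.

const detector = document.getElementById('faceDetector');

// Telemetry configuration (properties from the tables above)
detector.enableTelemetry = true;
detector.metricsExportIntervalMillis = 30000;
detector.propagateTraceHeaderCorsUrls = 'https://api.example.com'; // hypothetical backend URL

// Context properties for analytics and multi-tenancy (hypothetical values)
detector.customerId = 'customer-123';
detector.tenantId = 'tenant-abc';
detector.environment = 'staging';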

Detection Modes

The detectionMode property allows you to optimize performance based on device capabilities:

  • 'full': Maximum accuracy with a custom high-precision model and the WebGL backend. Best for desktop computers and high-end devices.
  • 'light': Optimized for mobile devices with limited resources. Uses a lighter model, the CPU backend, reduced resolution (640px), and frame skipping for improved performance.
  • 'auto' (Default): Automatically detects device capabilities (memory, CPU cores, mobile detection) and selects the optimal mode. Chooses 'light' for devices with ≤3 GB RAM, ≤4 CPU cores, or mobile devices.

Compared to full mode, light mode offers:

  • Reduced memory usage through lower-resolution processing
  • Better battery life on mobile devices
  • Good detection accuracy maintained for most use cases
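
For example, a flow aimed at low-end mobile devices can force light mode explicitly, while most integrations can keep the default automatic selection; the detection-mode attribute below is the same one used in the implementation examples.

<!-- Force light mode on resource-constrained devices -->
<jaak-visage license="your-license-key-here" detection-mode="light"></jaak-visage>

<!-- Default: let the component pick the optimal mode -->
<jaak-visage license="your-license-key-here" detection-mode="auto"></jaak-visage>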

Language Support

The language property allows you to set the language for all UI messages and instructions:

  • 'es' (Default): Spanish - All texts are displayed in Spanish
  • 'en': English - All texts are displayed in English

Usage:

<!-- Spanish (default) -->
<jaak-visage language="es"></jaak-visage>

<!-- English -->
<jaak-visage language="en"></jaak-visage>

The language setting affects all built-in texts including instructions, error messages, position feedback, and button labels. You can still override specific texts using the texts property if you need custom translations or variations.

Text Customization

The texts property allows you to customize all user-facing messages and instructions. You can provide a partial configuration object, and any missing properties will use the default values based on the selected language.

TypeScript Interface:

interface VisageTexts {
  manualInstructions?: {
    modeTitle?: string;               // Default: "Modo de captura manual"
    modeDescription?: string;          // Default: "En modo manual, posicione su rostro..."
    positionPrompt?: string;          // Default: "Posicione su rostro dentro del marco..."
    lowResourcePrompt?: string;       // Default: "Dispositivo de bajos recursos detectado..."
    modelFailurePrompt?: string;      // Default: "No se pudo cargar el modelo de detección..."
    startRecordingButton?: string;    // Default: "Iniciar grabación"
    recIndicator?: string;            // Default: "REC"
  };
  positionFeedback?: {
    noFace?: string;                  // Default: "Posicione su rostro dentro del marco..."
    multipleFaces?: string;           // Default: "Asegúrese de que solo una persona..."
    tooClose?: string;                // Default: "Aléjese un poco de la cámara"
    tooFar?: string;                  // Default: "Acérquese más a la cámara"
    notCentered?: string;             // Default: "Centre su rostro y mire directamente..."
    perfect?: string;                 // Default: "Posición perfecta. Puede capturar ahora"
  };
  errorMessages?: {
    noFaceDetected?: string;          // Default: "Posicione su rostro dentro del marco..."
    multipleFacesDetected?: string;   // Default: "Se detectaron múltiples rostros..."
    faceNotVisible?: string;          // Default: "Acérquese más a la cámara..."
    faceTooCLose?: string;            // Default: "Aléjese un poco de la cámara..."
    faceNotCentered?: string;         // Default: "Mantenga su rostro centrado..."
    noFaceWithLighting?: string;      // Default: "No se detectó ningún rostro..."
  };
  cameraMenu?: {
    selectCamera?: string;            // Default: "Select Camera"
    cameraLabel?: string;             // Default: "Camera"
  };
}

Example Usage (Partial Configuration):

const customTexts = {
  manualInstructions: {
    startRecordingButton: "Start Recording",
    recIndicator: "REC"
  },
  positionFeedback: {
    perfect: "Perfect position. You can capture now",
    tooFar: "Move closer to the camera"
  }
};

// Apply to the component instance
const detector = document.querySelector('jaak-visage');
detector.texts = customTexts;

Methods

| Method | Parameters | Returns | Description |
|--------|------------|---------|-------------|
| start() | none | Promise<void> | Starts the camera and face detection |
| stop() | none | Promise<void> | Stops the camera and clears detections |
| restart() | none | Promise<void> | Restarts the component (stop + start) |
| preload() | none | Promise<void> | Preloads ML models without starting the camera |
| startCamera() | none | Promise<void> | Starts the camera stream only |
| stopCamera() | none | Promise<void> | Stops the camera stream only |
| switchCamera(deviceId?) | deviceId?: string | Promise<void> | Switches to the specified camera or cycles through available cameras |
| showHelp() | none | Promise<void> | Shows the help instructions overlay |
| getCameraDevices() | none | Promise<MediaDeviceInfo[]> | Gets the list of available camera devices |
| getAvailableCameras() | none | Promise<MediaDeviceInfo[]> | Returns cached available cameras |
| getCurrentCameraId() | none | Promise<string> | Returns the currently selected camera device ID |
| captureNow() | none | Promise<void> | Triggers manual capture immediately (manual mode only) |
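
A short sketch of a typical method sequence: preload the models, start detection, optionally switch cameras, and (with captureMode set to 'manual') trigger the capture from your own UI. The element and button ids are hypothetical; the methods are the ones listed above, and the snippet assumes a module context for top-level await.

const detector = document.getElementById('faceDetector');

// Warm up the ML models before the user reaches the capture step
await detector.preload();

// Start the camera and face detection
await detector.start();

// List available cameras and switch to a specific device
const cameras = await detector.getCameraDevices();
if (cameras.length > 1) {
  await detector.switchCamera(cameras[1].deviceId);
}

// With captureMode = 'manual', trigger the capture from your own button
document.getElementById('captureBtn').addEventListener('click', () => {
  detector.captureNow();
});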

Events

| Event | Payload | Description |
|-------|---------|-------------|
| videoCaptured | {base64: string} | Fired when video capture is completed with base64 data |
| statusUpdated | {status: string, message: string} | Fired when the component status changes |
| traceIdGenerated | {traceId: string} | Fired after license validation with the generated or provided trace ID for request tracing |

CSS Custom Properties

| Property | Default | Description |
|----------|---------|-------------|
| --jaak-visage-primary-color | #5cb85c | Primary color for successful states |
| --jaak-visage-error-color | #ff0000 | Color for error states and warnings |
| --jaak-visage-background | #000000 | Component background color |
| --jaak-visage-text-color | #ffffff | Text color for overlays |
| --jaak-visage-overlay-bg | rgba(0,0,0,0.8) | Background for instruction overlays |

Implementation Examples

Vanilla JavaScript

<!DOCTYPE html>
<html>
<head>
    <script type="module" src="https://unpkg.com/@jaak.ai/visage@latest/dist/jaak-visage-webcomponent/jaak-visage-webcomponent.esm.js"></script>
</head>
<body>
    <jaak-visage
        id="faceDetector"
        license="your-license-key-here"
        environment="prod"
        app-id="my-custom-app"
        debug="true"
        detection-mode="auto">
    </jaak-visage>

    <script>
        const detector = document.getElementById('faceDetector');

        // Handle trace ID generation
        detector.addEventListener('traceIdGenerated', (event) => {
            console.log('Trace ID for this session:', event.detail.traceId);
            // Use this trace ID for correlating requests across your system
        });

        // Listen for video capture
        detector.addEventListener('videoCaptured', (event) => {
            console.log('Video captured:', event.detail.base64);
            // Send base64 to your server
        });

        // Listen for status updates
        detector.addEventListener('statusUpdated', (event) => {
            console.log('Status:', event.detail.status, event.detail.message);
        });

        // Start face detection
        detector.start();
    </script>
</body>
</html>

React

import React, { useRef, useEffect } from 'react';

// Import the component
import { defineCustomElements } from '@jaak.ai/visage/loader';

// Register the custom elements
defineCustomElements();

interface JaakVisageElement extends HTMLElement {
    start: () => Promise<void>;
    stop: () => Promise<void>;
    restart: () => Promise<void>;
    debug: boolean;
    camera: 'front' | 'back' | 'auto';
    detectionMode: 'full' | 'light' | 'auto';
}

const FaceVerification: React.FC = () => {
    const visageRef = useRef<JaakVisageElement>(null);

    useEffect(() => {
        const handleTraceIdGenerated = (event: CustomEvent) => {
            console.log('Trace ID:', event.detail.traceId);
        };

        const handleVideoCapture = (event: CustomEvent) => {
            console.log('Video captured:', event.detail.base64);
            // Process the base64 video data
        };

        const handleStatusUpdate = (event: CustomEvent) => {
            console.log('Status update:', event.detail);
        };

        const element = visageRef.current;
        if (element) {
            element.addEventListener('traceIdGenerated', handleTraceIdGenerated);
            element.addEventListener('videoCaptured', handleVideoCapture);
            element.addEventListener('statusUpdated', handleStatusUpdate);

            // Start the component
            element.start();
        }

        return () => {
            if (element) {
                element.removeEventListener('traceIdGenerated', handleTraceIdGenerated);
                element.removeEventListener('videoCaptured', handleVideoCapture);
                element.removeEventListener('statusUpdated', handleStatusUpdate);
                element.stop();
            }
        };
    }, []);

    return (
        <div>
            <h2>Face Verification</h2>
            <jaak-visage
                ref={visageRef}
                license="your-license-key-here"
                environment="prod"
                app-id="my-react-app"
                trace-id=""  // Leave empty to auto-generate
                debug={false}
                camera="front"
                detection-mode="auto"
            />
        </div>
    );
};

export default FaceVerification;

Angular

// app.module.ts
import { NgModule, CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';

import { AppComponent } from './app.component';

// Import the component loader
import { defineCustomElements } from '@jaak.ai/visage/loader';

// Register custom elements
defineCustomElements();

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule],
  providers: [],
  bootstrap: [AppComponent],
  schemas: [CUSTOM_ELEMENTS_SCHEMA] // Allow custom elements
})
export class AppModule { }

// face-verification.component.ts
import { Component, ElementRef, ViewChild, AfterViewInit } from '@angular/core';

@Component({
  selector: 'app-face-verification',
  template: `
    <div>
      <h2>Face Verification</h2>
      <jaak-visage
        #visage
        license="your-license-key-here"
        environment="prod"
        app-id="my-angular-app"
        [debug]="false"
        camera="front"
        detection-mode="auto"
        (traceIdGenerated)="onTraceIdGenerated($event)"
        (videoCaptured)="onVideoCapture($event)"
        (statusUpdated)="onStatusUpdate($event)">
      </jaak-visage>
      <button (click)="startVerification()">Start</button>
      <button (click)="stopVerification()">Stop</button>
    </div>
  `
})
export class FaceVerificationComponent implements AfterViewInit {
  @ViewChild('visage', { static: false }) visage!: ElementRef<any>;

  ngAfterViewInit() {
    // Component is ready
  }

  startVerification() {
    this.visage.nativeElement.start();
  }

  stopVerification() {
    this.visage.nativeElement.stop();
  }

  onTraceIdGenerated(event: CustomEvent) {
    console.log('Trace ID:', event.detail.traceId);
  }

  onVideoCapture(event: CustomEvent) {
    console.log('Video captured:', event.detail.base64);
    // Process the captured video
  }

  onStatusUpdate(event: CustomEvent) {
    console.log('Status:', event.detail.status, event.detail.message);
  }
}

Style Customization

CSS Variables

jaak-visage {
  --jaak-visage-primary-color: #007bff;
  --jaak-visage-error-color: #dc3545;
  --jaak-visage-background: #f8f9fa;
  --jaak-visage-text-color: #333333;
  --jaak-visage-overlay-bg: rgba(255,255,255,0.9);
}

Custom Styling

/* Container styling */
jaak-visage {
  width: 100%;
  max-width: 640px;
  border-radius: 8px;
  box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
}

/* Mobile responsiveness */
@media (max-width: 768px) {
  jaak-visage {
    width: 100vw;
    height: 100vh;
  }
}

Browser Compatibility

| Browser | Version | Notes |
|---------|---------|-------|
| Chrome | 88+ | Full support |
| Firefox | 85+ | Full support |
| Safari | 14+ | Full support |
| Edge | 88+ | Full support |
| Mobile Safari | 14+ | Requires HTTPS |
| Chrome Mobile | 88+ | Full support |

Requirements

  • Modern browser with WebRTC support
  • Camera access permissions
  • HTTPS connection (required for camera access)
  • Sufficient lighting for face detection

Troubleshooting

Common Issues

Camera Access Denied

// Check permissions before starting
navigator.permissions.query({name: 'camera'}).then(result => {
  if (result.state === 'granted') {
    detector.start();
  } else {
    console.log('Camera permission required');
  }
});

Component Not Loading

  • Ensure you're using HTTPS (see the quick check below)
  • Check that the component is properly imported
  • Verify CUSTOM_ELEMENTS_SCHEMA is included (Angular)
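
A quick, framework-agnostic check for these requirements, using only standard browser APIs:

// Quick environment sanity check before loading the component
if (!window.isSecureContext) {
  console.warn('Page is not served over HTTPS; camera access will be blocked.');
}
if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
  console.warn('This browser does not expose the WebRTC camera APIs.');
}
if (!customElements.get('jaak-visage')) {
  console.warn('jaak-visage is not registered; verify the component import/loader ran.');
}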

Face Detection Not Working

  • Ensure adequate lighting
  • Check that only one person is visible
  • Verify camera is not blocked by other applications

Mobile Issues

  • Add the playsinline attribute to video elements
  • Ensure the viewport meta tag is configured (see below)
  • Test on actual devices, not just browser dev tools
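
For reference, a standard viewport meta tag for responsive mobile layouts:

<meta name="viewport" content="width=device-width, initial-scale=1" />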

FAQ

Q: Can I use multiple instances on the same page? A: Yes, but only one can access the camera at a time.

Q: What video formats are supported? A: The component outputs WebM format with VP9 codec by default.

Q: How do I handle the base64 video data? A: The base64 string includes the data URL prefix and can be directly used in video elements or sent to servers.
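
A small sketch of both options, using the videoCaptured event described above (the upload endpoint is hypothetical):

detector.addEventListener('videoCaptured', (event) => {
  const base64Video = event.detail.base64; // includes the data URL prefix

  // Option 1: play it back directly in a <video> element
  const video = document.createElement('video');
  video.src = base64Video;
  video.controls = true;
  document.body.appendChild(video);

  // Option 2: send it to your server (hypothetical endpoint)
  fetch('/api/face-verification', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ video: base64Video }),
  });
});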

Q: Is internet connection required? A: Yes, for loading ML models from CDN. Consider implementing offline model loading for air-gapped environments.

Q: How do I optimize performance on mobile devices? A: Use detection-mode="light" or detection-mode="auto" for automatic optimization. Light mode reduces processing load by 3x while maintaining good accuracy.

Q: What's the difference between detection modes? A: Full mode uses high-precision models for maximum accuracy. Light mode uses optimized processing (lower resolution, frame skipping, lighter models) for better performance on limited devices. Auto mode automatically selects the best option based on device capabilities.


License

MIT

Copyright (c) 2024 Jaak.ai