# UGX Face Liveness
Angular library for blink/smile liveness checks with snapshot/video capture.
## Install

- Angular 16+ with Ivy; peer deps: `@angular/core` and `@angular/common`.
- Add the package:

```bash
npm i ugx-face-liveness
```
## Asset setup (required)

The library loads models from `/assets/face-liveness/models`. Add the assets block shown below to your app (and test) builds in `angular.json`.

Also load the face-api script from https://cdn.jsdelivr.net/npm/[email protected]/dist/face-api.min.js, ideally in your `index.html`:
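A minimal sketch of that script tag, using the CDN URL above verbatim:

```html
<!-- index.html: load face-api globally before the Angular app bootstraps -->
<script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/face-api.min.js"></script>
```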
The models come from https://github.com/justadudewhohacks/face-api.js-models.
```json
{
  "glob": "**/*",
  "input": "projects/face-liveness/src/lib/assets",
  "output": "assets/face-liveness"
}
```
## Basic usage (standalone)
```typescript
import { Component } from "@angular/core";
import { FaceLivenessComponent } from "ugx-face-liveness";

@Component({
  selector: "app-root",
  standalone: true,
  imports: [FaceLivenessComponent],
  template: `<fl-face-liveness (livenessCompleted)="onDone($event)"></fl-face-liveness>`,
})
export class AppComponent {
  onDone(result: { snapshot: Blob; video: Blob | null }) {
    // handle result
  }
}
```

## Basic usage (NgModule)
```typescript
import { NgModule } from "@angular/core";
import { BrowserModule } from "@angular/platform-browser";
import { FaceLivenessModule } from "ugx-face-liveness";
import { AppComponent } from "./app.component";

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule, FaceLivenessModule],
  bootstrap: [AppComponent],
})
export class AppModule {}
```

Then drop `<fl-face-liveness></fl-face-liveness>` into your templates.
## Inputs

- `showDebug: boolean` – overlay debug info; defaults to `false`.
- `faceDetectionOptions?: { detectionInterval?: number; scoreThreshold?: number; minConfidence?: number; maxFaceAngle?: number; singleAction?: 'blink' | 'smile'; singleActionMode?: boolean; }` – tune detection cadence/thresholds and optionally force a single-action flow (see the sketch below). Prefer `singleAction`; `singleActionMode` is the deprecated equivalent of `singleAction: 'smile'`.
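A sketch of binding both inputs from a component; the option values here are illustrative, not defaults (the documented default `detectionInterval` is 120 ms):

```typescript
import { Component } from "@angular/core";
import { FaceLivenessComponent } from "ugx-face-liveness";

@Component({
  selector: "app-kyc",
  standalone: true,
  imports: [FaceLivenessComponent],
  template: `<fl-face-liveness
    [showDebug]="true"
    [faceDetectionOptions]="options"
    (livenessCompleted)="onDone($event)"
  ></fl-face-liveness>`,
})
export class KycComponent {
  // Illustrative tuning values; omit any field to keep the library default.
  options = {
    detectionInterval: 150, // ms between detection passes
    scoreThreshold: 0.5,    // detector score floor
    singleAction: "blink" as const,
  };

  onDone(result: { snapshot: Blob; video: Blob | null }) {
    // handle result
  }
}
```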
## Single-action flow
Skip the full look-straight → blink → smile sequence and require only one action:
```html
<fl-face-liveness
  [faceDetectionOptions]="{ singleAction: 'blink' }"
  (livenessCompleted)="onDone($event)"
></fl-face-liveness>
```

Valid actions: `'blink'` or `'smile'`. Legacy `singleActionMode: true` maps to `singleAction: 'smile'`.
## Outputs

- `(faceDetectionStatusChange)="onFaceStatusChange(isValid: boolean)"` – alignment/validation status.
- `(errorOccurred)="onError(message: string)"` – user-facing error messages.
- `(livenessCompleted)="onDone({ snapshot: Blob, video: Blob | null })"` – final media payload.
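Taken together, the handlers in your component class might look like this (a sketch; property names like `faceAligned` are illustrative):

```typescript
export class LivenessHostComponent {
  faceAligned = false;
  errorMessage = "";

  onFaceStatusChange(isValid: boolean) {
    // e.g. show a "hold still" hint only once the face is aligned
    this.faceAligned = isValid;
  }

  onError(message: string) {
    // surface user-facing errors (camera denied, models missing, ...)
    this.errorMessage = message;
  }

  onDone(result: { snapshot: Blob; video: Blob | null }) {
    // upload or preview the captured media
  }
}
```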
## Snapshot/Video handling

Snapshot is `image/jpeg`; video (when available) is `video/webm` (vp8/vp9). Convert to base64 if needed:

```typescript
const toBase64 = async (blob: Blob) =>
  `data:${blob.type};base64,${btoa(String.fromCharCode(...new Uint8Array(await blob.arrayBuffer())))}`;
```

Note that spreading a multi-megabyte video blob through `String.fromCharCode` can overflow the call stack; for large blobs, prefer `FileReader.readAsDataURL`.

## Common gotchas
- Camera access requires user permission and a secure context (HTTPS or localhost).
- Ensure assets are copied (see Asset setup); missing models will block detection.
- Blink/smile stages depend on lighting and frame rate; defaults use a `detectionInterval` of 120 ms and a blink EAR threshold of 0.25.
- If the video blob is `null`, check `MediaRecorder` support and MIME type (falls back to vp8); see the probe sketch below.
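To diagnose a `null` video blob, you can probe what the browser can actually record using the standard `MediaRecorder.isTypeSupported` API (a sketch; the candidate list is illustrative):

```typescript
// Codec candidates matching the library's webm (vp8/vp9) output.
const candidates = [
  "video/webm;codecs=vp9",
  "video/webm;codecs=vp8",
  "video/webm",
];

const supported =
  typeof MediaRecorder !== "undefined"
    ? candidates.filter((t) => MediaRecorder.isTypeSupported(t))
    : [];

console.log(
  supported.length
    ? `Recordable types: ${supported.join(", ")}`
    : "MediaRecorder unavailable or no webm support; video blob will be null"
);
```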
## Local development

- Build library: `npm run build:lib` (alias for `ng build face-liveness`).
- Serve demo app: `npm start`.
- Path alias is configured: `"ugx-face-liveness": ["projects/face-liveness/src/public-api.ts", "dist/ugx-face-liveness"]` (see the sketch below).
## Publishing

After building, publish from the dist folder:

```bash
cd dist/face-liveness
npm publish
```

## Contributions
If you'd like to contribute, follow the repository link and raise a PR.
