react-native-camera-vision-pixel-colors
High-performance Vision Camera Frame Processor for React Native (Expo compatible) that analyzes pixel colors in real time. This plugin extracts:
- Up to 10 most frequent colors (RGB, with optional HSV; count configurable)
- Up to 10 brightest colors (RGB, with optional HSV; count configurable)
- Total number of unique colors
- ROI analysis (configurable region)
- Motion detection (frame diff)
- HSV color space conversion (optional)
- Pixel threshold filtering (filter out colors below % threshold)
It is implemented using Nitro Modules and runs synchronously on the native thread for use as a Vision Camera frame processor, while also exposing an async Nitro API for offline image analysis.
Features
- Real-time processing (frame processor, synchronous)
- Color frequency analysis
- Brightness-based color ranking
- ROI analysis (configurable region)
- Motion detection (frame diff)
- HSV color space conversion (h: 0-360, s: 0-100, v: 0-100)
- Pixel threshold filtering (ignore colors below % of total pixels)
- Works directly on camera frames (Vision Camera)
- Written in Swift (iOS) and Kotlin (Android)
- Expo compatible via Config Plugin
- Minimal JS bridge overhead during processing
Requirements
- React Native >= 0.81
- Expo >= 54
- react-native-vision-camera >= 4.x
- react-native-nitro-modules (for Nitro API)
- iOS >= 15.1, Android >= 26
This plugin is built to be used as a Vision Camera frame processor. The frame-processor path does NOT process images from the gallery or from URLs; use the async Nitro API for that.
Install (example)
npx expo install react-native-vision-camera
npm install react-native-camera-vision-pixel-colors
Add both plugins to app.json (or app.config.js):
{
"expo": {
"plugins": [
"react-native-vision-camera",
"react-native-camera-vision-pixel-colors"
]
}
}
Then:
npx expo prebuild
eas build -p all
Usage
Basic Frame Processor
import { useFrameProcessor } from 'react-native-vision-camera';
import { analyzePixelColors, type PixelColorsResult } from 'react-native-camera-vision-pixel-colors';
const frameProcessor = useFrameProcessor((frame) => {
'worklet';
const result: PixelColorsResult = analyzePixelColors(frame);
// result => { uniqueColorCount, topColors: [{r,g,b}], brightestColors: [{r,g,b}] }
console.log(result);
}, []);
Attach the processor to <Camera /> as the frameProcessor prop, as shown below.
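A minimal sketch of wiring it up, assuming a standard Vision Camera setup (camera permission is handled elsewhere):
import React from 'react';
import { Camera, useCameraDevice, useFrameProcessor } from 'react-native-vision-camera';
import { analyzePixelColors } from 'react-native-camera-vision-pixel-colors';
export function ColorAnalysisCamera() {
  const device = useCameraDevice('back');
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    const result = analyzePixelColors(frame);
    console.log(result.uniqueColorCount);
  }, []);
  if (device == null) return null; // no back camera available
  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive={true}
      frameProcessor={frameProcessor}
    />
  );
}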
Advanced Options
Pass optional analysis options to enable additional features:
import { useFrameProcessor } from 'react-native-vision-camera';
import { analyzePixelColors, type AnalysisOptions } from 'react-native-camera-vision-pixel-colors';
const frameProcessor = useFrameProcessor((frame) => {
'worklet';
const options: AnalysisOptions = {
// Analyze only center 20% of frame
roi: { x: 0.4, y: 0.4, width: 0.2, height: 0.2 },
// Enable motion detection
enableMotionDetection: true,
motionThreshold: 0.1, // 0-1, default: 0.1
// Configure color counts (1-10, default: 3)
maxTopColors: 5,
maxBrightestColors: 5,
// Enable HSV color space conversion
enableHsvAnalysis: true,
// Filter out colors below 0.2% of total pixels
minPixelThreshold: 0.002,
};
const result = analyzePixelColors(frame, options);
if (result.motion?.hasMotion) {
console.log('Motion detected!', result.motion.score);
}
// Access HSV values (when enableHsvAnalysis=true)
const topColor = result.topColors[0];
if (topColor?.hsv) {
console.log(`Hue: ${topColor.hsv.h}, Sat: ${topColor.hsv.s}, Val: ${topColor.hsv.v}`);
}
// Access pixel percentage (when minPixelThreshold is set)
if (topColor?.pixelPercentage) {
console.log(`This color represents ${topColor.pixelPercentage * 100}% of pixels`);
}
}, []);
Async Nitro API (outside camera)
import { CameraVisionPixelColors, type ImageData } from 'react-native-camera-vision-pixel-colors';
const imageData: ImageData = { width, height, data: arrayBuffer }; // data: ArrayBuffer (RGBA)
const result = await CameraVisionPixelColors.analyzeImageAsync(imageData);
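A minimal end-to-end sketch, assuming you already have raw RGBA bytes (decoding a photo or gallery image into RGBA is outside the scope of this plugin): it builds a tiny solid-red image and analyzes it.
import { CameraVisionPixelColors, type ImageData } from 'react-native-camera-vision-pixel-colors';
async function analyzeSolidRed(): Promise<void> {
  const width = 2;
  const height = 2;
  const bytes = new Uint8Array(width * height * 4); // 4 bytes per pixel (RGBA)
  for (let i = 0; i < bytes.length; i += 4) {
    bytes[i] = 255;     // R
    bytes[i + 1] = 0;   // G
    bytes[i + 2] = 0;   // B
    bytes[i + 3] = 255; // A (opaque)
  }
  const imageData: ImageData = { width, height, data: bytes.buffer };
  const result = await CameraVisionPixelColors.analyzeImageAsync(imageData);
  console.log(result.topColors[0]); // expected: { r: 255, g: 0, b: 0 }
}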
Output format
All types are exported from the library:
import {
type RGBColor,
type HSVColor,
type ColorInfo,
type PixelColorsResult,
type ImageData,
type AnalysisOptions,
type ROIConfig,
type MotionResult,
} from 'react-native-camera-vision-pixel-colors';
type RGBColor = { r: number; g: number; b: number };
type HSVColor = {
h: number; // 0-360 (hue)
s: number; // 0-100 (saturation)
v: number; // 0-100 (value/brightness)
};
type ColorInfo = {
r: number;
g: number;
b: number;
hsv?: HSVColor; // present when enableHsvAnalysis=true
pixelPercentage?: number; // 0-1, present when minPixelThreshold is set
};
type ROIConfig = {
x: number; // 0-1 normalized
y: number; // 0-1 normalized
width: number; // 0-1 normalized
height: number; // 0-1 normalized
};
type AnalysisOptions = {
enableMotionDetection?: boolean; // default: false
motionThreshold?: number; // default: 0.1
roi?: ROIConfig; // if provided, analyze only this region
maxTopColors?: number; // default: 3, range: 1-10
maxBrightestColors?: number; // default: 3, range: 1-10
enableHsvAnalysis?: boolean; // default: false
minPixelThreshold?: number; // 0-1, e.g., 0.002 = 0.2%
};
type MotionResult = {
score: number; // 0-1
hasMotion: boolean; // score > threshold
};
type PixelColorsResult = {
uniqueColorCount: number;
topColors: ColorInfo[]; // extends RGBColor with optional hsv/pixelPercentage
brightestColors: ColorInfo[]; // extends RGBColor with optional hsv/pixelPercentage
motion?: MotionResult; // always present if enableMotionDetection=true
roiApplied?: boolean; // true if ROI config was provided
totalPixelsAnalyzed?: number; // present when HSV or threshold enabled
};
type ImageData = {
width: number;
height: number;
data: ArrayBuffer; // RGBA pixel data
};
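As a small illustration of consuming these types, here is a hypothetical helper (not part of the library) that formats the returned colors as hex strings:
import type { ColorInfo, PixelColorsResult } from 'react-native-camera-vision-pixel-colors';
// Convert a single ColorInfo to a "#rrggbb" hex string.
function toHex(color: ColorInfo): string {
  const channel = (value: number) => value.toString(16).padStart(2, '0');
  return `#${channel(color.r)}${channel(color.g)}${channel(color.b)}`;
}
// Summarize the most frequent colors, e.g. for logging or UI swatches.
function summarizeTopColors(result: PixelColorsResult): string[] {
  return result.topColors.map(toHex);
}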
Architecture summary
- Frame Processor path: synchronous, returns the latest cached result (0–1 frame latency).
- Async Nitro API: full GPU/CPU pipeline, returns an up-to-date result (Promise-based).
- Shared native engine (iOS/Android) exposes analyzeAsync(...) and analyzeSync(); the frame-processor path uses the latter to read cached results.
Memory & Performance Notes
- Motion detection: Uses grayscale comparison with configurable threshold
- ROI: Crops before analysis for improved performance on smaller regions
- First frame motion: Returns {score: 0, hasMotion: false} (not null)
- HSV analysis: Minimal overhead, computed only for returned colors (not all pixels)
- Pixel threshold: Filters noise by ignoring colors below specified % of total pixels
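Frame processors run for every camera frame by default. If you do not need full-rate analysis, Vision Camera's runAtTargetFps helper can throttle the call; combined with an ROI it keeps per-frame work small. A sketch (the target FPS and ROI values here are arbitrary examples):
import { useFrameProcessor, runAtTargetFps } from 'react-native-vision-camera';
import { analyzePixelColors } from 'react-native-camera-vision-pixel-colors';
const frameProcessor = useFrameProcessor((frame) => {
  'worklet';
  // Run the analysis roughly 5 times per second instead of on every frame.
  runAtTargetFps(5, () => {
    'worklet';
    const result = analyzePixelColors(frame, {
      // Center crop: analyzing a smaller region reduces the per-call cost.
      roi: { x: 0.25, y: 0.25, width: 0.5, height: 0.5 },
    });
    console.log(result.topColors);
  });
}, []);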
Contributing
PRs welcome. Please keep performance constraints in mind (avoid allocations per frame, reuse buffers).
License
MIT © 2026
