react-native-image-insights
v0.4.1
📸 React Native package for on-device image analysis using ML. Capture an image and get object classification with physical property estimates (dimensions, weight, etc.).
Features
- 🎯 99%+ accuracy - Google Cloud Vision API integration (recommended)
- 🧠 On-device ML - EfficientNet-B0 model bundled for offline use (77% accuracy)
- 📷 Camera integration - Built-in support for react-native-vision-camera
- 📦 Physical estimates - Get dimensions, weight, and volume estimates for detected objects
- ⚡ Zero config - Model loads automatically, no external storage needed
Installation
```sh
# Install the package
npx expo install react-native-image-insights

# Install peer dependencies
npx expo install \
  react-native-vision-camera \
  onnxruntime-react-native \
  @shopify/react-native-skia \
  react-native-fs
```
Quick Setup
1. Configure Metro
Create or update `metro.config.js`:
```js
const { getDefaultConfig } = require("expo/metro-config");

const config = getDefaultConfig(__dirname);
config.resolver.assetExts.push("onnx");

module.exports = config;
```
2. Add Plugins (app.json)
```json
{
  "expo": {
    "plugins": [
      [
        "react-native-vision-camera",
        {
          "cameraPermissionText": "Allow camera access to analyze images"
        }
      ],
      "react-native-image-insights"
    ]
  }
}
```
3. Build
```sh
npx expo prebuild
npx expo run:ios   # or run:android
```
Usage
Option 1: Google Cloud Vision (Recommended - 99%+ Accuracy)
```ts
import { configureCloudVision, getImageProperties } from "react-native-image-insights";

// Configure once at app startup
configureCloudVision({
  apiKey: "YOUR_GOOGLE_CLOUD_API_KEY", // Get from Google Cloud Console
  maxResults: 10,
});

// Analyze an image - Cloud Vision is used automatically
const result = await getImageProperties({ uri: "file:///path/to/photo.jpg" });

console.log(result.name);       // "laptop", "iPhone", "coffee mug" - practical labels
console.log(result.confidence); // 0.98
console.log(result.weightKg);   // 1.5
console.log(result.lengthCm);   // 32
```
Get your API key: see CLOUD_VISION_SETUP.md for detailed setup instructions.
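Once a key is configured, the package prefers Cloud Vision and falls back to the bundled model only when you ask it to (see Option 2). If you want that decision to depend on runtime conditions such as connectivity, you can compute the `useCloudVision` flag yourself. A minimal sketch, assuming a hypothetical `shouldUseCloudVision` helper of your own (it is not part of this package's API):

```typescript
// Hypothetical helper (not part of react-native-image-insights): decide
// whether a given call should request Cloud Vision. Mirrors the documented
// behavior - cloud when a key is configured and the device is online,
// local model otherwise, or when the caller forces local.
export function shouldUseCloudVision(opts: {
  apiKeyConfigured: boolean; // configureCloudVision() was called with a key
  isOnline: boolean;         // e.g. from @react-native-community/netinfo
  forceLocal?: boolean;      // caller wants the offline model regardless
}): boolean {
  if (opts.forceLocal) return false;
  return opts.apiKeyConfigured && opts.isOnline;
}

// Usage sketch:
// const result = await getImageProperties({
//   uri,
//   useCloudVision: shouldUseCloudVision({ apiKeyConfigured: true, isOnline }),
// });
```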
Option 2: Local Model (Offline - 77% Accuracy)
```ts
import { getImageProperties } from "react-native-image-insights";

// Use the local model (works offline, no API key needed)
const result = await getImageProperties({
  uri: "file:///path/to/photo.jpg",
  useCloudVision: false, // Force the local model
});

console.log(result.name);       // "laptop"
console.log(result.confidence); // 0.87
```
API
getImageProperties(params)
Analyzes an image and returns physical property estimates.
```ts
interface GetImagePropertiesParams {
  uri: string;              // Image file URI
  useCloudVision?: boolean; // Set to false to force the local model
  fileSizeBytes?: number;   // Optional file size
  capturedAt?: string;      // Optional capture timestamp
  modelUrl?: string;        // Optional custom model URL
}

interface ImagePropertyEstimate {
  name: string;        // Detected object name
  confidence: number;  // Prediction confidence (0-1)
  weightKg: number;    // Estimated weight in kg
  sizeCm3: number;     // Estimated volume in cm³
  sizeLiters: number;  // Estimated volume in liters
  lengthCm: number;    // Estimated length in cm
  heightCm: number;    // Estimated height in cm
  widthCm: number;     // Estimated width in cm
  fragile: boolean;    // Whether the item is likely fragile
  labels: Array<{      // Top 5 predictions
    label: string;
    confidence: number;
  }>;
  metadata: {
    widthPx: number;
    heightPx: number;
    fileSizeBytes: number;
    uri: string;
    capturedAt: string;
  };
}
```
setModelAsset(url)
Use a custom ONNX model instead of the bundled one.
```ts
import { setModelAsset } from "react-native-image-insights";

setModelAsset("https://your-cdn.com/custom-model.onnx");
```
useBundledModel()
Switch back to the bundled model.
```ts
import { useBundledModel } from "react-native-image-insights";

useBundledModel();
```
Model Details
| Property       | Value           |
| -------------- | --------------- |
| Architecture   | EfficientNet-B0 |
| Size           | 20 MB           |
| Top-1 Accuracy | 77.1%           |
| Top-5 Accuracy | 93.3%           |
| Classes        | 1000 (ImageNet) |
| Input Size     | 224×224         |
Detectable Objects
The model can identify 1,000 ImageNet categories, including:
- Electronics: phones, laptops, cameras, keyboards, monitors
- Furniture: chairs, tables, sofas, beds, desks
- Kitchen: cups, bottles, pans, appliances, utensils
- Vehicles: cars, bikes, trucks, motorcycles
- Food: fruits, vegetables, meals, drinks
- Animals: dogs, cats, birds, fish
- Clothing: shoes, bags, hats, shirts
- Sports: balls, rackets, bikes, equipment
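Because the local model covers so many categories, the top prediction is not always the one you want to act on. The `labels` array on each result exposes the top-5 predictions with their confidences, so you can apply your own cutoff. A minimal sketch, using a hypothetical `confidentLabels` helper (not part of the package) over the documented `labels` shape:

```typescript
// Shape of entries in ImagePropertyEstimate.labels (as documented above).
interface Label {
  label: string;
  confidence: number; // 0-1
}

// Hypothetical helper: keep only labels at or above a confidence threshold,
// sorted from most to least confident.
export function confidentLabels(labels: Label[], minConfidence = 0.5): Label[] {
  return labels
    .filter((l) => l.confidence >= minConfidence)
    .sort((a, b) => b.confidence - a.confidence);
}
```

For the offline model (77% top-1 but 93% top-5 accuracy), showing the user the two or three surviving labels and letting them confirm is often more robust than trusting the single top prediction.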
Expo Setup
For detailed Expo setup instructions, see EXPO_SETUP.md.
Requirements
- React Native 0.72+
- Expo SDK 50+ (with development build)
- iOS 13+ / Android API 24+
Peer Dependencies
```json
{
  "react-native-vision-camera": ">=3.0.0",
  "onnxruntime-react-native": ">=1.17.0",
  "@shopify/react-native-skia": ">=0.1.214",
  "react-native-fs": ">=2.20.0"
}
```
Troubleshooting
Model not loading
- Ensure `metro.config.js` has `assetExts.push('onnx')`
- Restart Metro: `npx expo start --clear`
- Rebuild: `npx expo run:ios`
Camera not working
- Check camera permissions in device settings
- Add the vision-camera plugin to `app.json`
- Run `npx expo prebuild`
Low accuracy
- Use good lighting
- Center object in frame
- Avoid blurry images
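Part of this can be checked in code before calling `getImageProperties`. A minimal sketch, assuming a hypothetical `isLargeEnough` guard (not part of the package) applied to the captured photo's pixel dimensions; the 224×224 floor mirrors the model's input size, below which images are upscaled and tend to classify poorly:

```typescript
// Model input size per the table above; captures smaller than this on either
// edge are upscaled before inference and usually produce weak predictions.
const MIN_EDGE_PX = 224;

// Hypothetical pre-flight check: accept a capture only if both edges meet
// the minimum. `widthPx`/`heightPx` match the documented metadata fields.
export function isLargeEnough(photo: { widthPx: number; heightPx: number }): boolean {
  return photo.widthPx >= MIN_EDGE_PX && photo.heightPx >= MIN_EDGE_PX;
}
```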
License
MIT
Contributing
Contributions welcome! Please read our contributing guidelines first.
