# rn-breed-detector

Offline TensorFlow Lite cattle breed detection for React Native
A lightweight, production-ready NPM package that performs offline cattle breed classification using TensorFlow Lite. Simply pass an image URI and get instant breed predictions with confidence scores — no internet required.
## Features

- ✅ **Fully Offline** — Model runs entirely on-device using TensorFlow Lite
- ✅ **Production-Ready Pipeline** — Uses expo-image-manipulator and jpeg-js for reliable preprocessing
- ✅ **Model Caching** — Loads the model once and reuses it for subsequent predictions
- ✅ **Pure JavaScript** — Image resizing and JPEG decoding without additional native dependencies
- ✅ **Softmax Normalization** — Returns normalized probability scores
- ✅ **TypeScript Ready** — Includes type definitions
## Installation

```bash
npm install rn-breed-detector react-native-fast-tflite expo-image-manipulator
```

### Dependencies

This package requires:

- `react-native-fast-tflite` (v0.2.0+) — TensorFlow Lite inference engine
- `expo-image-manipulator` (v11.0.0+ or v12.0.0+) — Image resizing and JPEG encoding
- `jpeg-js` (v0.4.4+) — JPEG decoding (installed automatically)

```bash
npm install react-native-fast-tflite expo-image-manipulator
```

## Setup
### 1. Metro Configuration (Required)

TensorFlow Lite models use the `.tflite` extension. You must configure the Metro bundler to recognize this asset type.

Create or edit `metro.config.js` in your project root:
```javascript
const { getDefaultConfig } = require("metro-config");

module.exports = (async () => {
  const config = await getDefaultConfig();
  // Add .tflite as a recognized asset extension
  config.resolver.assetExts.push("tflite");
  return config;
})();
```

**Why is this needed?** Metro doesn't bundle `.tflite` files by default. This configuration tells Metro to treat `.tflite` files as assets that can be required/imported.
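Projects created with recent React Native templates configure Metro through `@react-native/metro-config` instead. If that is your setup, the equivalent change is a sketch along these lines (adapt it to whatever overrides your config already has):

```javascript
const { getDefaultConfig, mergeConfig } = require("@react-native/metro-config");

const defaultConfig = getDefaultConfig(__dirname);

module.exports = mergeConfig(defaultConfig, {
  resolver: {
    // Treat .tflite model files as bundleable assets
    assetExts: [...defaultConfig.resolver.assetExts, "tflite"],
  },
});
```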
### 2. iOS Setup

```bash
cd ios
pod install
cd ..
```

### 3. Expo Compatibility

⚠️ The Expo Managed Workflow is NOT supported because this package requires native modules (`react-native-fast-tflite`).

Supported workflows:

- ✅ React Native CLI
- ✅ Expo Bare Workflow (after running `expo prebuild`)
## Usage

### Basic Example
```javascript
import { detectBreed } from "rn-breed-detector";
import { launchImageLibrary } from "react-native-image-picker";

async function pickAndDetect() {
  // Pick an image
  const result = await launchImageLibrary({
    mediaType: "photo",
    quality: 1,
  });

  if (result.assets && result.assets[0]) {
    const imageUri = result.assets[0].uri;

    // Detect breed
    const prediction = await detectBreed(imageUri);
    console.log("Breed:", prediction.breed);
    console.log("Confidence:", prediction.confidence);
    console.log("All scores:", prediction.scores);
  }
}
```

### Complete Component Example
```javascript
import React, { useState } from "react";
import { View, Button, Text, Image, ActivityIndicator } from "react-native";
import { detectBreed } from "rn-breed-detector";
import { launchImageLibrary } from "react-native-image-picker";

export default function BreedDetector() {
  const [imageUri, setImageUri] = useState(null);
  const [result, setResult] = useState(null);
  const [loading, setLoading] = useState(false);

  const handlePickImage = async () => {
    const pickerResult = await launchImageLibrary({
      mediaType: "photo",
      quality: 1,
    });

    if (pickerResult.assets && pickerResult.assets[0]) {
      const uri = pickerResult.assets[0].uri;
      setImageUri(uri);

      // Run detection
      setLoading(true);
      try {
        const prediction = await detectBreed(uri);
        setResult(prediction);
      } catch (error) {
        console.error("Detection failed:", error);
      } finally {
        setLoading(false);
      }
    }
  };

  return (
    <View style={{ padding: 20 }}>
      <Button title="Pick Image" onPress={handlePickImage} />
      {imageUri && (
        <Image
          source={{ uri: imageUri }}
          style={{ width: 300, height: 300, marginTop: 20 }}
        />
      )}
      {loading && <ActivityIndicator size="large" />}
      {result && (
        <View style={{ marginTop: 20 }}>
          <Text style={{ fontSize: 18, fontWeight: "bold" }}>
            Breed: {result.breed}
          </Text>
          <Text>Confidence: {(result.confidence * 100).toFixed(2)}%</Text>
          <Text style={{ marginTop: 10, fontWeight: "bold" }}>Top Predictions:</Text>
          {result.scores.slice(0, 3).map((item, idx) => (
            <Text key={idx}>
              {item.breed}: {(item.score * 100).toFixed(2)}%
            </Text>
          ))}
        </View>
      )}
    </View>
  );
}
```

## API Reference
### detectBreed(imageUri)

Performs breed detection on an image.

**Parameters:**

- `imageUri` (string, required) — Local file URI (e.g., `file:///path/to/image.jpg`)

**Returns:** `Promise<DetectionResult>`
```typescript
interface DetectionResult {
  breed: string;       // The predicted breed label
  confidence: number;  // Confidence score [0, 1] for the top prediction
  scores: Array<{      // All breeds sorted by score (descending)
    breed: string;
    score: number;     // Probability [0, 1]
  }>;
}
```

**Example Output:**
```json
{
  "breed": "Sahiwal",
  "confidence": 0.89,
  "scores": [
    { "breed": "Sahiwal", "score": 0.89 },
    { "breed": "Gir", "score": 0.06 },
    { "breed": "Tharparkar", "score": 0.03 },
    ...
  ]
}
```

### unloadModel()
Unloads the TFLite model from memory (optional cleanup).

```javascript
import { unloadModel } from "rn-breed-detector";

// When you're done with predictions
unloadModel();
```

## Supported Breeds
The default model recognizes these 10 cattle breeds:
- Sahiwal
- Gir
- Tharparkar
- Red Sindhi
- Rathi
- Ongole
- Kankrej
- Hariana
- Deoni
- Kangayam
You can customize the model and labels (see Customization below).
## How It Works (Internals)

### Model Bundling

The `.tflite` model file is bundled inside the NPM package at `src/model/model.tflite`.
When you call `detectBreed()`, the package:

1. Loads the model using `react-native-fast-tflite` (cached after first load)
2. Preprocesses the image — resizes to 300×300 and converts to RGB `Float32`
3. Runs inference on-device
4. Applies softmax to convert logits to probabilities
5. Returns results with breed labels and confidence scores
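Steps 4 and 5 can be sketched in plain JavaScript. The `softmax` and `toDetectionResult` helpers below are illustrative stand-ins, not the package's actual internals:

```javascript
// Convert raw logits to probabilities (numerically stable softmax).
function softmax(logits) {
  const max = Math.max(...logits);
  const exps = logits.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Pair probabilities with labels and sort descending,
// matching the shape of DetectionResult.
function toDetectionResult(logits, labels) {
  const probs = softmax(logits);
  const scores = labels
    .map((breed, i) => ({ breed, score: probs[i] }))
    .sort((a, b) => b.score - a.score);
  return { breed: scores[0].breed, confidence: scores[0].score, scores };
}

const result = toDetectionResult([2.0, 0.5, 0.1], ["Sahiwal", "Gir", "Tharparkar"]);
console.log(result.breed); // the label with the highest logit
```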
### Why the .tflite Asset Extension Is Required

The Metro bundler doesn't recognize `.tflite` files by default. The `metro.config.js` modification allows:

- Bundling `.tflite` files as static assets
- Using `require()` to reference the model path
- Proper packaging for iOS/Android
### Model Caching

The model is loaded once on the first call to `detectBreed()` and cached in memory. Subsequent calls reuse the loaded interpreter, providing fast inference times.
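This kind of caching follows a standard memoized-loader pattern. A minimal sketch, where `loadTfliteModel` is a placeholder for whatever actually loads the interpreter (not the package's real API):

```javascript
let modelPromise = null;

// Load the model once; concurrent and subsequent callers
// all share the same in-flight or resolved promise.
function getModel(loadTfliteModel) {
  if (!modelPromise) {
    modelPromise = loadTfliteModel();
  }
  return modelPromise;
}

// Optional cleanup, mirroring what unloadModel() would do.
function resetModel() {
  modelPromise = null;
}
```

Caching the promise (rather than the resolved model) also prevents duplicate loads when two predictions start before the first load finishes.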
### Preprocessing Pipeline (Production Implementation)

The preprocessing pipeline transforms camera images into tensors using a reliable, tested approach:

**Step 1: Resize to 300×300**

- Uses `expo-image-manipulator` to resize and encode as JPEG
- Returns a Base64-encoded string

**Step 2: Base64 to Binary**

- Decodes the Base64 string using native `atob()`
- Converts to a `Uint8Array`

**Step 3: JPEG Decoding**

- Uses the `jpeg-js` library to decode the JPEG binary
- Produces RGBA pixel data (`Uint8Array`)

**Step 4: RGB Extraction**

- Iterates through RGBA data with a stride of 4
- Extracts the R, G, B channels (drops alpha)
- Creates a `Float32Array` of size 300×300×3 = 270,000

**Step 5: Tensor Format**

- Output: `Float32Array` with pixel values 0–255 (NOT normalized)
- Shape: `[1, 300, 300, 3]`
- Channel order: RGB (not BGR)
- Memory layout: row-major, flat buffer `[R, G, B, R, G, B, ...]`

**CRITICAL:** Pixel values are kept in the range 0–255, not normalized to [0, 1]. Ensure your model expects this input format.
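Steps 2 and 4 can be sketched in plain JavaScript. This is a simplified stand-in for the package's internal pipeline, assuming a global `atob` (which the pipeline relies on):

```javascript
// Step 2: decode a Base64 string into raw bytes.
function base64ToBytes(base64) {
  const binary = atob(base64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return bytes;
}

// Step 4: drop the alpha channel from RGBA pixel data and widen
// to Float32, keeping values in the 0-255 range (no normalization).
function rgbaToRgbFloat32(rgba, width, height) {
  const out = new Float32Array(width * height * 3);
  let j = 0;
  for (let i = 0; i < rgba.length; i += 4) {
    out[j++] = rgba[i];     // R
    out[j++] = rgba[i + 1]; // G
    out[j++] = rgba[i + 2]; // B
  }
  return out;
}
```

For a 300×300 image the resulting `Float32Array` has exactly 270,000 elements, matching the tensor shape `[1, 300, 300, 3]`.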
## Customization

### Updating the Model

To use your own trained model:

1. Train your model (TensorFlow/Keras, PyTorch → ONNX → TFLite, etc.)
2. Convert it to TensorFlow Lite format
3. Replace `src/model/model.tflite` with your `.tflite` file
4. Update `src/labels.json` with your class labels
5. Rebuild the package
**Model Requirements:**

- Input shape: `[1, 300, 300, 3]`
- Input type: `Float32`
- Input value range: 0–255 (NOT normalized to [0, 1])
- Output shape: `[1, num_classes]`, where `num_classes` matches the length of `labels.json`
- Output type: `Float32` (logits; softmax is applied internally)
### Updating Labels

Edit `src/labels.json`:

```json
[
  "YourBreed1",
  "YourBreed2",
  "YourBreed3"
]
```

**Important:** The order must match your model's output classes.
### Example: Converting a Keras Model to TFLite
```python
import tensorflow as tf

# Load your trained model
model = tf.keras.models.load_model('cattle_classifier.h5')

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Optional: Apply optimizations
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Convert
tflite_model = converter.convert()

# Save
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
```

## Building & Publishing

### Build the Package

```bash
npm run build
```

This transpiles `src/` to `dist/` using Babel and copies asset files.

### Publish to NPM

```bash
npm publish --access public
```

### Versioning the Model
When you update the model or labels:

1. Increment the package version in `package.json` (e.g., `1.0.0` → `1.1.0`)
2. Rebuild: `npm run build`
3. Publish: `npm publish`

Users can then upgrade to get the new model:

```bash
npm install rn-breed-detector@latest
```

## Troubleshooting
### Error: "Unable to resolve module './model.tflite'"

**Cause:** Metro config not set up correctly.

**Solution:**

- Ensure `metro.config.js` includes `.tflite` in `assetExts`
- Restart the Metro bundler: `npx react-native start --reset-cache`
### Error: "Cannot find module 'expo-image-manipulator'"

**Cause:** Missing peer dependency.

**Solution:**

```bash
npm install expo-image-manipulator
```

### Model Not Loading on iOS
**Cause:** Pods not installed or linked.

**Solution:**

```bash
cd ios
pod install
cd ..
npx react-native run-ios
```

### Error: "Image preprocessing failed"

**Cause:** Invalid image URI or unsupported image format.

**Solution:**

- Ensure the URI is a valid local file path (starts with `file://`)
- Use JPEG or PNG images
- Check that the image file exists and is readable
### Slow First Prediction
**Cause:** Model loading takes time on first call.
**Solution:** This is expected. The model is cached after the first load, so subsequent predictions are fast. You can preload the model on app startup:
```javascript
import { useEffect } from "react";
import { detectBreed } from "rn-breed-detector";

// Preload the model on app start; the dummy call fails on
// preprocessing but leaves the model cached for real predictions.
useEffect(() => {
  detectBreed("dummy_uri").catch(() => {
    // Model is now cached
  });
}, []);
```

## Performance
- **Model Size:** ~4–10 MB (depending on architecture and quantization)
- **First Load:** ~200–500 ms (model initialization)
- **Subsequent Predictions:** ~50–200 ms (inference only)
- **Memory:** ~20–50 MB (model in memory)
## License
MIT
## Contributing
Contributions welcome! Please open an issue or PR on GitHub.
## Acknowledgments
- Built with react-native-fast-tflite
- Uses TensorFlow Lite for on-device inference
- Designed for livestock management and agricultural applications
## Roadmap
- [ ] Add support for video inference
- [ ] Optimize preprocessing with native modules
- [ ] Support quantized models (Int8)
- [ ] Add confidence thresholding
- [ ] Multi-language label support
- [ ] Model versioning API
Happy cattle detecting! 🐄
