
rn-breed-detector (v3.0.2)

Offline TensorFlow Lite cattle breed detection for React Native

A lightweight, production-ready NPM package that performs offline cattle breed classification using TensorFlow Lite. Simply pass an image URI and get instant breed predictions with confidence scores — no internet required.

Features

  • Fully Offline — Model runs entirely on-device using TensorFlow Lite
  • Production-Ready Pipeline — Uses expo-image-manipulator and jpeg-js for reliable preprocessing
  • Model Caching — Loads the model once and reuses it for subsequent predictions
  • Pure JavaScript — Image resizing and JPEG decoding without additional native dependencies
  • Softmax Normalization — Returns calibrated probability scores
  • TypeScript Ready — Includes type definitions


Installation

npm install rn-breed-detector react-native-fast-tflite expo-image-manipulator

Dependencies

This package requires:

  • react-native-fast-tflite (v0.2.0+) — TensorFlow Lite inference engine
  • expo-image-manipulator (v11.0.0+ or v12.0.0+) — Image resizing and JPEG encoding
  • jpeg-js (v0.4.4+) — JPEG decoding (installed automatically as a direct dependency)

If you haven't already, install the peer dependencies:

npm install react-native-fast-tflite expo-image-manipulator

Setup

1. Metro Configuration (Required)

TensorFlow Lite models use the .tflite extension. You must configure Metro bundler to recognize this asset type.

Create or edit metro.config.js in your project root:

const { getDefaultConfig } = require("metro-config");

module.exports = (async () => {
  const config = await getDefaultConfig();

  // Add .tflite as a recognized asset extension
  config.resolver.assetExts.push("tflite");

  return config;
})();

Why is this needed?
Metro doesn't bundle .tflite files by default. This configuration tells Metro to treat .tflite files as assets that can be required/imported.

2. iOS Setup

cd ios
pod install
cd ..

3. Expo Compatibility

⚠️ Expo Managed Workflow is NOT supported because this package requires native modules (react-native-fast-tflite).

Supported Workflows:

  • ✅ React Native CLI
  • ✅ Expo Bare Workflow (after running expo prebuild)

Usage

Basic Example

import { detectBreed } from "rn-breed-detector";
import { launchImageLibrary } from "react-native-image-picker";

async function pickAndDetect() {
  // Pick an image
  const result = await launchImageLibrary({
    mediaType: "photo",
    quality: 1,
  });

  if (result.assets && result.assets[0]) {
    const imageUri = result.assets[0].uri;

    // Detect breed
    const prediction = await detectBreed(imageUri);

    console.log("Breed:", prediction.breed);
    console.log("Confidence:", prediction.confidence);
    console.log("All scores:", prediction.scores);
  }
}

Complete Component Example

import React, { useState } from "react";
import { View, Button, Text, Image, ActivityIndicator } from "react-native";
import { detectBreed } from "rn-breed-detector";
import { launchImageLibrary } from "react-native-image-picker";

export default function BreedDetector() {
  const [imageUri, setImageUri] = useState(null);
  const [result, setResult] = useState(null);
  const [loading, setLoading] = useState(false);

  const handlePickImage = async () => {
    const pickerResult = await launchImageLibrary({
      mediaType: "photo",
      quality: 1,
    });

    if (pickerResult.assets && pickerResult.assets[0]) {
      const uri = pickerResult.assets[0].uri;
      setImageUri(uri);

      // Run detection
      setLoading(true);
      try {
        const prediction = await detectBreed(uri);
        setResult(prediction);
      } catch (error) {
        console.error("Detection failed:", error);
      } finally {
        setLoading(false);
      }
    }
  };

  return (
    <View style={{ padding: 20 }}>
      <Button title="Pick Image" onPress={handlePickImage} />

      {imageUri && (
        <Image
          source={{ uri: imageUri }}
          style={{ width: 300, height: 300, marginTop: 20 }}
        />
      )}

      {loading && <ActivityIndicator size="large" />}

      {result && (
        <View style={{ marginTop: 20 }}>
          <Text style={{ fontSize: 18, fontWeight: "bold" }}>
            Breed: {result.breed}
          </Text>
          <Text>Confidence: {(result.confidence * 100).toFixed(2)}%</Text>
          
          <Text style={{ marginTop: 10, fontWeight: "bold" }}>All Predictions:</Text>
          {result.scores.slice(0, 3).map((item, idx) => (
            <Text key={idx}>
              {item.breed}: {(item.score * 100).toFixed(2)}%
            </Text>
          ))}
        </View>
      )}
    </View>
  );
}

API Reference

detectBreed(imageUri)

Performs breed detection on an image.

Parameters:

  • imageUri (string, required) — Local file URI (e.g., file:///path/to/image.jpg)

Returns: Promise<DetectionResult>

interface DetectionResult {
  breed: string;           // The predicted breed label
  confidence: number;      // Confidence score [0, 1] for the top prediction
  scores: Array<{          // All breeds sorted by score (descending)
    breed: string;
    score: number;         // Probability [0, 1]
  }>;
}

Example Output:

{
  "breed": "Sahiwal",
  "confidence": 0.89,
  "scores": [
    { "breed": "Sahiwal", "score": 0.89 },
    { "breed": "Gir", "score": 0.06 },
    { "breed": "Tharparkar", "score": 0.03 },
    ...
  ]
}
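As an illustration of working with DetectionResult, here is a hypothetical helper (not part of the package API) that keeps only predictions above a confidence threshold:

```javascript
// Hypothetical helper: filter a DetectionResult's scores by threshold.
// `scores` is already sorted descending, so filtering preserves order.
function topPredictions(result, threshold = 0.05) {
  return result.scores.filter(({ score }) => score >= threshold);
}

const example = {
  breed: "Sahiwal",
  confidence: 0.89,
  scores: [
    { breed: "Sahiwal", score: 0.89 },
    { breed: "Gir", score: 0.06 },
    { breed: "Tharparkar", score: 0.03 },
  ],
};

console.log(topPredictions(example, 0.05));
// Sahiwal (0.89) and Gir (0.06) pass the 0.05 threshold
```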

unloadModel()

Unloads the TFLite model from memory (optional cleanup).

import { unloadModel } from "rn-breed-detector";

// When you're done with predictions
unloadModel();

Supported Breeds

The default model recognizes these 10 cattle breeds:

  1. Sahiwal
  2. Gir
  3. Tharparkar
  4. Red Sindhi
  5. Rathi
  6. Ongole
  7. Kankrej
  8. Hariana
  9. Deoni
  10. Kangayam

You can customize the model and labels (see Customization below).


How It Works (Internals)

Model Bundling

The .tflite model file is bundled inside the NPM package at src/model/model.tflite.

When you call detectBreed(), the package:

  1. Loads the model using react-native-fast-tflite (cached after first load)
  2. Preprocesses the image — resizes to 300×300, converts to RGB Float32
  3. Runs inference on-device
  4. Applies softmax to convert logits to probabilities
  5. Returns results with breed labels and confidence scores
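Step 4 (softmax) can be sketched in plain JavaScript. This is an illustrative implementation, not the package's internal code:

```javascript
// Illustrative softmax: converts raw logits into probabilities that sum to 1.
// Subtracting the max logit first keeps Math.exp numerically stable.
function softmax(logits) {
  const max = Math.max(...logits);
  const exps = logits.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

const probs = softmax([2.0, 1.0, 0.1]);
console.log(probs); // three values in (0, 1) that sum to ~1
```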

Why .tflite Asset Extension is Required

Metro bundler doesn't recognize .tflite files by default. The metro.config.js modification allows:

  • Bundling .tflite files as static assets
  • Using require() to reference the model path
  • Proper packaging for iOS/Android

Model Caching

The model is loaded once on the first call to detectBreed() and cached in memory. Subsequent calls reuse the loaded interpreter, providing fast inference times.
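The load-once behavior described here amounts to a lazy singleton, which can be sketched as below. `loadTfliteModel` is a hypothetical stand-in for the real react-native-fast-tflite loader:

```javascript
// Sketch of the lazy-load-and-cache pattern described above.
let cachedModel = null;

async function getModel(loadTfliteModel) {
  if (cachedModel === null) {
    // Expensive: happens only on the first call
    cachedModel = await loadTfliteModel();
  }
  return cachedModel; // later calls reuse the same interpreter
}

function unloadModel() {
  cachedModel = null; // lets the model be garbage-collected
}
```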

Preprocessing Pipeline (Production Implementation)

The preprocessing pipeline transforms camera images into tensors using a reliable, tested approach:

Step 1: Resize to 300×300

  • Uses expo-image-manipulator to resize and encode as JPEG
  • Returns Base64-encoded string

Step 2: Base64 to Binary

  • Decodes Base64 string using native atob()
  • Converts to Uint8Array

Step 3: JPEG Decoding

  • Uses jpeg-js library to decode JPEG binary
  • Produces RGBA pixel data (Uint8Array)

Step 4: RGB Extraction

  • Iterates through RGBA data with stride of 4
  • Extracts R, G, B channels (drops Alpha)
  • Creates Float32Array of size 300×300×3 = 270,000

Step 5: Tensor Format

  • Output: Float32Array with pixel values 0-255 (NOT normalized)
  • Shape: [1, 300, 300, 3]
  • Channel order: RGB (not BGR)
  • Memory layout: Row-major, flat buffer [R, G, B, R, G, B, ...]

CRITICAL: Pixel values are kept in the range 0-255, not normalized to [0, 1]. Ensure your model expects this input format.
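Steps 4 and 5 can be sketched in plain JavaScript. This is an illustrative version, assuming `rgba` holds the RGBA bytes that jpeg-js produces:

```javascript
// Sketch of steps 4-5: extract RGB from RGBA pixel data into a flat
// Float32Array, keeping values in 0-255 (no normalization).
function rgbaToFloat32Rgb(rgba, width, height) {
  const out = new Float32Array(width * height * 3);
  let j = 0;
  for (let i = 0; i < width * height * 4; i += 4) {
    out[j++] = rgba[i];     // R
    out[j++] = rgba[i + 1]; // G
    out[j++] = rgba[i + 2]; // B
    // rgba[i + 3] (alpha) is dropped
  }
  return out;
}

// A 2x1 image: one red pixel, one green pixel.
const rgba = new Uint8Array([255, 0, 0, 255, 0, 255, 0, 255]);
console.log(rgbaToFloat32Rgb(rgba, 2, 1)); // values: 255, 0, 0, 0, 255, 0
```

For the package's 300×300 input this produces the 270,000-element buffer described in step 4.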


Customization

Updating the Model

To use your own trained model:

  1. Train your model (TensorFlow/Keras, PyTorch → ONNX → TFLite, etc.)
  2. Convert to TensorFlow Lite format
  3. Replace src/model/model.tflite with your .tflite file
  4. Update src/labels.json with your class labels
  5. Rebuild the package

Model Requirements:

  • Input shape: [1, 300, 300, 3]
  • Input type: Float32
  • Input value range: 0-255 (NOT normalized to [0, 1])
  • Output shape: [1, num_classes] where num_classes matches labels.json length
  • Output type: Float32 (logits, softmax applied internally)

Updating Labels

Edit src/labels.json:

[
  "YourBreed1",
  "YourBreed2",
  "YourBreed3"
]

Important: The order must match your model's output classes.
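The length half of that contract can be checked mechanically. A hypothetical sanity check (not part of the package API):

```javascript
// Hypothetical sanity check: the number of labels must equal the
// model's output class count, or predictions will be mislabeled.
function validateLabels(labels, numClasses) {
  if (labels.length !== numClasses) {
    throw new Error(
      `labels.json has ${labels.length} entries but the model outputs ${numClasses} classes`
    );
  }
  return true;
}

console.log(validateLabels(["Sahiwal", "Gir", "Tharparkar"], 3)); // true
```

The ordering itself can't be checked automatically; it has to match how the model was trained.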

Example: Converting a Keras Model to TFLite

import tensorflow as tf

# Load your trained model
model = tf.keras.models.load_model('cattle_classifier.h5')

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Optional: Apply optimizations
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Convert
tflite_model = converter.convert()

# Save
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)

Building & Publishing

Build the Package

npm run build

This transpiles src/ to dist/ using Babel and copies asset files.

Publish to NPM

npm publish --access public

Versioning the Model

When you update the model or labels:

  1. Increment the package version in package.json (e.g., 1.0.0 → 1.1.0)
  2. Rebuild: npm run build
  3. Publish: npm publish

Users can then upgrade to get the new model:

npm install rn-breed-detector@latest

Troubleshooting

Error: "Unable to resolve module './model.tflite'"

Cause: Metro config not set up correctly.

Solution:

  1. Ensure metro.config.js includes .tflite in assetExts
  2. Restart Metro bundler: npx react-native start --reset-cache

Error: "Cannot find module 'expo-image-manipulator'"

Cause: Missing peer dependency.

Solution:

npm install expo-image-manipulator

Model Not Loading on iOS

Cause: Pods not installed or linked.

Solution:

cd ios
pod install
cd ..
npx react-native run-ios

Error: "Image preprocessing failed"

Cause: Invalid image URI or unsupported image format.

Solution:

  • Ensure the URI is a valid local file path (starts with file://)
  • Use JPEG or PNG images
  • Check that the image file exists and is readable
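The first two checks above can be bundled into a small guard. This is a hypothetical, deliberately conservative helper (some valid picker URIs may lack a file extension):

```javascript
// Hypothetical guard to run before detectBreed: accept only local
// file URIs that end in a JPEG/PNG extension.
function isLikelyValidImageUri(uri) {
  return (
    typeof uri === "string" &&
    uri.startsWith("file://") &&
    /\.(jpe?g|png)$/i.test(uri)
  );
}

console.log(isLikelyValidImageUri("file:///tmp/cow.jpg"));         // true
console.log(isLikelyValidImageUri("https://example.com/cow.jpg")); // false
```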

Slow First Prediction

Cause: Model loading takes time on the first call.

Solution: This is expected. The model is cached after the first load, so subsequent predictions are fast. You can preload the model on app startup:

import { useEffect } from "react";
import { detectBreed } from "rn-breed-detector";

// Preload the model on app start (inside a component)
useEffect(() => {
  detectBreed("dummy_uri").catch(() => {
    // Model is now cached
  });
}, []);
Performance

  • Model Size: ~4-10 MB (depending on architecture and quantization)
  • First Load: ~200-500ms (model initialization)
  • Subsequent Predictions: ~50-200ms (inference only)
  • Memory: ~20-50 MB (model in memory)

License

MIT


Contributing

Contributions welcome! Please open an issue or PR on GitHub.


Acknowledgments

  • Built with react-native-fast-tflite
  • Uses TensorFlow Lite for on-device inference
  • Designed for livestock management and agricultural applications

Roadmap

  • [ ] Add support for video inference
  • [ ] Optimize preprocessing with native modules
  • [ ] Support quantized models (Int8)
  • [ ] Add confidence thresholding
  • [ ] Multi-language label support
  • [ ] Model versioning API

Happy cattle detecting! 🐄