
@subhajit-gorai/react-native-mediapipe-llm

v1.0.3

React Native binding for Google AI Edge Gallery's MediaPipe on-device LLM inference engine

Downloads: 19

React Native MediaPipe LLM Demo

This is a demonstration app showcasing the integration of Google's Gemma 3N model with React Native using the MediaPipe LLM framework.

Features

  • 📱 Cross-platform chat interface (iOS/Android)
  • 🤖 Gemma 3N integration with streaming responses
  • 📂 File picker for model selection
  • 💬 Real-time chat with AI assistant
  • 🎨 Modern, responsive UI design

Getting Started

Prerequisites

  1. Download the Gemma 3N Model

  2. Development Environment

    • Node.js 16+
    • React Native development environment
    • iOS Simulator or Android Emulator
    • Expo CLI (if using Expo Go)

Installation

  1. Install dependencies

    npm install
  2. iOS Setup (if running on iOS)

    cd ios && pod install && cd ..

Running the Demo

Using Expo Go

npm start

Then scan the QR code with the Expo Go app.

iOS Simulator

npm run ios

Android Emulator

npm run android

How to Use

  1. Launch the App

    • The app will start with the Welcome screen
  2. Select Model File

    • Tap "Select Model File"
    • Navigate to where you saved the gemma-3n-E2B-it-int4.task file
    • Select the file
  3. Initialize Model

    • Tap "Initialize Model"
    • Wait for initialization to complete (may take a few moments)
  4. Start Chatting

    • Tap "Start Chatting →"
    • Begin conversing with Gemma 3N!

App Structure

src/
├── hooks/
│   └── useLlmInference.ts    # LLM integration hook
├── screens/
│   ├── WelcomeScreen.tsx     # Model setup and initialization
│   └── ChatScreen.tsx        # Chat interface
├── types/
│   └── index.ts              # TypeScript definitions
└── components/               # Reusable UI components

Key Components

WelcomeScreen

  • Model file selection via document picker
  • Model initialization with configuration
  • Status tracking and user guidance

ChatScreen

  • Real-time chat interface
  • Streaming response display
  • Message history management
  • Keyboard handling
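The streaming display and message history can be modeled with a small pure helper; the sketch below is illustrative only (the `ChatMessage` type and function names are assumptions, not the demo's actual API in ChatScreen.tsx):

```typescript
// Minimal message-history model for a streaming chat UI.
// ChatMessage and the helper names are illustrative; the demo's
// actual implementation in ChatScreen.tsx may differ.

interface ChatMessage {
  id: string;
  role: 'user' | 'assistant';
  text: string;
  streaming: boolean;
}

// Append a partial response chunk to the in-flight assistant message,
// or start a new assistant message if none is streaming yet.
function appendStreamChunk(history: ChatMessage[], chunk: string): ChatMessage[] {
  const last = history[history.length - 1];
  if (last && last.role === 'assistant' && last.streaming) {
    return [...history.slice(0, -1), { ...last, text: last.text + chunk }];
  }
  return [
    ...history,
    { id: String(history.length), role: 'assistant', text: chunk, streaming: true },
  ];
}

// Mark the in-flight assistant message as complete.
function finishStream(history: ChatMessage[]): ChatMessage[] {
  return history.map(m => (m.streaming ? { ...m, streaming: false } : m));
}
```

Keeping the history update as a pure function makes it easy to drive from a `useState` setter or a reducer as each streamed token arrives.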

useLlmInference Hook

  • Wraps MediaPipe LLM functionality
  • Handles model initialization
  • Manages response generation
  • Provides loading states

Configuration

The demo uses these default LLM parameters:

  • Max Tokens: 512
  • Temperature: 0.8
  • Top-K: 40
  • Top-P: 0.9

These can be modified in WelcomeScreen.tsx.
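The defaults above can be expressed as a single config object; this is a sketch, and the exact field names passed to the native module are assumptions (check WelcomeScreen.tsx for the real shape):

```typescript
// Default generation parameters used by the demo (values from the
// Configuration section); the field names here are assumptions.
interface GenerationConfig {
  maxTokens: number;   // hard cap on tokens generated per response
  temperature: number; // higher values produce more random sampling
  topK: number;        // sample only from the K most likely tokens
  topP: number;        // nucleus sampling probability mass
}

const defaultGenerationConfig: GenerationConfig = {
  maxTokens: 512,
  temperature: 0.8,
  topK: 40,
  topP: 0.9,
};
```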

Troubleshooting

Model Not Loading

  • Ensure you downloaded the correct .task file
  • Check file permissions
  • Verify sufficient device storage

App Crashes

  • Restart the development server
  • Clear React Native cache: npx react-native start --reset-cache
  • Reinstall dependencies

Performance Issues

  • Close other apps to free memory
  • Use a physical device for better performance
  • Ensure the model file isn't corrupted

Technical Notes

  • Model Size: The Gemma 3N model is approximately 2GB
  • Memory Usage: Requires ~4GB RAM for optimal performance
  • Inference Speed: Varies by device hardware capabilities
  • Storage: Ensure 3GB+ free space for model and cache
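Given the ~2GB model and the 3GB+ free-space guidance above, a pre-flight storage check before initialization may help. A minimal sketch, assuming a 50% cache headroom factor (an assumption, not a documented requirement); in the real app the free-byte count would come from a filesystem library such as react-native-fs:

```typescript
// Rough pre-flight storage check: require the model size plus 50%
// headroom for cache, matching the "3GB+ free for a ~2GB model"
// guidance above. The headroom factor is an assumption.
const GB = 1024 ** 3;

function hasEnoughStorage(
  freeBytes: number,
  modelBytes: number,
  headroom = 0.5
): boolean {
  return freeBytes >= modelBytes * (1 + headroom);
}
```

For example, `hasEnoughStorage(3 * GB, 2 * GB)` passes, while only 2GB free for a 2GB model would fail the check.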

Next Steps

This demo provides a foundation for:

  • Production chat applications
  • Custom model integrations
  • Advanced LLM features
  • Performance optimizations

Support

For issues specific to this demo, please check the main project README or create an issue in the repository.


Note: This is a demonstration app. For production use, implement proper error handling, security measures, and performance optimizations.