
@ismail-kattakath/mediapipe-react

v0.2.0


Production-ready React hooks for MediaPipe AI tasks — GenAI, Vision, and Audio.


Features

  • 🧊 React-first API: Clean, hooks-based interface
  • 🚀 Next.js Optimized: Built-in SSR safety and App Router support
  • 📦 Tree-shakable Subpaths: Only bundle what you use (e.g., /genai)
  • 🛠️ Fully Typed: Written in TypeScript for excellent DX
  • Web Worker Support: Heavy inference runs in background threads

Installation

pnpm add @ismail-kattakath/mediapipe-react
# or
npm install @ismail-kattakath/mediapipe-react
# or
yarn add @ismail-kattakath/mediapipe-react

[!NOTE] You'll also need to install the specific MediaPipe task package for the features you use:

pnpm add @mediapipe/tasks-genai  # For GenAI features

Quick Start

Vite / Vanilla React

// main.tsx
import { StrictMode } from "react";
import { createRoot } from "react-dom/client";
import { MediaPipeProvider } from "@ismail-kattakath/mediapipe-react";
import App from "./App";

createRoot(document.getElementById("root")!).render(
  <StrictMode>
    <MediaPipeProvider>
      <App />
    </MediaPipeProvider>
  </StrictMode>,
);

// App.tsx
import { useLlm } from "@ismail-kattakath/mediapipe-react/genai";

export default function App() {
  const { generate, output, isLoading } = useLlm({
    modelPath: "/models/gemma-2b-it-gpu-int4.bin",
  });

  return (
    <div>
      <button
        onClick={() => generate("Explain React hooks")}
        disabled={isLoading}
      >
        Generate
      </button>
      <pre>{output}</pre>
    </div>
  );
}

Next.js (App Router)

// app/layout.tsx
import { MediaPipeProvider } from "@ismail-kattakath/mediapipe-react";

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body>
        <MediaPipeProvider>{children}</MediaPipeProvider>
      </body>
    </html>
  );
}

// app/components/ChatBox.tsx
"use client";

import { useLlm } from "@ismail-kattakath/mediapipe-react/genai";

export default function ChatBox() {
  const { generate, output, isLoading } = useLlm({
    modelPath: "/models/gemma-2b-it-gpu-int4.bin",
  });

  return (
    <div>
      <button onClick={() => generate("Hello!")} disabled={isLoading}>
        Send
      </button>
      <p>{output}</p>
    </div>
  );
}

Subpath Strategy

This library uses subpath exports to keep your bundle size minimal. Import only the features you need:

| Subpath | Purpose | Example |
| ------- | ------- | ------- |
| @ismail-kattakath/mediapipe-react | Core provider and utilities | import { MediaPipeProvider } from "@ismail-kattakath/mediapipe-react" |
| @ismail-kattakath/mediapipe-react/genai | LLM inference and GenAI hooks | import { useLlm } from "@ismail-kattakath/mediapipe-react/genai" |
| @ismail-kattakath/mediapipe-react/vision | Vision tasks (planned) | import { useHandTracking } from "@ismail-kattakath/mediapipe-react/vision" |
| @ismail-kattakath/mediapipe-react/audio | Audio tasks (planned) | import { useAudioClassifier } from "@ismail-kattakath/mediapipe-react/audio" |

[!TIP] Using subpaths ensures that importing /genai won't bundle vision or audio code, reducing your final bundle size.

API Reference

Core

MediaPipeProvider

The root provider that supplies configuration to all MediaPipe hooks.

Props:

| Prop | Type | Default | Description |
| ---- | ---- | ------- | ----------- |
| wasmPath | string (optional) | undefined | Custom path to MediaPipe WASM files |
| modelPath | string (optional) | undefined | Default model path for all hooks |
| children | ReactNode | — | Your app components |

Example:

<MediaPipeProvider wasmPath="/wasm" modelPath="/models/default.bin">
  <App />
</MediaPipeProvider>

useMediaPipeContext

Access the MediaPipe context from any child component.

Returns:

{
  wasmPath?: string;
  modelPath?: string;
}

Example:

import { useMediaPipeContext } from "@ismail-kattakath/mediapipe-react";

function MyComponent() {
  const { modelPath } = useMediaPipeContext();
  return <div>Model path: {modelPath}</div>;
}

GenAI

useLlm

Hook for LLM inference with Web Worker orchestration.

Parameters:

interface UseLlmOptions {
  modelPath: string; // Path to .bin or .task model file
  maxTokens?: number; // Max tokens to generate (default: 512)
  temperature?: number; // Sampling temperature (default: 0.8)
  topK?: number; // Top-K sampling (default: 40)
  randomSeed?: number; // Random seed for reproducibility
}

Returns:

{
  generate: (prompt: string) => void;
  output: string;              // Generated text
  isLoading: boolean;          // Whether inference is running
  progress: number;            // Progress percentage (0-100)
  error: string | null;        // Error message if any
}

Example:

import { useLlm } from "@ismail-kattakath/mediapipe-react/genai";

function ChatInterface() {
  const { generate, output, isLoading, error } = useLlm({
    modelPath: "/models/gemma-2b-it-gpu-int4.bin",
    maxTokens: 1024,
    temperature: 0.7,
  });

  const handleSubmit = (prompt: string) => {
    generate(prompt);
  };

  if (error) return <div>Error: {error}</div>;

  // Run inference on submit rather than on every keystroke.
  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        handleSubmit(new FormData(e.currentTarget).get("prompt") as string);
      }}
    >
      <textarea name="prompt" />
      <button type="submit" disabled={isLoading}>
        Send
      </button>
      {isLoading && <p>Generating...</p>}
      <pre>{output}</pre>
    </form>
  );
}

Vision

useFaceLandmarker

Hook for real-time face landmark detection.

Parameters:

interface UseFaceLandmarkerOptions {
  modelPath?: string; // Path to face_landmarker.task
  wasmPath?: string; // Path to wasm files
}

Returns:

{
  detect: (input: HTMLVideoElement | HTMLCanvasElement | ImageData, timestamp: number) => void;
  results: { faceLandmarks: { x: number; y: number; z: number }[][] } | null;
  isLoading: boolean;
  error: string | null;
}

Example:

import {
  useFaceLandmarker,
  drawLandmarks,
} from "@ismail-kattakath/mediapipe-react/vision";
import { useEffect, useRef } from "react";

function FaceTracker() {
  const videoRef = useRef<HTMLVideoElement>(null);
  const canvasRef = useRef<HTMLCanvasElement>(null);
  const { detect, results, isLoading } = useFaceLandmarker();

  const processFrame = () => {
    if (videoRef.current && canvasRef.current) {
      const now = performance.now();
      detect(videoRef.current, now);

      const ctx = canvasRef.current.getContext("2d");
      if (ctx) {
        ctx.clearRect(0, 0, canvasRef.current.width, canvasRef.current.height);
        drawLandmarks(canvasRef.current, results);
      }

      requestAnimationFrame(processFrame);
    }
  };

  useEffect(() => {
    // Start the detection loop on mount (assumes the video element is
    // already playing an active camera stream).
    const id = requestAnimationFrame(processFrame);
    return () => cancelAnimationFrame(id);
  }, []);

  return (
    <>
      <video ref={videoRef} autoPlay playsInline />
      <canvas ref={canvasRef} />
    </>
  );
}

drawLandmarks

Utility to draw face landmarks on a canvas.

drawLandmarks(canvas: HTMLCanvasElement, results: unknown): void
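Because the `results` value is plain data (arrays of normalized `{ x, y, z }` points), it is easy to post-process before or instead of drawing. A minimal sketch using a hypothetical `boundingBox` helper (not part of the library):

```typescript
// Hypothetical helper (not part of the library): compute the normalized
// bounding box of one face's landmarks, e.g. for cropping or hit-testing.
interface Landmark {
  x: number;
  y: number;
  z: number;
}

function boundingBox(landmarks: Landmark[]) {
  const xs = landmarks.map((p) => p.x);
  const ys = landmarks.map((p) => p.y);
  return {
    minX: Math.min(...xs),
    minY: Math.min(...ys),
    maxX: Math.max(...xs),
    maxY: Math.max(...ys),
  };
}
```

You could feed `results.faceLandmarks[0]` to such a helper to position an overlay element over the detected face.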

Next.js / SSR Compatibility

This library is fully compatible with Next.js App Router and Server-Side Rendering.

How It Works

All hooks include an isBrowser() guard that prevents MediaPipe initialization during server-side rendering:

// Internal implementation
function isBrowser(): boolean {
  return typeof window !== "undefined";
}

This means:

  • ✅ No "window is not defined" errors
  • ✅ No hydration mismatches
  • ✅ Works with React Server Components (when used in Client Components)

Best Practices

  1. Always use "use client" directive when using MediaPipe hooks:
"use client";

import { useLlm } from "@ismail-kattakath/mediapipe-react/genai";

export default function MyComponent() {
  // Your code here
}

  2. Wrap your app with MediaPipeProvider in a Client Component:
// app/providers.tsx
"use client";

import { MediaPipeProvider } from "@ismail-kattakath/mediapipe-react";

export function Providers({ children }: { children: React.ReactNode }) {
  return <MediaPipeProvider>{children}</MediaPipeProvider>;
}

// app/layout.tsx
import { Providers } from "./providers";

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html>
      <body>
        <Providers>{children}</Providers>
      </body>
    </html>
  );
}

  3. Serve model files from the public/ directory:
public/
└── models/
    ├── gemma-2b-it-gpu-int4.bin
    └── llama-3-8b.task

Then reference them with absolute paths:

const { generate } = useLlm({
  modelPath: "/models/gemma-2b-it-gpu-int4.bin",
});

Advanced Usage

Custom WASM Path

If you're hosting MediaPipe WASM files on a CDN:

<MediaPipeProvider wasmPath="https://cdn.example.com/mediapipe/wasm">
  <App />
</MediaPipeProvider>

Error Handling

const { generate, error } = useLlm({ modelPath: "/models/model.bin" });

useEffect(() => {
  if (error) {
    console.error("LLM Error:", error);
    // Show user-friendly error message
  }
}, [error]);

Progress Tracking

const { generate, progress, isLoading } = useLlm({
  modelPath: "/models/model.bin",
});

return <div>{isLoading && <progress value={progress} max={100} />}</div>;

Troubleshooting

"Failed to load model"

  • Ensure the model file exists at the specified path
  • Check that the file is served with the correct MIME type
  • Verify the model format is compatible (.bin or .task)
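The last check can be scripted before a path ever reaches `useLlm`; a small hypothetical helper (not part of the library) that validates the file extension:

```typescript
// Hypothetical helper (not part of the library): verify a model path uses a
// format the hooks accept (.bin or .task) before handing it to useLlm.
function isSupportedModelPath(path: string): boolean {
  return /\.(bin|task)$/i.test(path);
}
```

For example, `isSupportedModelPath("/models/gemma-2b-it-gpu-int4.bin")` passes, while an `.onnx` path is rejected early with a clearer error than a failed model load.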

"Worker initialization failed"

  • Ensure your bundler supports Web Workers
  • For Vite, no additional config needed
  • For Next.js, ensure you're using Next.js 13+ with App Router
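As a quick diagnostic (a sketch, not library code), you can check whether the current environment exposes the `Worker` constructor at all before relying on background inference:

```typescript
// Sketch (not part of the library): detect whether the Worker constructor is
// available in the current environment; if not, background inference cannot run.
function supportsWebWorkers(): boolean {
  return typeof Worker !== "undefined";
}
```

In a browser this returns true; during SSR or in a bare Node.js process it returns false, which is one reason the library guards initialization behind a browser check.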

Bundle size is too large

  • Use subpath imports: @ismail-kattakath/mediapipe-react/genai instead of the root import
  • Ensure tree-shaking is enabled in your bundler
  • Check that you're not importing unused subpaths

TypeScript Support

This library is written in TypeScript and ships with full type definitions, so no additional @types packages are needed.

import type { UseLlmOptions } from "@ismail-kattakath/mediapipe-react/genai";

const config: UseLlmOptions = {
  modelPath: "/models/model.bin",
  maxTokens: 512,
};

Contributing

This package is part of a monorepo. For contribution guidelines, see the main repository README.

License

MIT © Ismail Kattakath