
@garrix82/reactgenie-lib

v1.3.6

Published

A Toolkit for Multimodal Applications

Readme

ReactGenie

React, React Native, and Expo integration for multimodal apps built on @omkarfork/reactgenie-dsl


Fork notice: this package snapshot is maintained as a thesis-project fork by Omkar Mirgal. For the original ReactGenie project, see StanfordHCI/ReactGenie.

Introduction

@omkarfork/reactgenie-lib is the React integration layer for @omkarfork/reactgenie-dsl.

It gives you the pieces needed to turn Genie DSL classes into a working UI runtime:

  • decorator and base-class re-exports for Genie models
  • a Redux-backed runtime bootstrap with initReactGenie()
  • visible-view registration with GenieClassInterface(...)
  • multimodal execution through ModalityProvider
  • speech recognition support for web and native runtimes
  • Expo Router integration helpers
  • platform-aware voice UI, overlays, logging, and telemetry hooks

This README covers the current package surface, installation, quick start, and feature map. For deeper architecture, advanced routing details, and lower-level implementation notes, see DEVELOPER_GUIDE.md.

Feature Overview

ReactGenie currently includes:

  • React-oriented re-exports of the DSL decorators and base classes: GenieClass, GenieFunction, GenieProperty, GenieKey, DataClass, HelperClass, int, float
  • runtime bootstrap with initReactGenie()
  • shared store access with useGenieSelector() and useGenieCodeSelector()
  • UI-to-object registration with GenieClassInterface(...)
  • multimodal orchestration with ModalityProvider
  • contextual object resolution through injected Current() and AllCurrent()
  • speech recognition adapters for:
    • Groq on web
    • Groq on native via optional Expo audio/filesystem support
    • OpenAI realtime transcription on web
    • OpenAI realtime transcription on native via optional WebRTC support
    • browser Web Speech fallback for Groq flows
    • local MLX Whisper on web
  • voice UI primitives including VoiceRecognitionBar, AudioVisualizer, and AudioLevelIndicator
  • the useSpeechRecognition() hook for custom voice UI
  • Expo Router helpers: useExpoRouterAdapter(), withExpoRouterAdapter(), useRegisterGenieRoute(), registerGenieRoute()
  • platform abstraction exports for overlay, button, transcript, event, and safe-container helpers
  • runtime diagnostics with configureLogger(), setLogLevel(), setLoggerTelemetryBridge(), and voice-pipeline telemetry bridges
  • built-in helper classes DateTime and TimeDelta
  • typed user-facing errors with GenieUserFacingError
  • CLI binaries for prompt and parser experiments: get-prompt, dry-run, set-script, parse

Installation

Install the library and the required peer dependencies for any ReactGenie app:

npm install @omkarfork/reactgenie-lib @omkarfork/reactgenie-dsl react-redux redux reflect-metadata

If your app uses React Navigation directly, install the expected navigation peers:

npm install @react-navigation/native @react-navigation/native-stack @react-navigation/stack

Optional peers depend on which runtime paths you want to enable:

  • expo-router for file-based route integration
  • expo-audio and expo-file-system for native Groq speech capture
  • react-native-webrtc for native OpenAI realtime transcription
  • expo-speech if your native app speaks responses aloud

TypeScript and Babel setup

ReactGenie depends on decorator metadata. Enable the same pipeline used by ReactGenieDSL.

TypeScript:

{
  "compilerOptions": {
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "useDefineForClassFields": false
  }
}

Babel:

plugins: [
  ["@babel/plugin-proposal-decorators", { legacy: true }],
  ["@babel/plugin-transform-class-properties", { loose: true }],
  "babel-plugin-parameter-decorator",
  "babel-plugin-reactgenie"
]

Import reflect-metadata before any decorated Genie classes are evaluated.
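As a sketch of this ordering (file and module names here are assumptions, not part of the package), an entry file would keep the metadata polyfill first:

```typescript
// index.ts — load the metadata polyfill before any decorated module is evaluated
import "reflect-metadata";

import "./counter"; // defines @GenieClass-decorated models
import { App } from "./App";
```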

Quick Start

The example below matches the current package API.

1. Define your Genie model

import "reflect-metadata";
import {
  DataClass,
  GenieClass,
  GenieFunction,
  GenieKey,
  GenieProperty,
  int,
} from "@omkarfork/reactgenie-lib";

@GenieClass("A counter")
export class Counter extends DataClass {
  @GenieKey
  @GenieProperty("Stable counter name")
  name: string;

  @GenieProperty("Current count")
  count: int;

  static Examples = [
    {
      user_utterance: "increment this counter",
      example_parsed: "Counter.Current().increment()",
    },
    {
      user_utterance: "what is the count",
      example_parsed: "Counter.Current().count",
    },
  ];

  constructor({ name, count = 0 }: { name: string; count?: int }) {
    super({ name });
    this.name = name;
    this.count = count;
  }

  static setup() {
    Counter.CreateObject({ name: "apples", count: 2 });
    Counter.CreateObject({ name: "oranges", count: 5 });
  }

  @GenieFunction("Increment the counter")
  increment(): this {
    this.count += 1;
    return this;
  }
}

2. Register a renderable view

import { Pressable, Text, View } from "react-native";
import {
  GenieClassInterface,
  genieDispatch,
  useGenieSelector,
} from "@omkarfork/reactgenie-lib";
import { Counter } from "./counter";

type CounterViewProps = {
  name: string;
};

function CounterView({ name }: CounterViewProps) {
  const counter = useGenieSelector(() => Counter.GetObject({ name }));
  const count = useGenieSelector(() => counter.count);

  return (
    <View>
      <Text>{name}</Text>
      <Text>{count}</Text>
      <Pressable onPress={() => genieDispatch(() => counter.increment())}>
        <Text>Increment</Text>
      </Pressable>
    </View>
  );
}

export const CounterCard = GenieClassInterface(
  "Counter",
  ({ name }: CounterViewProps) => `${name} counter`,
  1
)(CounterView);

GenieClassInterface(...) currently takes:

GenieClassInterface(
  type: string,
  displayTitle?: string | ((props: any) => string),
  displayPriority?: number | ((props: any) => number)
)

Important: the registered view class name comes from the wrapped component function name (CounterView above), not from the displayTitle argument.
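To make the naming rule concrete, here is an illustration-only mock of a registry keyed on the wrapped component's function name. `registerView` and `ViewRegistry` are hypothetical helpers for this sketch, not exports of the package; the point is only that `Function.name` comes from the declaration, not the variable the wrapped result is assigned to.

```typescript
// Illustration only: a mock registry keyed on the component function's name.
type ViewRegistry = Map<string, Function>;

function registerView(registry: ViewRegistry, component: Function): string {
  // The key is the function's own declared name ("CounterView"),
  // not the name of the exported variable holding the wrapped result.
  const key = component.name;
  registry.set(key, component);
  return key;
}

function CounterView(props: { name: string }): string {
  return `${props.name} counter`;
}

// Even if the wrapped export is called CounterCard, the registry key stays "CounterView".
const registry: ViewRegistry = new Map();
const key = registerView(registry, CounterView);
console.log(key); // "CounterView"
```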

3. Initialize the shared runtime

import { initReactGenie } from "@omkarfork/reactgenie-lib";
import { Counter } from "./counter";

export const reactGenieStore = initReactGenie();

Counter.setup();

Call setup() yourself after initReactGenie(). Older examples implied that setup ran automatically; the current implementation expects this explicit initialization order.

4. Mount the provider

import { Provider } from "react-redux";
import {
  ModalityProvider,
  type SpeechRecognitionConfig,
} from "@omkarfork/reactgenie-lib";
import type { NlInterpreterParserConfig } from "@omkarfork/reactgenie-dsl";
import { reactGenieStore } from "./store";
import { CounterCard } from "./CounterCard";

const parserConfig: NlInterpreterParserConfig = {
  provider: "groq",
};

const speechConfig: SpeechRecognitionConfig = {
  provider: "groq",
  groq: {
    apiBaseUrl: "http://localhost:4000",
  },
};

export function App() {
  return (
    <Provider store={reactGenieStore}>
      <ModalityProvider
        clientSecret="proxy-or-provider-token"
        apiBaseUrl="http://localhost:4000"
        parserConfig={parserConfig}
        speechConfig={speechConfig}
        displayTranscript={true}
      >
        <CounterCard name="apples" />
      </ModalityProvider>
    </Provider>
  );
}

Core Concepts

@GenieClass

Marks a class as part of the Genie model surface and contributes prompt metadata.

@GenieClass("A counter")
export class Counter extends DataClass {}

DataClass

Use DataClass for persisted Genie objects with stable keys. These objects can be created with CreateObject(...), retrieved with GetObject(...), enumerated with All(), and targeted through UI-aware helpers such as Current() and AllCurrent() after ReactGenie initialization.
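As an illustration of this access pattern only, the sketch below mimics the CreateObject / GetObject / All semantics with a plain in-memory map. The real DataClass is backed by the shared Redux store and prompt metadata; the `CounterStore` class and its method names are assumptions made for this example.

```typescript
// Illustration only: a minimal in-memory sketch of the DataClass access pattern.
type CounterRecord = { name: string; count: number };

class CounterStore {
  private records = new Map<string, CounterRecord>();

  // CreateObject(...): create a record keyed on the @GenieKey field.
  createObject(init: { name: string; count?: number }): CounterRecord {
    const record = { name: init.name, count: init.count ?? 0 };
    this.records.set(record.name, record);
    return record;
  }

  // GetObject(...): look up one record by its key.
  getObject(query: { name: string }): CounterRecord | undefined {
    return this.records.get(query.name);
  }

  // All(): enumerate every record of this class.
  all(): CounterRecord[] {
    return [...this.records.values()];
  }
}

const store = new CounterStore();
store.createObject({ name: "apples", count: 2 });
store.createObject({ name: "oranges", count: 5 });
console.log(store.getObject({ name: "apples" })?.count); // 2
console.log(store.all().length); // 2
```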

HelperClass

Use HelperClass for structured helper values nested inside data objects. Helper classes are serialized through parent data objects and are not standalone persisted records.

@GenieKey

Marks the key field for a DataClass. Each DataClass needs a Genie key.

@GenieKey
@GenieProperty()
name: string;

@GenieProperty

Marks a field as part of the serializable Genie state and prompt-visible model.

@GenieProperty("Current count")
count: int;

@GenieFunction

Marks a static or instance method as callable from Genie-generated DSL / multimodal flows.

@GenieFunction("Increment the counter")
increment(): this {
  this.count += 1;
  return this;
}

CreateObject(...)

Create Genie objects through CreateObject(...), not new.

Counter.CreateObject({ name: "apples", count: 2 });

GetObject(...)

Fetch a DataClass instance by its key.

const apples = Counter.GetObject({ name: "apples" });

Current() and AllCurrent()

After initReactGenie(), ReactGenie injects:

  • Current(): the single visible object the user is most likely referring to
  • AllCurrent(): all visible objects of that class, or the nearest matches when the user has tapped multiple objects

These helpers are driven by visible registered interfaces, so they only work when the corresponding UI is wrapped with GenieClassInterface(...).

GenieClassInterface(...)

Wrap any component that should be:

  • targetable by voice results
  • visible to Current() / AllCurrent()
  • eligible for result-driven navigation

ModalityProvider

ModalityProvider is the main runtime entry point. It creates the shared interpreter, merges examples, listens for speech, parses commands, executes DSL, emits user-facing responses, and routes displayable results into registered views.

Key props in the current implementation:

  • examples: extra prompt examples to merge with class-level static Examples
  • clientSecret: required credential or client token used by parser / speech providers
  • apiBaseUrl: optional shared proxy base URL
  • parserConfig: ReactGenieDSL parser configuration
  • speechConfig: speech runtime configuration
  • speechLanguage: recognition language tag
  • displayTranscript: show transcript text in the default voice UI
  • extraPrompt: append extra instructions to the parser prompt
  • VoiceUIComponent: replace the default VoiceRecognitionBar
  • showFloatingMicButton: show or hide the default floating mic button
  • voicePlaceholder: custom placeholder text for the default voice UI
  • loggerConfig: override runtime logging configuration
  • langsmithConfig: optional monitor configuration object with apiKey, project, and endpoint

If you are migrating from older examples, note that the current prop name is clientSecret, not codexApiKey.

Speech Recognition

ReactGenie ships a unified speech layer and individual adapters.

Supported speech paths

| Path | Runtime | Export | Notes |
| --- | --- | --- | --- |
| Groq web transcription | Web | GroqSpeechRecognizer | Default provider path on web when configured. |
| Groq native transcription | iOS / Android | selected through UnifiedSpeechRecognizer | Requires optional expo-audio and expo-file-system. |
| OpenAI realtime transcription | Web | OpenAISpeechRecognizer | WebRTC-based realtime transcription. |
| OpenAI realtime transcription | iOS / Android | selected through UnifiedSpeechRecognizer | Requires optional react-native-webrtc. |
| Browser Web Speech fallback | Web | SpeechRecognizer | Used as a Groq fallback when enabled. |
| Local MLX Whisper | Web | MLXSpeechRecognizer | Useful for local speech pipelines. |

Unified speech helpers

The main speech exports are:

  • UnifiedSpeechRecognizer
  • getSpeechRecognitionConfig()
  • selectSpeechRecognitionMethod()
  • getSpeechRecognitionStatus()
  • useSpeechRecognition()
  • VoiceRecognitionBar

SpeechRecognitionConfig

The current config shape is:

type SpeechRecognitionConfig = {
  provider?: "groq" | "openai";
  fallbackToWebSpeech?: boolean;
  groq?: {
    enabled?: boolean;
    apiKey?: string;
    apiBaseUrl?: string;
    transcriptionEndpoint?: string;
    stream?: boolean;
  };
  openai?: {
    proxyBaseUrl?: string;
    transcriptionSessionEndpoint?: string;
    translationEndpoint?: string;
    realtimeWebRtcUrl?: string;
    realtimeModel?: "gpt-4o-transcribe" | "gpt-4o-mini-transcribe";
    translationModel?: string;
    vadEnabled?: boolean;
    autoStopOnVadSilence?: boolean;
    vadThreshold?: number;
    vadPrefixPaddingMs?: number;
    vadSilenceDurationMs?: number;
    vadIdleTimeoutMs?: number;
    prewarmOnMount?: boolean;
  };
  localModel?: {
    enabledOnWeb?: boolean;
    baseUrl?: string;
    mode?: "oneshot" | "sse" | "realtime";
    model?: string;
  };
};

Defaults in the current implementation include:

  • speech provider defaults to groq
  • Groq defaults to streaming mode
  • Groq defaults to fallbackToWebSpeech: true
  • OpenAI defaults to realtimeModel: "gpt-4o-transcribe"
  • OpenAI defaults to prewarmOnMount: true
  • local MLX defaults to disabled on web unless explicitly enabled
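One way to picture these defaults is as a merge over a partial config. The sketch below is an assumption-laden illustration: `resolveSpeechConfig` is a hypothetical helper (not a package export) that applies exactly the documented defaults to the fields listed above.

```typescript
// Sketch only: applying the documented defaults to a partial config.
// `resolveSpeechConfig` is hypothetical and covers only the defaulted fields.
type SpeechRecognitionConfigSketch = {
  provider?: "groq" | "openai";
  fallbackToWebSpeech?: boolean;
  groq?: { stream?: boolean };
  openai?: { realtimeModel?: "gpt-4o-transcribe" | "gpt-4o-mini-transcribe"; prewarmOnMount?: boolean };
  localModel?: { enabledOnWeb?: boolean };
};

function resolveSpeechConfig(config: SpeechRecognitionConfigSketch = {}) {
  return {
    provider: config.provider ?? "groq",                      // provider defaults to groq
    fallbackToWebSpeech: config.fallbackToWebSpeech ?? true,  // Groq falls back to Web Speech
    groq: { stream: config.groq?.stream ?? true },            // Groq streams by default
    openai: {
      realtimeModel: config.openai?.realtimeModel ?? "gpt-4o-transcribe",
      prewarmOnMount: config.openai?.prewarmOnMount ?? true,
    },
    localModel: {
      enabledOnWeb: config.localModel?.enabledOnWeb ?? false, // MLX off unless enabled
    },
  };
}

const resolved = resolveSpeechConfig({ provider: "openai" });
console.log(resolved.provider);                // "openai"
console.log(resolved.groq.stream);             // true
console.log(resolved.localModel.enabledOnWeb); // false
```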

Routing and Result Display

ReactGenie can turn displayable execution results into navigation events when the target object type has a registered interface.

Expo Router support is available through:

  • useExpoRouterAdapter()
  • withExpoRouterAdapter()
  • useRegisterGenieRoute()
  • registerGenieRoute()
  • isExpoRouterAvailable()
  • getRegisteredRoutes()
  • getRouteMap()

Example:

import { useRegisterGenieRoute } from "@omkarfork/reactgenie-lib";

export default function CounterRoute() {
  useRegisterGenieRoute("CounterView");
  return <CounterCard name="apples" />;
}

Notes:

  • the route key must match the wrapped component function name, not necessarily the exported variable name
  • duplicate route registration for the same view class and different paths throws
  • dynamic route params are filtered so Genie metadata does not become arbitrary query params
  • result-window and query-intent metadata are stored in shared state for downstream reconstruction
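The param-filtering note above can be sketched as a plain function. This is an illustration of the idea only: the real filtering logic and the metadata key names used here (`genieResultId`, `genieQueryIntent`) are assumptions, not the package's actual identifiers.

```typescript
// Illustration only: strip internal Genie metadata keys from route params
// so they never leak into the visible URL as arbitrary query params.
const GENIE_METADATA_KEYS = new Set(["genieResultId", "genieQueryIntent"]); // assumed names

function filterRouteParams(
  params: Record<string, string>
): Record<string, string> {
  return Object.fromEntries(
    Object.entries(params).filter(([key]) => !GENIE_METADATA_KEYS.has(key))
  );
}

const filtered = filterRouteParams({
  name: "apples",
  genieResultId: "r-42",
});
console.log(Object.keys(filtered)); // ["name"]
```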

Public API At a Glance

Runtime and store

  • ModalityProvider
  • GenieInterpreter
  • initReactGenie
  • useGenieSelector
  • useGenieCodeSelector
  • resetGenieNavigation
  • genieDispatch
  • sharedStore
  • sharedState

Genie model and decorators

  • GenieClass
  • GenieClassInterface
  • GenieFunction
  • GenieProperty
  • GenieKey
  • DataClass
  • HelperClass
  • AllGenieObjects
  • ClassDescriptor
  • FieldDescriptor
  • FuncDescriptor
  • ParamDescriptor
  • int
  • float
  • DateTime
  • TimeDelta

Voice and speech

  • useSpeechRecognition
  • SpeechState
  • VoiceRecognitionBar
  • SpeechRecognizer
  • GroqSpeechRecognizer
  • OpenAISpeechRecognizer
  • MLXSpeechRecognizer
  • UnifiedSpeechRecognizer
  • isGroqSpeechAvailable
  • getGroqSpeechCapabilities
  • isOpenAISpeechAvailable
  • getOpenAISpeechCapabilities
  • getSpeechRecognitionConfig
  • selectSpeechRecognitionMethod
  • getSpeechRecognitionStatus
  • streamGroqTranscription
  • mergeTranscriptParts
  • AudioVisualizer
  • AudioLevelIndicator

Routing and platform support

  • useExpoRouterAdapter
  • withExpoRouterAdapter
  • useRegisterGenieRoute
  • registerGenieRoute
  • isExpoRouterAvailable
  • getRegisteredRoutes
  • getRouteMap
  • platform-aware exports from ./platform, including overlay, button, transcript, loading, safe-container, event, and type helpers

Diagnostics and errors

  • configureLogger
  • setLogLevel
  • logger
  • setLoggerTelemetryBridge
  • setVoicePipelineTelemetryBridge
  • getVoicePipelineTelemetryBridge
  • GenieUserFacingError
  • isGenieUserFacingError

Types and compatibility exports

  • ReactGenieState
  • NavigatorState
  • QueryIntent
  • QueryIntentOperation
  • ResultWindowEntry
  • SpeechRecognitionState
  • SpeechRecognitionControls
  • UseSpeechRecognitionReturn
  • VoiceRecognitionBarProps
  • SpeechRecognitionConfig
  • UnifiedSpeechRecognizerProps
  • GroqConfigProps
  • OpenAIConfigProps
  • LocalModelConfigProps
  • LoggerRuntimeConfig
  • LogLevel
  • LoggerTelemetryBridge
  • VoicePipelineContext
  • VoicePipelineHandle
  • VoicePipelineOutcome
  • VoicePipelineStage
  • VoicePipelineStageHandle
  • VoicePipelineStageStatus
  • VoicePipelineTelemetryBridge
  • RouteRegistrationInfo
  • NativeSpeechRecognizerProps and OpenAINativeSpeechRecognizerProps as type-only exports
  • ReactFromModule
  • ReactReduxFromModule

Custom Voice UI

If you do not want the default floating mic and voice bar, you can:

  • pass showFloatingMicButton={false}
  • supply your own VoiceUIComponent
  • build a fully custom interface with useSpeechRecognition()

The default UI components already handle transcript display, listening state, processing state, and waveform preview.

Logging and Telemetry

ReactGenie includes runtime logging and voice-pipeline instrumentation hooks:

  • configureLogger(...) to enable or tune runtime sinks
  • setLogLevel(...) for environment-specific noise control
  • setLoggerTelemetryBridge(...) to forward logger events
  • setVoicePipelineTelemetryBridge(...) to observe mic / parse / execute pipeline stages

These hooks are useful when you want app-level observability without forking the runtime.
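As a standalone sketch of the log-level filtering pattern these hooks provide (this is a mock, not the package's logger; the class and sink are assumptions):

```typescript
// Sketch only: level-filtered logging, mocked as a tiny standalone logger.
const LEVELS = ["debug", "info", "warn", "error"] as const;
type Level = (typeof LEVELS)[number];

class SketchLogger {
  private minLevel: Level = "info";
  public sink: string[] = [];

  // Analogous in spirit to setLogLevel(...): raise or lower the noise floor.
  setLogLevel(level: Level): void {
    this.minLevel = level;
  }

  log(level: Level, message: string): void {
    // Drop anything below the configured minimum level.
    if (LEVELS.indexOf(level) < LEVELS.indexOf(this.minLevel)) return;
    this.sink.push(`[${level}] ${message}`);
  }
}

const log = new SketchLogger();
log.log("debug", "dropped");   // below the default "info" level, filtered out
log.log("error", "kept");
log.setLogLevel("debug");
log.log("debug", "kept now");
console.log(log.sink.length); // 2
```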

CLI Utilities

The published package exposes four binaries:

  • get-prompt
  • dry-run
  • set-script
  • parse

These are oriented toward local prompt-generation and parser experiments around Genie test harnesses.

Common Notes

  • Create Genie objects with CreateObject(...), not new.
  • Import all Genie classes and registered interfaces before calling initReactGenie().
  • Wrap visible targetable components with GenieClassInterface(...); otherwise Current() / AllCurrent() cannot resolve them.
  • Register routes when using Expo Router if you expect displayable results to navigate.
  • If native speech does not work, check whether the corresponding optional peer dependencies are installed for your selected provider path.

License

Apache-2.0

This package is a thesis-project fork maintained by Omkar Mirgal, based on the original ReactGenie project by StanfordHCI; original upstream attribution includes Jackie Yang. See the LICENSE file for the full license text.