react-native-nai
v0.1.2
react-native-nai
On-device AI capabilities for React Native using Apple's native LLM APIs, built with Nitro Modules.
Note: This package is inspired by @react-native-ai/apple.
Features
- 🚀 On-Device AI: Run Apple's LLMs directly on iOS devices
- ⚡ High Performance: Built with Nitro Modules for native speed
- 🔄 Streaming Support: Real-time text generation with streaming
- 🛠️ Tool Calling: Support for function calling and tools
- 🎯 AI SDK Integration: Compatible with Vercel AI SDK
- ⚛️ React Hooks: Easy-to-use hooks for chat functionality
Installation
npm install react-native-nai react-native-nitro-modules
# or
yarn add react-native-nai react-native-nitro-modules
# or
bun add react-native-nai react-native-nitro-modules
iOS Setup
cd ios && pod install
Usage
Basic Usage
import { generateText, streamText } from "ai";
import { Alert } from "react-native";
import { AppleLLM } from "react-native-nai";
import { createAppleLLM } from "react-native-nai/ai-sdk";
const model = createAppleLLM();
// Check availability
if (!AppleLLM.isAvailable()) {
Alert.alert("Apple LLM is not available on this device");
return;
}
// Generate text
const result = await generateText({
model: model(),
prompt: "Tell me a joke"
})
// Generate text with streaming
const { textStream } = await streamText({
model: model(),
prompt: "Tell me a joke"
});
for await (const textPart of textStream) {
console.log(textPart)
}
Chat Usage
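The chat example below passes a `prepareMessages` function that prepends a system prompt and keeps only the recent history. As a standalone sketch of that pattern (plain TypeScript; the `Message` type here is a minimal stand-in for the AI SDK's `UIMessage`, and the fixed `"system"` id is a simplification of `generateId()`):

```typescript
// Minimal stand-in for the AI SDK's UIMessage type (assumption:
// only the fields used in this sketch).
type Message = {
  id: string;
  role: "system" | "user" | "assistant";
  parts: { type: "text"; text: string }[];
};

// Prepend a system prompt and keep only the last 10 messages as
// context, mirroring the prepareMessages option used below.
function prepareMessages(msgs: Message[], systemPrompt: string): Message[] {
  const system: Message = {
    id: "system", // the real example uses generateId() from "ai"
    role: "system",
    parts: [{ type: "text", text: systemPrompt }],
  };
  return [system, ...msgs.slice(-10)];
}
```

Trimming the history like this keeps each request within the on-device model's context window.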
import { useState } from "react";
import { Pressable, Text, TextInput, ScrollView, View } from "react-native";
import { generateId, type UIMessage } from "ai";
import { createAppleLLM, Provider, useChatActions, useChatStatus, useLocalChat } from "react-native-nai/ai-sdk";
const model = createAppleLLM();
function Chat() {
return (
<Provider initialMessages={[]}>
<ChatInterface />
</Provider>
);
}
function ChatInterface() {
useLocalChat(model(), {
messages: [],
prepareMessages: (msgs) => [
{
id: generateId(),
role: "system",
parts: [
{
type: "text",
text: `You are Nai, an AI assistant integrated into a mobile app. Now is ${new Date().toISOString()}.`,
},
],
},
// keep the previous 10 messages as context
...msgs.slice(-10),
],
onError: (error) => {
console.log(">> error is", error.message);
},
onFinish: ({ message }) => {
console.log("finished", message);
},
onToolCall: ({ toolCall }) => {
console.log(toolCall);
},
});
return (
<View className="flex-1">
<ChatMessages />
<ChatInput />
</View>
);
}
function ChatMessages() {
const { messages } = useLocalChat();
// use FlatList or another virtualized list for better performance with long histories
return (
<ScrollView>
{messages.map((msg) => (
<Text key={msg.id}>{msg.parts.filter((part) => part.type === "text").map((part) => part.text).join("")}</Text>
))}
</ScrollView>
);
}
function ChatInput() {
const status = useChatStatus();
const { sendMessage } = useChatActions();
const [input, setInput] = useState("");
const onSend = async () => {
const id = generateId();
const uiMessage: UIMessage = {
id,
role: "user",
parts: [{ type: "text", text: input }],
};
setInput("");
await sendMessage(uiMessage);
};
return (
<View className="flex-row items-center gap-2 bg-background p-3">
<TextInput
className="flex-1"
value={input}
onChangeText={setInput}
placeholder="Type a message"
onSubmitEditing={onSend}
/>
<Pressable
className="h-6 items-center justify-center rounded-full bg-accent px-2 disabled:opacity-50"
disabled={status === "streaming"}
onPress={onSend}
>
<Text className="text-accent-foreground">Send</Text>
</Pressable>
</View>
);
}
Tool Calling
Important Apple-Specific Behavior
Tools must be registered with the Apple LLM up front, because tool calling is handled by the OS rather than by the AI SDK. Once tools are registered, you can pass any subset of them to useLocalChat, generateText, and so on.
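The registered-versus-active distinction can be sketched with a plain record of tools (plain TypeScript; the `Tool` and `ToolSet` shapes are minimal stand-ins for the AI SDK types, and `pickTools` is a hypothetical helper, not part of this package):

```typescript
// Minimal stand-ins for the AI SDK's tool types (assumption: only
// the fields needed for this sketch).
type Tool = { description: string };
type ToolSet = Record<string, Tool>;

// Everything registered with the Apple LLM at model-creation time.
const availableTools: ToolSet = {
  weather: { description: "Get current weather for a city." },
  calendar: { description: "Look up calendar events." },
};

// Select a subset of the registered tools for a single call, mirroring
// the `tools` option passed to generateText in the example below.
// (Assumes every name in `names` is actually registered.)
function pickTools(all: ToolSet, names: string[]): ToolSet {
  return Object.fromEntries(names.map((name) => [name, all[name]] as const));
}
```

A call can then activate only `weather` while `calendar` stays registered with the OS but unused for that request.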
import { generateText, type ToolSet, tool, type UIMessage } from "ai";
import { createAppleLLM } from "react-native-nai/ai-sdk";
import z from "zod";
const availableTools: ToolSet = {
weather: tool({
description: "Get current weather information for a city.",
inputSchema: z.object({
city: z.string(),
}),
execute: async ({ city }) =>
`The current weather in ${city} is sunny with a temperature of 25°C.`,
}),
};
const model = createAppleLLM({ availableTools });
const result = await generateText({
model: model(),
prompt: "How's the weather in Taipei?",
tools: {
weather: availableTools.weather,
}
});
License
MIT © Ethan Lin
Acknowledgments
This package is inspired by react-native-ai and brings on-device AI capabilities to React Native applications via Nitro Modules.
