webllm-ai-provider
WebLLM provider for Vercel AI SDK (V2 specification)
Vercel AI SDK provider for WebLLM - let users run language models in the browser.
What is this?
This package provides a Vercel AI SDK provider that uses WebLLM to run LLM requests in the browser, or through the user's preferred cloud APIs.
Installation
```bash
npm install webllm-ai-provider webllm ai
```
Basic Usage
```ts
import { webllm } from 'webllm-ai-provider';
import { generateText } from 'ai';

// Simple usage - WebLLM handles everything
const result = await generateText({
  model: webllm(),
  prompt: 'Explain quantum computing',
});

console.log(result.text);
```
With Preferences
```ts
import { webllm } from 'webllm-ai-provider';
import { generateText } from 'ai';

// Optional preferences for model selection
const result = await generateText({
  model: webllm({
    task: 'coding',
    hints: {
      speed: 'fast',
      quality: 'high',
    },
  }),
  prompt: 'Write a React component',
});
```
Streaming
```ts
import { webllm } from 'webllm-ai-provider';
import { streamText } from 'ai';

// streamText returns its result synchronously; the stream is consumed below
const { textStream } = streamText({
  model: webllm(),
  prompt: 'Write a story about AI',
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```
React/Next.js Example
```ts
// app/api/chat/route.ts
import { streamText } from 'ai';
import { webllm } from 'webllm-ai-provider';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: webllm({
      task: 'general',
      hints: { speed: 'balanced' },
    }),
    messages,
  });

  return result.toDataStreamResponse();
}
```
Optional Preferences (subject to change)
Task Types
- 'general' - General conversation
- 'coding' - Code generation
- 'creative' - Creative writing
- 'qa' - Question answering
- 'summarization' - Text summarization
- 'translation' - Language translation
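For TypeScript users, the documented task values can be captured as a union type. Note that the `TaskType` name and `toTaskType` helper below are illustrative sketches, not exports from this package:

```typescript
// Hypothetical helper types -- not part of the webllm-ai-provider API;
// shown only to illustrate the documented task values.
type TaskType =
  | 'general'
  | 'coding'
  | 'creative'
  | 'qa'
  | 'summarization'
  | 'translation';

const TASK_TYPES: readonly TaskType[] = [
  'general',
  'coding',
  'creative',
  'qa',
  'summarization',
  'translation',
];

// Narrow an arbitrary string to a known task, falling back to 'general'.
function toTaskType(value: string): TaskType {
  return (TASK_TYPES as readonly string[]).includes(value)
    ? (value as TaskType)
    : 'general';
}
```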
Hints
```ts
{
  speed: 'fastest' | 'fast' | 'balanced' | 'quality',
  quality: 'draft' | 'standard' | 'high' | 'best'
}
```
Key Features
✅ Browser-native - Runs entirely in the browser
✅ Privacy-first - Data never leaves the user's device
✅ No API costs - Models run locally on user hardware
✅ Intelligent selection - Automatically picks the best available model
✅ Streaming support - Real-time response generation
Important Notes
- Browser-only: This provider only works in browser environments, not in Node.js
- WebLLM Extension: Users need to install the WebLLM browser extension
- Automatic model selection: WebLLM handles model selection based on your task and preferences
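Because the provider depends on a browser extension that may not be installed, calls can fail at runtime. One way to handle this is a small guard that falls back when generation throws; this is a sketch, not part of the package API - the generate function is injected so the snippet itself has no browser dependency (in real code you would pass a wrapper around `generateText` with the `webllm()` model):

```typescript
// Shape of an injected generate function, mirroring the { text } result
// that ai's generateText resolves to.
type GenerateFn = (prompt: string) => Promise<{ text: string }>;

// Returns the generated text, or null when WebLLM is unavailable
// (extension missing, or running outside a browser).
async function safeGenerate(
  generate: GenerateFn,
  prompt: string,
): Promise<string | null> {
  try {
    const { text } = await generate(prompt);
    return text;
  } catch {
    // Signal the caller to fall back, e.g. to a cloud provider.
    return null;
  }
}
```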
API Reference
`webllm(preferences?)`
Creates a Vercel AI SDK compatible model instance.
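Based on the options shown above, the optional `preferences` argument could be modeled as follows; the `WebLLMPreferences` name is illustrative, as the package does not document exported type names:

```typescript
// Illustrative shape of the optional preferences argument, inferred
// from the examples in this README; not an exported type.
interface WebLLMPreferences {
  task?:
    | 'general'
    | 'coding'
    | 'creative'
    | 'qa'
    | 'summarization'
    | 'translation';
  hints?: {
    speed?: 'fastest' | 'fast' | 'balanced' | 'quality';
    quality?: 'draft' | 'standard' | 'high' | 'best';
  };
}

// Example value matching the "With Preferences" snippet above.
const prefs: WebLLMPreferences = {
  task: 'coding',
  hints: { speed: 'fast', quality: 'high' },
};
```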
