@axols/webai-js

v1.0.0

A free, open source library to embed powerful AI models directly into your web applications. Run AI in the front-end with complete data privacy and zero hosting costs. Compatible with major front-end frameworks like React, Next.js, Vue, etc.

@axols/webai-js

Run AI models directly in your users' browsers with zero server-side infrastructure.

📖 Documentation | 🎮 Playground | 🤖 Models Hub | 👨🏻‍💻 Discord Community

🚀 Overview

Axols WebAI.js is an open-source library that enables client-side AI inference directly in the browser. Built on top of Transformers.js, WebGPU, and ONNX Runtime, it eliminates the need for server-side AI model hosting and inference infrastructure.

Key Features

  • 🏪 ONNX Model Hub: Access a curated pool of browser-optimized ONNX AI models
  • 🌐 Pure Client-Side: Run AI models entirely in the browser
  • 🔒 Privacy-First: Data never leaves the user's device
  • 📦 Zero Backend Costs: No server infrastructure needed
  • 🚀 Easy Setup: No dependency headaches, just a single package installation
  • 🎯 Standardized API: Same interface across all models
  • 🔄 Streaming Support: Real-time generation with streaming
  • 🛠️ Framework Compatible: Works with React, Vue, Angular, Next.js, and more

📦 Installation

npm install @axols/webai-js

🎯 Quick Start

All our Web AI models are standardized around the same three-step API (create, initialize, generate), with a final cleanup call when you're done:

import { WebAI } from '@axols/webai-js';

// Step 1: Create a WebAI instance
const webai = await WebAI.create({
  modelId: "llama-3.2-1b-instruct"
});

// Step 2: Initialize (downloads and loads the model)
await webai.init({
  mode: "auto", // Automatically selects best configuration
  onDownloadProgress: (progress) => {
    console.log(`Download progress: ${progress.progress}%`);
  }
});

// Step 3: Generate
const result = await webai.generate({
  userInput: {
    messages: [
      {
        role: "user",
        content: "What is the history of AI?"
      }
    ]
  }
});

console.log(result);

// Step 4: Clean up when done
webai.terminate();

📖 Core Concepts

Model Lifecycle

  1. Create: Instantiate a WebAI object with a model ID
  2. Initialize: Download (if needed) and load the model into memory
  3. Generate: Run inference on user input
  4. Terminate: Clean up resources when finished
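
The Quick Start above walks through these steps in a plain script. Because the library is advertised as framework compatible, here is a minimal React sketch of the same lifecycle: the model is created and initialized when a component mounts and terminated when it unmounts. The useWebAI hook name and structure are illustrative, not part of the library.

import { useEffect, useRef } from 'react';
import { WebAI } from '@axols/webai-js';

// Illustrative hook: create and initialize a model on mount, terminate it on unmount.
export function useWebAI(modelId) {
  const webaiRef = useRef(null);

  useEffect(() => {
    let cancelled = false;

    (async () => {
      try {
        const webai = await WebAI.create({ modelId }); // Step 1: create
        await webai.init({ mode: "auto" });            // Step 2: initialize
        if (cancelled) {
          webai.terminate(); // Component unmounted while loading; release immediately
        } else {
          webaiRef.current = webai; // Ready for Step 3: generate
        }
      } catch (error) {
        console.error("Failed to load model:", error);
      }
    })();

    return () => {
      cancelled = true;
      webaiRef.current?.terminate(); // Step 4: clean up
      webaiRef.current = null;
    };
  }, [modelId]);

  return webaiRef;
}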

Auto Mode

Let WebAI automatically determine the best configuration based on device capabilities:

await webai.init({
  mode: "auto",
  onDownloadProgress: (progress) => console.log(progress)
});

Custom Priorities

Control fallback behavior with custom priority configurations:

await webai.init({
  mode: "auto",
  priorities: [
    { mode: "webai", precision: "q4", device: "webgpu" },
    { mode: "webai", precision: "q8", device: "webgpu" },
    { mode: "webai", precision: "q4", device: "wasm" },
    { mode: "cloud", precision: "", device: "" }
  ]
});

🔄 Streaming Generation

Models that support streaming can deliver results in real time:

const generation = await webai.generateStream({
  userInput: "Tell me a story",
  onStream: (chunk) => {
    console.log(chunk); // Process each chunk as it arrives
  }
});

📚 For detailed API reference and usage examples, see the model-specific documentation

💡 Best Practices

  • ✅ Always wrap WebAI calls in try/catch blocks (a combined sketch follows this list)
  • ✅ Implement progress indicators during downloads
  • ✅ Terminate instances when no longer needed
  • ✅ Monitor device storage and memory usage
  • ✅ Use streaming for better UX with long generations
  • ✅ Test on target devices for performance validation
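
A minimal sketch that combines several of these practices, using only the calls shown earlier in this README (the runPrompt helper name and model ID are illustrative):

import { WebAI } from '@axols/webai-js';

// Illustrative helper: handles errors, reports download progress,
// and always releases the model, even when generation fails.
async function runPrompt(prompt) {
  let webai;
  try {
    webai = await WebAI.create({ modelId: "llama-3.2-1b-instruct" });

    await webai.init({
      mode: "auto",
      onDownloadProgress: (progress) => {
        console.log(`Downloading model: ${progress.progress}%`);
      }
    });

    return await webai.generate({
      userInput: {
        messages: [{ role: "user", content: prompt }]
      }
    });
  } catch (error) {
    // Downloads and inference can fail on low-storage or low-memory devices.
    console.error("WebAI error:", error);
    return null;
  } finally {
    // Release resources whether or not generation succeeded.
    webai?.terminate();
  }
}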

🌍 Browser Compatibility

WebAI.js works in any modern browser that supports the following; a simple capability check is sketched after the list:

  • WebAssembly
  • Web Workers
  • WebGPU (recommended for best performance)
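
A rough capability check along these lines can run before any model is downloaded; it uses only standard web APIs and is not part of @axols/webai-js:

// Standard feature detection; the function name is illustrative.
function checkBrowserSupport() {
  const hasWasm = typeof WebAssembly === "object";
  const hasWorkers = typeof Worker === "function";
  // WebGPU is optional but recommended for best performance.
  const hasWebGPU = typeof navigator !== "undefined" && "gpu" in navigator;
  return { hasWasm, hasWorkers, hasWebGPU };
}

const support = checkBrowserSupport();
if (!support.hasWasm || !support.hasWorkers) {
  console.warn("This browser cannot run client-side models.");
} else if (!support.hasWebGPU) {
  console.warn("WebGPU unavailable; expect slower WASM-based inference.");
}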

👥 Community

Join our growing community of developers building with WebAI.js!

  • 💬 Discord - Get help, share projects, understand AI trends, and help shape the future of Web AI
  • 💡 Discussions - Share ideas and feature requests

🤝 Contributing

We welcome contributions! You're invited to add more AI models to our platform and contribute to the library.

👨🏻‍💻 We are currently working on our Contributing Guide. In the meantime, feel free to join our Discord to discuss how you can contribute!

🐛 For model-specific issues, bugs, or feature requests, please visit the model issues page.

📄 License

Apache 2.0 - see LICENSE file for details

🙏 Acknowledgments

Built with Transformers.js, ONNX Runtime, and WebGPU.

Made with ❤️ by Peng Zhang