
wasp

v1.0.11

UOMI Agent development environment setup tool

🚀 WASP

A powerful development environment for creating UOMI agents using WebAssembly and Rust 🦀

📖 Overview

This development environment allows you to create, test, and debug UOMI agents using WebAssembly (WASM) and Rust. The environment provides seamless integration with both UOMI and third-party LLM services, supporting multiple model configurations and API formats.

🌟 Features

  • 🔄 Hot-reloading development environment
  • 📝 Interactive console for testing
  • 🐛 Built-in debugging capabilities
  • 🔍 Response analysis tools
  • 💾 Conversation history management
  • 🔌 Support for multiple LLM providers
  • 🔑 Secure API key management
  • 📊 Performance metrics tracking

🛠 Prerequisites

Before you begin, ensure you have the following installed:

  • Rust (latest stable version)
  • Node.js (v14 or higher)
  • WebAssembly target: rustup target add wasm32-unknown-unknown

🚀 Getting Started

Option 1: Quick Start with NPX

# Create a new UOMI agent project
npx wasp create

Option 2: Manual Setup

# Clone the agent template and enter the agent directory
git clone https://github.com/Uomi-network/uomi-chat-agent-template.git
cd uomi-chat-agent-template/agent

# Install Node.js dependencies
npm install

# Make the build/run script executable, then start the environment
chmod +x ./bin/build_and_run_host.sh
npm start

🔧 Configuration

Model Configuration

The environment supports multiple model configurations through uomi.config.json:

{
  "local_file_path": "path/to/input.txt",
  "api": {
    "timeout_ms": 30000,
    "retry_attempts": 3,
    "headers": {
      "Content-Type": "application/json",
      "Accept": "application/json",
      "User-Agent": "UOMI-Client/1.0"
    }
  },
  "models": {
    "1": {
      "name": "Qwen/Qwen2.5-32B-Instruct-GPTQ-Int4"
    },
    "2": {
      "name": "gpt-3.5-turbo",
      "url": "https://api.openai.com/v1/chat/completions",
      "api_key": "your-api-key-here"
    }
  },
  "ipfs": {
    "gateway": "https://ipfs.io/ipfs",
    "timeout_ms": 10000
  }
}

You can run the node-ai service yourself by following the instructions in the node-ai repository. In that case you don't need to specify a url or api_key in the models configuration; you will be running the production version of the node-ai service.

If you don't have enough resources to run the node-ai service, you can use a third-party provider such as OpenAI instead. In that case, specify the url and the api_key in the models configuration.
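The timeout_ms and retry_attempts fields in the api block imply a retry-with-timeout loop around each API call. A minimal sketch of that pattern in Node.js (withRetry is an illustrative name, not part of the wasp API):

```javascript
// Retry an async operation up to `attempts` times, treating any single
// try as failed once `timeoutMs` elapses. Illustrative sketch only; the
// real wasp host may apply api.retry_attempts / api.timeout_ms differently.
async function withRetry(operation, { attempts = 3, timeoutMs = 30000 } = {}) {
  let lastError;
  for (let attempt = 1; attempt <= attempts; attempt += 1) {
    const timeout = new Promise((_, reject) => {
      const timer = setTimeout(() => reject(new Error('timeout')), timeoutMs);
      if (timer.unref) timer.unref(); // don't keep the process alive for the timer
    });
    timeout.catch(() => {}); // ignore the timeout if the operation settles first
    try {
      return await Promise.race([operation(), timeout]);
    } catch (err) {
      lastError = err; // remember the failure and try the next attempt
    }
  }
  throw lastError;
}
```

Wiring this around an actual request to the configured model url is omitted; the sketch only shows how the two settings interact.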

Response Formats

The environment automatically handles different response formats:

UOMI Format

{
  "response": "Hello, how can I help?",
  "time_taken": 1.23,
  "tokens_per_second": 45,
  "total_tokens_generated": 54
}

OpenAI Format

{
  "choices": [{
    "message": {
      "content": "Hello, how can I help?"
    }
  }],
  "usage": {
    "total_tokens": 150,
    "prompt_tokens": 50,
    "completion_tokens": 100
  }
}
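Because the reply text is nested differently in the two formats, any host that supports both needs a small normalization step. A sketch of what that might look like (normalizeResponse is an illustrative name, not the actual wasp implementation):

```javascript
// Normalize a UOMI- or OpenAI-style response body into { text, totalTokens }.
// Illustrative sketch; field names follow the two sample payloads above.
function normalizeResponse(body) {
  if (typeof body.response === 'string') {
    // UOMI format: reply text and metrics sit at the top level
    return { text: body.response, totalTokens: body.total_tokens_generated };
  }
  if (Array.isArray(body.choices)) {
    // OpenAI format: reply is nested under choices[0].message
    return {
      text: body.choices[0].message.content,
      totalTokens: body.usage.total_tokens,
    };
  }
  throw new Error('unrecognized response format');
}
```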

💡 Usage Examples

Interactive Mode

$ npm start
UOMI Development Environment
Type your messages. Use these commands:
/clear - Clear conversation history
/history - Show conversation history
/exit - Exit the program

You: Hello, how are you?
Assistant: Hello! I'm doing well, thank you for asking...

Performance Metrics:
- Time taken: 1.20s
- Tokens/second: 45
- Total tokens: 54

Development

Custom Model Integration

Add a new model entry in uomi.config.json:

{
  "models": {
    "3": {
      "name": "custom-model",
      "url": "https://api.custom-provider.com/v1/chat",
      "api_key": "your-api-key"
    }
  }
}

📊 Performance Monitoring

The environment provides detailed performance metrics:

  • Response time tracking
  • Token usage statistics
  • Rate limiting information
  • Error tracking and retry statistics
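The tokens-per-second figure is consistent with total tokens divided by elapsed time (54 tokens / 1.20 s = 45 in the interactive example above), though the tool's exact calculation isn't documented:

```javascript
// Derive tokens/second from the metrics reported in a UOMI-format response.
// Illustrative sketch of the apparent relationship, not wasp's own code.
function tokensPerSecond(totalTokens, timeTakenSeconds) {
  return Math.round(totalTokens / timeTakenSeconds);
}
```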

🔐 Security

  • API keys are stored securely in configuration files
  • Support for environment variable substitution
  • Automatic header management for authentication
  • Secure HTTPS communication
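The environment variable substitution syntax isn't documented in this README. A common convention, shown here purely as a generic sketch rather than wasp's actual behavior, is to expand ${VAR} placeholders before the config is parsed:

```javascript
// Replace ${VAR} placeholders in a config string with values from `env`.
// Generic sketch; wasp's actual substitution syntax may differ.
function substituteEnv(text, env = process.env) {
  return text.replace(/\$\{([A-Z0-9_]+)\}/g, (match, name) =>
    name in env ? env[name] : match // leave unknown placeholders untouched
  );
}
```

With this pattern, a config entry like "api_key": "${OPENAI_API_KEY}" would pull the key from the environment instead of committing it to the file.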

🐛 Debugging

Built-in debugging features:

  • Detailed WASM logging
  • Request/response inspection
  • Performance profiling
  • Error tracing with retry information

📚 API Reference

Host Functions

| Function | Description |
|----------|-------------|
| get_input() | Read input data |
| set_output() | Set output data |
| call_service_api() | Make API calls with retry support |
| get_file_from_cid() | Fetch IPFS content |
| log() | Debug logging |

Compiled WASM

After a test run, the compiled WASM file is located at host/src/agent_template.wasm.

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.