@unitypredict/unitypredict-client

v1.0.0

UnityPredict JavaScript Client

A Node.js client library for making predictions using the UnityPredict API. This client handles both simple predictions and complex workflows involving file uploads, with configurable timeouts and exponential backoff for efficient polling of long-running predictions.

Installation

npm install unitypredictjsclient

Or, to install from a local copy of the package:

npm install ./UnityPredictJSClient

Usage

Basic Setup

const { UnityPredictClient, PredictionRequest, LocalFile } = require('unitypredictjsclient');

// Initialize the client with your API key
const client = new UnityPredictClient('your-api-key-here');

// Initialize with verbose logging enabled
const verboseClient = new UnityPredictClient('your-api-key-here', { verboseLog: true });

Simple Prediction (No File Inputs)

const request = new PredictionRequest({
    inputValues: {
        text: 'Hello world',
        temperature: 0.7
    },
    desiredOutcomes: ['output']
});

const result = await client.predict('model-id', request);
console.log('Results:', result.outcomes);

Prediction with File Upload

const localFile = new LocalFile('/path/to/audio.mp3');
const fileRequest = new PredictionRequest({
    inputValues: {
        'Audio File': localFile
    },
    desiredOutcomes: ['Transcription', 'Language']
});

// Download output files to a folder
const fileResult = await client.predict('model-id', fileRequest, '/output/folder');
console.log('Transcription:', fileResult.outcomes.Transcription);

Asynchronous Predictions

For long-running predictions, you can use the async methods:

// Start an asynchronous prediction
const asyncRequest = new PredictionRequest({
    inputValues: {
        prompt: 'Write a long story about a robot',
        max_tokens: 2000
    },
    desiredOutcomes: ['generated_text']
});

const asyncResponse = await client.asyncPredict('model-id', asyncRequest);

if (asyncResponse.status === 'Processing') {
    // Check status later
    const finalResult = await client.getRequestStatus(asyncResponse.requestId);
    console.log('Final results:', finalResult.outcomes);
}

Check Request Status

const status = await client.getRequestStatus('request-id-123');

switch (status.status?.toLowerCase()) {
    case 'processing':
        console.log('Prediction is still processing...');
        break;
    case 'completed':
        console.log('Prediction completed successfully!');
        console.log(`Results: ${Object.keys(status.outcomes || {}).join(', ')}`);
        console.log(`Compute cost: ${status.computeCost}`);
        break;
    case 'error':
        console.log(`Prediction failed: ${status.errorMessages}`);
        break;
}

Long-Running Predictions with Custom Timeout

const longRequest = new PredictionRequest({
    inputValues: {
        prompt: 'Generate a very long story',
        max_tokens: 5000
    },
    desiredOutcomes: ['generated_text']
});

// 10-minute polling timeout (600 seconds)
const longResult = await client.predict('text-model-id', longRequest, null, 600);

API Reference

UnityPredictClient

Constructor

new UnityPredictClient(apiKey, options)
  • apiKey (string, required): Your UnityPredict API key
  • options (object, optional):
    • verboseLog (boolean): Enable verbose logging (default: false)
    • baseUrl (string): Custom base URL for the API (default: production URL)
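
For example, the baseUrl option can point the client at a non-default API endpoint (a minimal sketch; the URL shown is a placeholder, not a documented UnityPredict endpoint):

const customClient = new UnityPredictClient('your-api-key-here', {
    verboseLog: true,
    baseUrl: 'https://your-custom-endpoint.example.com' // placeholder, replace with your actual endpoint
});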

Methods

predict(modelId, request, outputFolderPath, pollingTimeoutSeconds)

Sends a synchronous prediction request. Automatically handles polling for long-running predictions.

  • modelId (string): The model ID to use
  • request (PredictionRequest): The prediction request
  • outputFolderPath (string, optional): Path to download output files
  • pollingTimeoutSeconds (number, optional): Timeout for polling (default: 900 seconds)

Returns: Promise<PredictionResponse>

asyncPredict(modelId, request, outputFolderPath)

Initiates an asynchronous prediction. Returns immediately with a request ID.

  • modelId (string): The model ID to use
  • request (PredictionRequest): The prediction request
  • outputFolderPath (string, optional): Path to download output files

Returns: Promise<PredictionResponse>

getRequestStatus(requestId, outputFolderPath)

Retrieves the status of an asynchronous prediction request.

  • requestId (string): The request ID to check
  • outputFolderPath (string, optional): Path to download output files

Returns: Promise<PredictionResponse>
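
If a prediction produces file outputs, the optional outputFolderPath can be passed while polling, mirroring the predict example above (a brief sketch reusing the placeholder request ID from the earlier example):

const fileStatus = await client.getRequestStatus('request-id-123', '/output/folder');
// Any output files are downloaded to /output/folder once the request has completed
console.log('Status:', fileStatus.status);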

PredictionRequest

new PredictionRequest({
    inputValues: { /* your inputs */ },
    desiredOutcomes: ['outcome1', 'outcome2'],
    contextId: 'optional-context-id',
    requestId: 'optional-request-id' // Used internally
})

LocalFile

Wrapper for local files to be uploaded:

const file = new LocalFile('/path/to/file.ext');

PredictionResponse

Response object containing:

  • requestId (string): Request tracking ID
  • status (string): Status ('Processing', 'Completed', 'Error')
  • outcomes (object): Prediction results
  • computeCost (number): Compute cost
  • computeTime (string): Compute time
  • errorMessages (string): Error messages if any

UnityPredictException

Custom exception thrown when API operations fail.
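
A minimal error-handling sketch, assuming UnityPredictException is exported from the package alongside UnityPredictClient (that export is not shown in the examples above); it reuses the client and request objects from the earlier setup:

const { UnityPredictClient, UnityPredictException } = require('unitypredictjsclient');

try {
    const result = await client.predict('model-id', request);
    console.log('Results:', result.outcomes);
} catch (err) {
    if (err instanceof UnityPredictException) {
        // API-level failure reported by the client
        console.error('UnityPredict error:', err.message);
    } else {
        throw err; // unrelated error, rethrow
    }
}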

Environment Configuration

The client automatically uses the development environment when NODE_ENV is set to development or dev:

NODE_ENV=development node your-script.js

Features

  • ✅ Synchronous and asynchronous prediction support
  • ✅ Automatic file upload handling
  • ✅ Exponential backoff polling for long-running predictions
  • ✅ Automatic file download for output files
  • ✅ Comprehensive error handling
  • ✅ Verbose logging option
  • ✅ Environment-based configuration (dev/prod)

Requirements

  • Node.js >= 12.0.0

License

ISC