
@sipemu/anofox-regression (v0.5.1)

WebAssembly bindings for anofox-regression, a comprehensive statistical regression library.

Features

Linear Models

  • OLS Regression - Ordinary Least Squares with full inference (standard errors, p-values, confidence intervals)
  • WLS Regression - Weighted Least Squares for heteroscedastic data
  • Ridge Regression - L2 regularization for handling multicollinearity
  • Elastic Net - Combined L1/L2 regularization (Lasso + Ridge)
  • BLS Regression - Bounded/Non-Negative Least Squares (Lawson-Hanson algorithm)
  • PLS Regression - Partial Least Squares (SIMPLS) for collinear data
  • RLS Regression - Recursive Least Squares for online learning
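
For intuition, one-predictor OLS has a simple closed form: the slope is cov(x, y) / var(x). A minimal, self-contained sketch (the `olsFit` helper is ours for illustration, not part of this package's API):

```typescript
// Closed-form simple linear regression: slope = cov(x, y) / var(x),
// intercept = mean(y) - slope * mean(x).
// Illustrative only; OlsRegressor solves the general multi-column case.
function olsFit(x: number[], y: number[]): { slope: number; intercept: number } {
  const n = x.length;
  const mx = x.reduce((a, b) => a + b, 0) / n;
  const my = y.reduce((a, b) => a + b, 0) / n;
  let sxy = 0;
  let sxx = 0;
  for (let i = 0; i < n; i++) {
    sxy += (x[i] - mx) * (y[i] - my);
    sxx += (x[i] - mx) ** 2;
  }
  const slope = sxy / sxx;
  return { slope, intercept: my - slope * mx };
}
```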

Quantile & Monotonic

  • Quantile Regression - Estimate conditional quantiles (median, quartiles, etc.)
  • Isotonic Regression - Monotonic regression using Pool Adjacent Violators Algorithm

Generalized Linear Models (GLM)

  • Poisson Regression - For count data (log/identity/sqrt link)
  • Binomial Regression - Logistic/Probit for binary outcomes
  • Negative Binomial - For overdispersed count data
  • Tweedie Regression - Flexible variance (Gamma, Compound Poisson-Gamma, etc.)

Augmented Linear Model (ALM)

  • ALM Regression - Maximum likelihood with various distributions (Normal, Laplace, Student-t, Gamma, etc.)

Time-Varying Models

  • LmDynamic - Dynamic Linear Model with time-varying parameters using information criteria weighting

Installation

npm install @sipemu/anofox-regression

Usage

Browser (ES Modules)

import init, {
  OlsRegressor, RidgeRegressor, QuantileRegressor,
  BlsRegressor, PlsRegressor, RlsRegressor, AlmRegressor, LmDynamicRegressor
} from '@sipemu/anofox-regression';

async function main() {
  // Initialize the WASM module
  await init();

  // Create sample data (row-major flat array)
  const x = new Float64Array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]);  // 5 rows, 2 cols
  const y = new Float64Array([2.1, 3.9, 6.2, 7.8, 10.1]);

  // Fit OLS regression
  const ols = new OlsRegressor();
  ols.setWithIntercept(true);
  ols.setComputeInference(true);

  const fitted = ols.fit(x, 5, 2, y);

  // Get results
  const result = fitted.getResult();
  console.log('R-squared:', result.rSquared);
  console.log('Coefficients:', result.coefficients);
  console.log('P-values:', result.pValues);

  // Make predictions
  const xNew = new Float64Array([11, 12]);
  const predictions = fitted.predict(xNew, 1);
  console.log('Prediction:', predictions);
}

main();

Node.js

import { readFile } from 'fs/promises';
import { initSync, OlsRegressor } from '@sipemu/anofox-regression';

// Load and initialize WASM synchronously
const wasmBuffer = await readFile('./node_modules/@sipemu/anofox-regression/anofox_regression_js_bg.wasm');
initSync(wasmBuffer);

// Use the library
const ols = new OlsRegressor();
// ...

API Reference

OlsRegressor

Ordinary Least Squares regression with full statistical inference.

class OlsRegressor {
  constructor();
  setWithIntercept(include: boolean): void;
  setComputeInference(compute: boolean): void;
  setConfidenceLevel(level: number): void;  // default: 0.95
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedOls;
}

class FittedOls {
  getResult(): OlsResult;
  getCoefficients(): Float64Array;
  getIntercept(): number | undefined;
  getRSquared(): number;
  predict(x: Float64Array, nRows: number): Float64Array;
}

RidgeRegressor

Ridge regression with L2 regularization.

class RidgeRegressor {
  constructor();
  setLambda(lambda: number): void;  // regularization strength
  setWithIntercept(include: boolean): void;
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedRidge;
}
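
To see what the L2 penalty does, the one-predictor, no-intercept ridge solution has a closed form; a hedged sketch (`ridge1d` is our name, not package API):

```typescript
// One-predictor ridge without intercept: beta = sum(x*y) / (sum(x^2) + lambda).
// lambda = 0 recovers OLS; larger lambda shrinks beta toward zero.
function ridge1d(x: number[], y: number[], lambda: number): number {
  let sxy = 0;
  let sxx = 0;
  for (let i = 0; i < x.length; i++) {
    sxy += x[i] * y[i];
    sxx += x[i] * x[i];
  }
  return sxy / (sxx + lambda);
}
```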

QuantileRegressor

Quantile regression for estimating conditional quantiles.

class QuantileRegressor {
  constructor();
  setTau(tau: number): void;  // quantile to estimate (0 < tau < 1)
  setWithIntercept(include: boolean): void;
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedQuantile;
}
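
Quantile regression minimizes the check (pinball) loss rather than squared error. A sketch of that loss for intuition (helper name is ours):

```typescript
// Check (pinball) loss at quantile tau: positive residuals cost tau per unit,
// negative residuals cost (1 - tau). tau = 0.5 gives median regression.
function pinball(residual: number, tau: number): number {
  return residual >= 0 ? tau * residual : (tau - 1) * residual;
}
```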

IsotonicRegressor

Isotonic (monotonic) regression.

class IsotonicRegressor {
  constructor();
  setIncreasing(increasing: boolean): void;
  setOutOfBounds(mode: 'clip' | 'nan' | 'extrapolate'): void;
  fit(x: Float64Array, y: Float64Array): FittedIsotonic;  // 1D data only
}
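
A minimal sketch of the Pool Adjacent Violators Algorithm named above (increasing case), to show the pooling idea; this is an illustration, not the package's internal implementation:

```typescript
// PAVA: scan left to right, merging adjacent blocks whenever a block's mean
// drops below its predecessor's, then expand each block back to its points.
function pava(y: number[]): number[] {
  const sums: number[] = [];
  const counts: number[] = [];
  for (const v of y) {
    sums.push(v);
    counts.push(1);
    // Merge while the last block violates monotonicity.
    while (sums.length > 1) {
      const k = sums.length;
      if (sums[k - 1] / counts[k - 1] < sums[k - 2] / counts[k - 2]) {
        sums[k - 2] += sums.pop()!;
        counts[k - 2] += counts.pop()!;
      } else break;
    }
  }
  const out: number[] = [];
  for (let i = 0; i < sums.length; i++) {
    for (let j = 0; j < counts[i]; j++) out.push(sums[i] / counts[i]);
  }
  return out;
}
```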

WlsRegressor

Weighted Least Squares regression.

class WlsRegressor {
  constructor();
  setWeights(weights: Float64Array): void;
  setWithIntercept(include: boolean): void;
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedWls;
}
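
For intuition, the one-predictor, no-intercept WLS estimate is a weight-adjusted version of the OLS formula; a hedged sketch (`wls1d` is our name):

```typescript
// One-predictor WLS without intercept: beta = sum(w*x*y) / sum(w*x^2).
// Observations with larger weights pull the fit toward themselves;
// equal weights recover OLS.
function wls1d(x: number[], y: number[], w: number[]): number {
  let num = 0;
  let den = 0;
  for (let i = 0; i < x.length; i++) {
    num += w[i] * x[i] * y[i];
    den += w[i] * x[i] * x[i];
  }
  return num / den;
}
```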

ElasticNetRegressor

Elastic Net with L1+L2 regularization.

class ElasticNetRegressor {
  constructor();
  setLambda(lambda: number): void;  // regularization strength
  setAlpha(alpha: number): void;    // L1/L2 mix (0=Ridge, 1=Lasso)
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedElasticNet;
}
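
The L1 part of the elastic net penalty is what zeroes out coefficients. The mechanism is the soft-thresholding operator used by coordinate-descent solvers; a sketch for intuition (not the package's internal code):

```typescript
// Soft-thresholding: shrink z toward zero by gamma, clipping to exactly zero
// inside [-gamma, gamma]. This is why lasso-style penalties produce sparsity.
function softThreshold(z: number, gamma: number): number {
  if (z > gamma) return z - gamma;
  if (z < -gamma) return z + gamma;
  return 0;
}
```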

PoissonRegressor

Poisson GLM for count data.

class PoissonRegressor {
  constructor();
  setLink(link: 'log' | 'identity' | 'sqrt'): void;
  setWithIntercept(include: boolean): void;
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedPoisson;
}
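
A useful sanity check when fitting Poisson models: for an intercept-only model with log link, the maximum-likelihood intercept is exactly log(mean(y)). Sketch (helper name is ours):

```typescript
// Intercept-only Poisson GLM with log link: the MLE of the intercept
// is log of the sample mean of the counts.
function poissonInterceptMle(y: number[]): number {
  const mean = y.reduce((a, b) => a + b, 0) / y.length;
  return Math.log(mean);
}
```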

BinomialRegressor

Logistic/Probit regression for binary outcomes.

class BinomialRegressor {
  constructor();
  setLink(link: 'logit' | 'probit' | 'cloglog'): void;
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedBinomial;
}
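
The logit link and its inverse (the logistic sigmoid) are the mapping behind logistic regression: log-odds are modeled linearly, then mapped back to probabilities. A minimal sketch:

```typescript
// logit maps a probability in (0, 1) to the real line; sigmoid inverts it.
const logit = (p: number): number => Math.log(p / (1 - p));
const sigmoid = (z: number): number => 1 / (1 + Math.exp(-z));
```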

NegativeBinomialRegressor

For overdispersed count data.

class NegativeBinomialRegressor {
  constructor();
  setTheta(theta: number): void;      // fixed dispersion
  setEstimateTheta(estimate: boolean): void;  // estimate from data
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedNegativeBinomial;
}

TweedieRegressor

Flexible GLM with Tweedie variance function.

class TweedieRegressor {
  constructor();
  static gamma(): TweedieRegressor;  // Gamma regression
  setVarPower(p: number): void;  // 0=Gaussian, 1=Poisson, 2=Gamma, 3=InvGauss
  setLinkPower(p: number): void; // 0=log, 1=identity
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedTweedie;
}
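
The variance power referenced in setVarPower is the p in the Tweedie variance function V(mu) = mu^p, which is what selects the family; a one-line sketch:

```typescript
// Tweedie variance function: V(mu) = mu^p.
// p = 0 Gaussian, p = 1 Poisson, p = 2 Gamma, p = 3 inverse Gaussian;
// 1 < p < 2 gives the compound Poisson-Gamma family.
const tweedieVariance = (mu: number, p: number): number => Math.pow(mu, p);
```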

BlsRegressor

Bounded/Non-Negative Least Squares using Lawson-Hanson algorithm.

class BlsRegressor {
  constructor();
  static nnls(): BlsRegressor;  // Non-negative least squares
  setWithIntercept(include: boolean): void;
  setLowerBoundAll(bound: number): void;  // Same lower bound for all coefficients
  setUpperBoundAll(bound: number): void;  // Same upper bound for all coefficients
  setLowerBounds(bounds: Float64Array): void;  // Per-variable lower bounds
  setUpperBounds(bounds: Float64Array): void;  // Per-variable upper bounds
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedBls;
}

PlsRegressor

Partial Least Squares using SIMPLS algorithm.

class PlsRegressor {
  constructor();
  setNComponents(n: number): void;  // Number of latent components
  setWithIntercept(include: boolean): void;
  setScale(scale: boolean): void;  // Scale X to unit variance
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedPls;
}

class FittedPls {
  getResult(): PlsResult;
  getNComponents(): number;
  transform(x: Float64Array, nRows: number): Float64Array;  // Project to latent space
  predict(x: Float64Array, nRows: number): Float64Array;
}

RlsRegressor

Recursive Least Squares for online learning.

class RlsRegressor {
  constructor();
  setWithIntercept(include: boolean): void;
  setForgettingFactor(lambda: number): void;  // 1.0 = standard RLS, <1 = weight recent data
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedRls;
}

class FittedRls {
  getResult(): RlsResult;
  getForgettingFactor(): number;
  predict(x: Float64Array, nRows: number): Float64Array;
}
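
To make the online-learning idea concrete, here is a scalar (one-coefficient, no-intercept) RLS recursion, sketched under the standard textbook update; it is an illustration, not this package's implementation. With a forgetting factor of 1 and a large initial P it converges to the batch least-squares estimate:

```typescript
// Scalar recursive least squares, processing one observation at a time.
function rls1d(x: number[], y: number[], forgettingFactor = 1.0, p0 = 1e6): number {
  let theta = 0;
  let P = p0; // large initial P acts as an uninformative prior
  for (let i = 0; i < x.length; i++) {
    const K = (P * x[i]) / (forgettingFactor + x[i] * P * x[i]); // gain
    theta += K * (y[i] - x[i] * theta); // correct by the prediction error
    P = (P - K * x[i] * P) / forgettingFactor; // discount old information
  }
  return theta;
}
```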

AlmRegressor

Augmented Linear Model with various error distributions.

class AlmRegressor {
  constructor();
  setDistribution(dist: string): void;  // 'normal', 'laplace', 'student_t', 'gamma', etc.
  setWithIntercept(include: boolean): void;
  setComputeInference(compute: boolean): void;
  setMaxIterations(maxIter: number): void;
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedAlm;
}

Supported distributions: normal, laplace, student_t, logistic, asymmetric_laplace, generalised_normal, log_normal, log_laplace, gamma, inverse_gaussian, exponential, poisson, negative_binomial, beta, folded_normal, s.

LmDynamicRegressor

Dynamic Linear Model with time-varying parameters.

class LmDynamicRegressor {
  constructor();
  setIc(ic: 'aic' | 'aicc' | 'bic'): void;  // Information criterion
  setWithIntercept(include: boolean): void;
  setLowessSpan(span: number | null): void;  // LOWESS smoothing (null to disable)
  setMaxModels(max: number): void;  // Limit candidate models
  fit(x: Float64Array, nRows: number, nCols: number, y: Float64Array): FittedLmDynamic;
}

class FittedLmDynamic {
  getResult(): LmDynamicResult;
  getDynamicCoefficients(): Float64Array;  // Time-varying coefficients (row-major)
  getDynamicCoefficientsRows(): number;
  getDynamicCoefficientsCols(): number;
  predict(x: Float64Array, nRows: number): Float64Array;
}

Data Format

All matrix data is passed as a flat Float64Array in row-major order:

// For a 3x2 matrix:
// [[1, 2],
//  [3, 4],
//  [5, 6]]
const x = new Float64Array([1, 2, 3, 4, 5, 6]);
const nRows = 3;
const nCols = 2;
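
A hypothetical indexing helper (not part of the package API) showing how element (row, col) is located in the flat row-major layout:

```typescript
// Row-major indexing: element (row, col) lives at offset row * nCols + col.
function at(x: Float64Array, nCols: number, row: number, col: number): number {
  return x[row * nCols + col];
}

const m = new Float64Array([1, 2, 3, 4, 5, 6]); // the 3x2 matrix above
```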

Development

Running Tests

The package includes a comprehensive test suite using Vitest.

# Install dependencies
npm install

# Run tests
npm test

# Run tests in watch mode
npm run test:watch

# Run tests with coverage
npm run test:coverage

Building from Source

To build the WASM package from the Rust source:

# Install wasm-pack
curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh

# Build the package
wasm-pack build crates/anofox-regression-js --target web --out-dir pkg

License

MIT