
@thi.ng/tensors

v0.10.9

[!NOTE] This is one of 211 standalone projects, maintained as part of the @thi.ng/umbrella monorepo and anti-framework.

🚀 Please help me to work full-time on these projects by sponsoring me on GitHub. Thank you! ❤️

About

0D/1D/2D/3D/4D tensors with extensible polymorphic operations and customizable storage.

[!NOTE] This package contains code originally written in 2017/18 and has been refactored to be stylistically more aligned with other thi.ng packages.

Built-in tensor operations

The ITensor interface shared by all tensor implementations provides common accessors, iteration and view operations (see the generated API docs for the full list).

The set of polymorphic component-wise tensor ops is easily extensible via the provided higher-order functions of the defOpXX() family; most of the ops listed below are built this way. Function signatures and naming conventions closely follow those of the thi.ng/vectors package.
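As an illustration of this extension mechanism, here is a minimal plain-TypeScript sketch of such a higher-order op factory. The names and signatures below are hypothetical, NOT the actual defOpXX() API; they only demonstrate the idea of lifting a scalar function into tensor-tensor and tensor-scalar (`N`-suffixed) variants:

```typescript
// Hypothetical sketch (NOT the actual defOpXX() signatures): lift a
// scalar function into component-wise ops over flat arrays.
type ScalarOp = (a: number, b: number) => number;

const defOp = (fn: ScalarOp) => ({
    // tensor-tensor variant: apply fn pairwise (null output = mutate a)
    op: (out: number[] | null, a: number[], b: number[]) => {
        const res = out ?? a;
        for (let i = 0; i < a.length; i++) res[i] = fn(a[i], b[i]);
        return res;
    },
    // tensor-scalar variant (the `N`-suffixed flavor)
    opN: (out: number[] | null, a: number[], n: number) => {
        const res = out ?? a;
        for (let i = 0; i < a.length; i++) res[i] = fn(a[i], n);
        return res;
    },
});

const { op: add, opN: addN } = defOp((a, b) => a + b);

console.log(add([], [1, 2, 3], [10, 20, 30])); // [ 11, 22, 33 ]
console.log(addN([], [1, 2, 3], 100)); // [ 101, 102, 103 ]
```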

  • abs: Componentwise Math.abs
  • add: Tensor-tensor addition
  • addN: Tensor-scalar addition
  • argMax: Maximum component index/value
  • argMin: Minimum component index/value
  • clamp: Tensor-tensor interval clamping
  • clampN: Tensor-scalar interval clamping
  • convolve: Tensor convolution (1D/2D/3D only)
  • cos: Componentwise Math.cos
  • diagonal: Diagonal extraction
  • div: Tensor-tensor division
  • divN: Tensor-scalar division
  • dot: Dot product
  • exp: Componentwise Math.exp
  • exp2: Componentwise 2^x
  • identity: Square identity matrix tensor
  • integrate: Integrate tensor along innermost dimension
  • log: Componentwise Math.log
  • log2: Componentwise Math.log2
  • mag: Tensor magnitude
  • magSq: Tensor squared magnitude
  • max: Tensor-tensor maximum
  • maxN: Tensor-scalar maximum
  • mean: Tensor mean value
  • min: Tensor-tensor minimum
  • minN: Tensor-scalar minimum
  • mul: Tensor-tensor multiplication
  • mulN: Tensor-scalar multiplication
  • mulM: Matrix-matrix product
  • mulV: Matrix-vector product
  • negativeIndices: Indices of negative component values
  • nonZeroIndices: Indices of non-zero component values
  • normalize: Tensor normalization (w/ optional length)
  • ones: One-filled tensor creation
  • positiveIndices: Indices of positive component values
  • pow: Tensor-tensor Math.pow
  • powN: Tensor-scalar Math.pow
  • print: Formatted tensor output
  • product: Component product
  • randDistrib: Fill with random data from distribution fn
  • range: Create 1D tensor of monotonically increasing/decreasing values
  • relu: ReLU activation
  • reluN: Leaky ReLU activation
  • select: Generalization of argMin/Max
  • set: Tensor setter
  • setN: Tensor setter w/ uniform scalar
  • sigmoid: Sigmoid activation
  • sin: Componentwise Math.sin
  • smoothStep: Smooth threshold function (as in GLSL smoothstep())
  • smoothStepN: Smooth threshold function (as in GLSL smoothstep())
  • softMax: Soft Max activation
  • sqrt: Componentwise Math.sqrt
  • step: Threshold function (as in GLSL step())
  • stepN: Threshold function (as in GLSL step())
  • sub: Tensor-tensor subtraction
  • subN: Tensor-scalar subtraction
  • sum: Component sum
  • svd: Singular value decomposition
  • swap: Swap tensor values
  • tan: Componentwise Math.tan
  • tanh: Componentwise Math.tanh
  • trace: Matrix trace (diagonal component sum)
  • zeroes: Zero-filled tensor creation
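To make one of these ops concrete: softMax (listed above) reduces to the following computation on a flat array of components. This is an illustration only, not the library call:

```typescript
// What a softmax computes on a flat array of components. Subtracting
// the max first is the standard numerical-stability trick; it doesn't
// change the result.
const softMax = (xs: number[]) => {
    const max = Math.max(...xs);
    const exps = xs.map((x) => Math.exp(x - max));
    const sum = exps.reduce((acc, x) => acc + x, 0);
    return exps.map((x) => x / sum);
};

const res = softMax([1, 2, 3]);
// components are positive, sum to 1, and preserve the input ordering
console.log(res);
```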

Broadcasting support

Most of the built-in functions taking two or more tensors as input support broadcasting, i.e. the shapes of the individual arguments only need to be compatible, not identical. The operators attempt to adjust the tensors' shape & stride configurations to be compatible, applying the steps and rules below:

  • If the dimensionalities are unequal, the smaller tensor's number of dimensions will be increased as needed. The size of each added dimension is set to 1 and its stride to zero.
  • The sizes of each dimension are compared and only the following cases are accepted (otherwise an error is thrown): the sizes are equal, or one of them is 1.
  • Any tensor requiring shape adjustments will be shallow copied with the new shape/stride config applied.
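Assuming missing dimensions are added on the left (NumPy-style; the rules above don't spell out the padding side), the shape-compatibility logic can be sketched in plain TypeScript:

```typescript
// Sketch of the broadcasting rules applied to plain shape arrays
// (the actual implementation also adjusts strides, setting the
// stride of each size-1 dimension to zero).
const broadcastShape = (a: number[], b: number[]) => {
    // rule 1: left-pad the smaller shape with size-1 dims (assumption)
    while (a.length < b.length) a = [1, ...a];
    while (b.length < a.length) b = [1, ...b];
    // rule 2: sizes must be equal, or one of them must be 1
    return a.map((sa, i) => {
        const sb = b[i];
        if (sa !== sb && sa !== 1 && sb !== 1)
            throw new Error(`incompatible shapes @ dim ${i}: ${sa} vs ${sb}`);
        return Math.max(sa, sb);
    });
};

console.log(broadcastShape([2, 2], [2])); // [ 2, 2 ]
console.log(broadcastShape([2, 1], [1, 3])); // [ 2, 3 ]
```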

Some examples:

import { add, sub, print, tensor } from "@thi.ng/tensors";

// 2D + 1D
print(add(null, tensor([[1,2], [3,4]]), tensor([10, 20])));
//   11.0000   22.0000
//   13.0000   24.0000

// 2D + 1D (as column vector)
print(add(null, tensor([[1, 2], [3, 4]]), tensor([10, 20]).reshape([2,1])));
//   11.0000   12.0000
//   23.0000   24.0000

// 1D - 2D
print(sub(null, tensor([10, 20]), tensor([[1,2], [3,4]])));
//    9.0000   18.0000
//    7.0000   16.0000

// 1D + 3D
print(add(null, tensor([10, 20]), tensor([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])));
// --- 0: ---
//   11.0000   22.0000
//   13.0000   24.0000
// --- 1: ---
//   15.0000   26.0000
//   17.0000   28.0000

// 2D + 3D
print(add(null, tensor([[10, 20], [100, 200]]), tensor([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])));
// --- 0: ---
//   11.0000   22.0000
//  103.0000  204.0000
// --- 1: ---
//   15.0000   26.0000
//  107.0000  208.0000

Convolution support

Tensor convolution is only possible if the domain tensor and the kernel tensor have the same dimensionality. Broadcasting is not supported here.
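For intuition, 1D convolution boils down to a sliding-window dot product. A minimal sketch (not the library's convolve() signature; the kernel is applied unflipped here, which coincides with true convolution for the symmetric kernels listed below):

```typescript
// "Valid"-mode 1D convolution: slide the kernel over the source and
// take the dot product at each position. Output length shrinks to
// src.length - kernel.length + 1.
const convolve1d = (src: number[], kernel: number[]) => {
    const res: number[] = [];
    for (let i = 0; i + kernel.length <= src.length; i++) {
        let sum = 0;
        for (let j = 0; j < kernel.length; j++) sum += src[i + j] * kernel[j];
        res.push(sum);
    }
    return res;
};

// unnormalized box kernel: each output is the sum of a 3-wide window
console.log(convolve1d([1, 2, 3, 4, 5], [1, 1, 1])); // [ 6, 9, 12 ]
```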

The following kernel presets and tensor factories are included and can be used with convolve():

  • BOX_BLUR2(radius): Box blur kernel factory
  • GAUSSIAN2(radius): Gaussian blur kernel factory
  • EDGE2(radius): Edge/ridge detection kernel factory
  • SOBEL1: 1D Sobel kernel
  • SOBEL2: 2D Sobel kernel
  • SOBEL3: 3D Sobel kernel

For more generalized convolution-like functionality, the following kernel factories can be used with applyKernel():

  • MAX2_POOL(width,height?): max pooling
  • MIN2_POOL(width,height?): min pooling
  • MAXIMA2(radius): local maxima detection
  • MINIMA2(radius): local minima detection
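As an illustration of pooling, here is a sketch of what a MAX2_POOL(2)-style kernel computes, on nested arrays rather than the library's strided flat storage:

```typescript
// 2x2 (by default) max pooling on a nested-array "2D tensor": each
// non-overlapping w*h window is reduced to its maximum component.
const maxPool2 = (src: number[][], w = 2, h = w) => {
    const res: number[][] = [];
    for (let y = 0; y + h <= src.length; y += h) {
        const row: number[] = [];
        for (let x = 0; x + w <= src[0].length; x += w) {
            let max = -Infinity;
            for (let j = 0; j < h; j++)
                for (let i = 0; i < w; i++) max = Math.max(max, src[y + j][x + i]);
            row.push(max);
        }
        res.push(row);
    }
    return res;
};

console.log(maxPool2([
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
    [13, 14, 15, 16],
]));
// [ [ 6, 8 ], [ 14, 16 ] ]
```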

Conversions

The following functions can be used to convert/coerce other data structures into tensors:

  • asTensor(): Convert/wrap data as tensor
  • fromFloatBuffer(): Coerce [thi.ng/pixel] float buffer/image (or compatible data structures) into a 2D/3D tensor

Status

ALPHA - bleeding edge / work-in-progress

Search or submit any issues for this package

Installation

yarn add @thi.ng/tensors

ESM import:

import * as ten from "@thi.ng/tensors";

Browser ESM import:

<script type="module" src="https://esm.run/@thi.ng/tensors"></script>

JSDelivr documentation

For Node.js REPL:

const ten = await import("@thi.ng/tensors");

Package sizes (brotli'd, pre-treeshake): ESM: 11.49 KB

Dependencies

Note: @thi.ng/api is in most cases a type-only import (not used at runtime)

API

Generated API docs

TODO

Basic usage

import * as t from "@thi.ng/tensors";

// create 4x4x4 3D tensor and fill with values
const a = t.range(64).reshape([4, 4, 4]);

t.print(a);
// --- 0: ---
//         0    1.0000    2.0000    3.0000
//    4.0000    5.0000    6.0000    7.0000
//    8.0000    9.0000   10.0000   11.0000
//   12.0000   13.0000   14.0000   15.0000
// --- 1: ---
//   16.0000   17.0000   18.0000   19.0000
//   20.0000   21.0000   22.0000   23.0000
//   24.0000   25.0000   26.0000   27.0000
//   28.0000   29.0000   30.0000   31.0000
// --- 2: ---
//   32.0000   33.0000   34.0000   35.0000
//   36.0000   37.0000   38.0000   39.0000
//   40.0000   41.0000   42.0000   43.0000
//   44.0000   45.0000   46.0000   47.0000
// --- 3: ---
//   48.0000   49.0000   50.0000   51.0000
//   52.0000   53.0000   54.0000   55.0000
//   56.0000   57.0000   58.0000   59.0000
//   60.0000   61.0000   62.0000   63.0000

// pick a tensor slice/axis (view only, 2d tensor)
t.print(a.pick([3]));
//   48.0000   49.0000   50.0000   51.0000
//   52.0000   53.0000   54.0000   55.0000
//   56.0000   57.0000   58.0000   59.0000
//   60.0000   61.0000   62.0000   63.0000

// any axis set to -1 will be skipped
// here we select slice 3 and column 2 only (1d tensor)
t.print(a.pick([3, -1, 2]));
//   50.0000   54.0000   58.0000   62.0000

// use `.pack()` to apply view to standalone densely packed tensor (own data)
console.log(a.pick([3, 2]).pack().data);
// [ 56, 57, 58, 59 ]

// only select every second value along each axis
t.print(a.step([2, 2, 2]));
// --- 0: ---
//         0    2.0000
//    8.0000   10.0000
// --- 1: ---
//   32.0000   34.0000
//   40.0000   42.0000

// extract an axis range (view only, use `.pack()` to extract)
t.print(a.lo([1, 1, 1]).hi([2, 2, 2]));
// --- 0: ---
//   21.0000   22.0000
//   25.0000   26.0000
// --- 1: ---
//   37.0000   38.0000
//   41.0000   42.0000

// read & write elements (no bounds checking!)
a.set([1, 2, 3], 100);

console.log(a.get([1, 2, 3]));
// 100

// or via direct array access
console.log(a.data[a.index([1, 2, 3])]);
// 100

// tensors are iterables (in current stride order)
console.log([...a]);
// [ 0, 1, 2, 3, 4, ... 60, 61, 62, 63 ]

// create a 2D tensor w/ random values (by default normal distribution, bias=0, std=1)
const b = t.randDistrib(t.tensor("f64", [4, 2]));

t.print(b);
//    0.3854    0.6597
//    0.5775    0.9201
//   -0.7276   -0.1069
//    1.0550    0.4903

// apply sigmoid
// (null as output arg means mutate original [in 99% of all provided ops])
t.print(t.sigmoid(null, b));
//    0.5952    0.6592
//    0.6405    0.7151
//    0.3257    0.4733
//    0.7417    0.6202
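The view operations above (pick(), step(), lo()/hi()) are cheap because element access is just a stride dot product; views only change the offset and strides, never the underlying data. A sketch of the addressing scheme (an illustration, not the library internals verbatim):

```typescript
// Strided addressing: map multi-dim coordinates to a flat offset.
// This is the essence of what `a.index([i, j, k])` computes.
const index = (offset: number, strides: number[], pos: number[]) =>
    pos.reduce((acc, p, i) => acc + p * strides[i], offset);

// row-major strides of a 4x4x4 tensor: [16, 4, 1]
console.log(index(0, [16, 4, 1], [1, 2, 3])); // 27
// a transposed view merely permutes the strides: [1, 4, 16]
console.log(index(0, [1, 4, 16], [3, 2, 1])); // 27 (same element)
```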

Matrix-matrix multiplication

import { tensor, mulM, print } from "@thi.ng/tensors";

// create 2x3 matrix
const m1 = tensor([[1, 2, 3], [4, 5, 6]]);

print(m1);
//    1.0000    2.0000    3.0000
//    4.0000    5.0000    6.0000

// create transposed view (view only, zero-copy)
const m2 = m1.transpose([1, 0]);

print(m2);
//    1.0000    4.0000
//    2.0000    5.0000
//    3.0000    6.0000

// matrix multiplication
// (here if 1st arg is null, a new tensor will be created)
print(mulM(null, m1, m2));
//   14.0000   32.0000
//   32.0000   77.0000
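For reference, the same product computed naively on nested arrays (an illustration only; the library version operates on strided tensor views, which is what makes the zero-copy transpose possible):

```typescript
// Naive matrix-matrix product: result[i][j] = dot(row i of a, col j of b)
const mulM = (a: number[][], b: number[][]) =>
    a.map((row) =>
        b[0].map((_, j) => row.reduce((acc, v, k) => acc + v * b[k][j], 0))
    );

const m1 = [[1, 2, 3], [4, 5, 6]];
const m2 = [[1, 4], [2, 5], [3, 6]]; // m1 transposed

console.log(mulM(m1, m2)); // [ [ 14, 32 ], [ 32, 77 ] ]
```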

Matrix-vector multiplication

import { tensor, mulV, print } from "@thi.ng/tensors";

// create 2x3 transformation matrix (row-major)
const mat = tensor([[10, 0, 100], [0, 5, 200]]);

print(mat);
//   10.0000         0  100.0000
//         0    5.0000  200.0000

// create vector
const vec = tensor([1, 1, 1]);

// matrix-vector multiply
print(mulV(null, mat, vec));
//  110.0000  205.0000
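The same result on plain arrays (illustration): with the trailing vector component set to 1, the last matrix column acts as a translation offset, i.e. the classic homogeneous-coordinates trick.

```typescript
// Matrix-vector product: each result component is the dot product of
// a matrix row with the vector.
const mulV = (m: number[][], v: number[]) =>
    m.map((row) => row.reduce((acc, x, i) => acc + x * v[i], 0));

// scale x by 10, y by 5, then translate by [100, 200]
console.log(mulV([[10, 0, 100], [0, 5, 200]], [1, 1, 1])); // [ 110, 205 ]
```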

Authors

If this project contributes to an academic publication, please cite it as:

@misc{thing-tensors,
  title = "@thi.ng/tensors",
  author = "Karsten Schmidt",
  note = "https://thi.ng/tensors",
  year = 2018
}

License

© 2018 - 2025 Karsten Schmidt // Apache License 2.0