deepnet.js

deepnet.js is an auto-differentiation library for JavaScript. It can compute gradients in both static and dynamic modes.

You can see the API doc here.

:warning: deepnet.js has been reimplemented from the ground up to support sparse tensors and broadcasting, and various backends are in development. If you are using an older version (1.0.2 or below), please upgrade your code accordingly.

Installation

NPM

npm install deepnet.js

CDN

https://unpkg.com/deepnet.js@latest/dist/deepnet-browser.min.js
https://unpkg.com/deepnet.js@latest/dist/deepnet-browser.js

Usage

Node

const deepnet = require("deepnet.js");

// acquire the pure-JS cpu backend, then run ops through it
deepnet.platforms.cpu().then((backend) => {

    const dn = backend;

    // two 2x2 tensors from flat data and an explicit shape
    const a = dn.tensor([1, 2, 3, 4], [2, 2]);
    const b = dn.tensor([1, 2, 3, 4], [2, 2]);

    // matrix multiplication
    const result = dn.matmul(a, b);
    result.print();

});

CDN

<script src="https://unpkg.com/deepnet.js@latest/dist/deepnet-browser.min.js"></script>
<script>

    // the browser bundle exposes a global `deepnet`
    deepnet.platforms.cpu().then((backend) => {

        const dn = backend;

        const a = dn.tensor([1, 2, 3, 4], [2, 2]);
        const b = dn.tensor([1, 2, 3, 4], [2, 2]);

        const result = dn.matmul(a, b);
        result.print();

    });

</script>

Support 🔥

If this package helps you, please consider supporting me. Your donation will inspire me to keep working on this project.

paypal.me

Thanks for considering a donation.


Getting started

Full API doc

You can see the API doc here.

Autodiff

Autodiff (automatic differentiation) is a technique that uses a computational graph to compute derivatives automatically.

In the forward phase, it executes the math operations and constructs the computational graph; in the backward phase, the derivatives are computed automatically by traversing that graph.
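
As a rough sketch of the idea, using ops that appear later in this README (seeding the backward pass with an all-ones loss tensor is an assumption made here for illustration; the API doc is authoritative):

deepnet.platforms.cpu().then((dn) => {

    // forward phase: each op executes and records itself in the graph
    const a = dn.tensor([1, 2, 3, 4], [2, 2]);
    const b = dn.tensor([5, 6, 7, 8], [2, 2]);
    const c = dn.mul(a, b); // elementwise product

    // backward phase: walk the graph from c and fill the grads of
    // a and b automatically (here dc/da = b and dc/db = a)
    dn.backpass(c, dn.ones([2, 2]));

});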

Platforms/Backends

Platforms add support for the various environments your neural networks can run in.

  • cpu(..) (JS environment): a pure JS implementation for the browser and Node.js.

deepnet.platforms.cpu().then((dn) => {
    dn.tensor(..)
    dn.add(..)
    ...
});

Tensor

A tensor is a scalar, a vector, or a multidimensional array. It can be created using dn.tensor(..).

deepnet.platforms.cpu().then((dn) => {

    // dense tensor (the third argument, is_sparse, defaults to false)
    let dense = dn.tensor([1, 2, 3, 4], [2, 2], false);
    dense.print();

    // sparse tensor (is_sparse = true)
    let sparse = dn.tensor([1, 2, 0, 4, 5, 6, 0, 0], [2, 2, 2], true);
    sparse.print();

});

Contains:

  • value The initial value of the tensor.

    • data Array of numbers.

    • shape The shape of the tensor. If it is not defined, it is inferred automatically from data.

  • grad Stores the derivative of the tensor; it is filled during backpass().

    • data Array of numbers.

    • shape The shape of the grad. If it is not defined, it is inferred automatically from data.

  • parents Stores the parent vertices (vertex[]) from which the tensor was created.

  • print() Prints the tensor.

  • feed() Recalculates the value by re-running the forward pass; it fills value (see the sketch after this list).

  • back() Calculates the derivatives; it fills grad.

  • is_sparse Specifies whether the tensor to be created is sparse or dense. Defaults to false.
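
To make these members concrete, here is a minimal sketch. Whether back() seeds the output gradient on its own, or expects backpass() as in the training example below, is not specified here, so treat that detail as an assumption:

deepnet.platforms.cpu().then((dn) => {

    const a = dn.tensor([1, 2, 3, 4], [2, 2]);
    const b = dn.tensor([5, 6, 7, 8], [2, 2]);

    const c = dn.mul(a, b); // c.parents now references a and b
    c.print();              // prints c.value

    c.back();               // fills a.grad and b.grad
    c.feed();               // recomputes c.value from its parents

});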

Operations

Broadcasting

deepnet.js supports broadcasting; no copies or temporary variables are created.

basic_operation

deepnet.platforms.cpu().then((dn) => {

    // b ([2, 2]) is broadcast against a ([2, 2, 2])
    const a = dn.ones([2, 2, 2]);
    const b = dn.zeros([2, 2]);

    const res = dn.add(a, b);
    // also: dn.sub(..), dn.mul(..), dn.div(..)

    res.print();

});

Output:

Tensor
[[[1 1]
  [1 1]]

 [[1 1]
  [1 1]]]

Matmul_two_Tensors

deepnet.platforms.cpu().then((dn) => {

    // b ([2, 2]) is broadcast against a ([2, 2, 2])
    const a = dn.ones([2, 2, 2]);
    const b = dn.ones([2, 2]);

    const res = dn.matmul(a, b);
    res.print();

});

Output:

Tensor
[[[2 2]
  [2 2]]

 [[2 2]
  [2 2]]]

fully_connected_example

const deepnet = require("deepnet.js");

(async () => {

    // acquiring the cpu backend
    let dn = await deepnet.platforms.cpu();

    // declaring the input
    let a = dn.tensor([1, 2, 3, 4], [2, 2]);

    // declaring the weights
    let w = dn.randn([2, 2]);
    let b = dn.randn([2, 2]);

    // single NN layer (linear layer)
    // returns sigmoid(a @ w + b)
    const feed = (a, w, b) => {
        let mat_res = dn.matmul(a, w);
        let added = dn.add(mat_res, b);
        return dn.sig(added);
    }

    // stochastic gradient descent over [w, b] with learning rate 0.04
    let optm = dn.optimizer.SGD([w, b], 0.04);

    for (let i = 0; i < 300; i++) {

        let result = feed(a, w, b);
        // error against the target values
        let loss = dn.sub(result, dn.tensor([0, 1, 0, 1], [2, 2]));

        dn.backpass(result, loss); // calculating the derivatives
        optm.step();               // updating the weights
        dn.grad_zero(result);      // resetting the grads

    }

    // predicting the values
    let pred = feed(a, w, b);
    pred.print();

})();

Output (the predictions approach the targets [0, 1, 0, 1]):

Tensor
[[0.08579222069069875 0.905562796140377]
 [0.005928221665889459 0.9933268076751989]]