

TinyNN

A lightweight, highly optimized neural network library for Node.js with zero dependencies.

Features

  • Zero Dependencies - No external packages required
  • High Performance - Optimized with typed arrays (Float64Array)
  • Lightweight - Minimal footprint, easy to integrate
  • Educational - Well-documented code with detailed comments
  • Simple API - Easy to use and understand
  • Flexible - Support for custom network architectures and activation functions

Installation

npm install tinynn

API Reference

tinynn(architecture, activationFunction)

Creates a new neural network.

Parameters:

  • architecture (Array): Array of layer sizes, e.g., [784, 128, 64, 10]
  • activationFunction (Function): Activation function for hidden layers (e.g., relu)

Returns: Network object
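
For example, a minimal sketch of creating a network (the layer sizes here are arbitrary):

import tinynn from 'tinynn';
import { relu } from 'tinynn/utils';

// Two inputs, a hidden layer of four neurons, one output.
const net = tinynn([2, 4, 1], relu);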

Network Methods

| Method                    | Description                                                    |
|---------------------------|----------------------------------------------------------------|
| train(input, target)      | Forward pass + backpropagation. Returns softmax probabilities. |
| updateWeights(rate, size) | Updates weights using accumulated gradients from training.     |
| forward(input)            | Forward pass only, without computing gradients.                |
| output(transform)         | Get network output, optionally applying transform function.    |
| getWeights()              | Export all weights as 3D array [layer][neuron][weight].        |
| setWeights(weights)       | Load weights into the network.                                 |

Utility Functions

| Function                         | Description                                                  |
|----------------------------------|--------------------------------------------------------------|
| relu(x)                          | ReLU activation function: max(0, x)                          |
| softmax(array)                   | Convert logits to probability distribution.                  |
| crossEntropyLoss(output, target) | Compute cross-entropy loss between predictions and targets.  |
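
These match the standard textbook definitions. For reference, here is a minimal sketch of equivalent implementations (this shows the math the table describes, not TinyNN's actual source):

// Illustrative re-implementations; TinyNN's internals may differ.
const myRelu = (x) => Math.max(0, x);

// Numerically stable softmax: subtract the max before exponentiating.
const mySoftmax = (logits) => {
    const max = Math.max(...logits);
    const exps = logits.map((x) => Math.exp(x - max));
    const sum = exps.reduce((a, b) => a + b, 0);
    return exps.map((e) => e / sum);
};

// Cross-entropy between predicted probabilities and a one-hot target.
// The small epsilon guards against log(0).
const myCrossEntropyLoss = (output, target) =>
    -output.reduce((acc, p, i) => acc + target[i] * Math.log(p + 1e-12), 0);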

Example: MNIST Digit Recognition

import tinynn from 'tinynn';
import { relu, softmax, crossEntropyLoss } from 'tinynn/utils';

// Create network
const network = tinynn([784, 64, 64, 10], relu);

// Training parameters
const LEARNING_RATE = 0.005;
const BATCH_SIZE = 20;

// Training loop (totalBatches is however many batches your dataset has)
for (let batch = 0; batch < totalBatches; batch++) {
    const images = loadBatch(batch); // your data-loading function

    for (const image of images) {
        // Normalize input (normalizePixels is your own preprocessing,
        // e.g. scaling pixel values from 0-255 down to 0-1)
        const normalizedInput = normalizePixels(image.pixels);

        // One-hot encode label
        const target = new Float64Array(10);
        target[image.label] = 1;

        // Train
        const output = network.train(normalizedInput, target);

        // Calculate loss (accumulate or log this to monitor training)
        const loss = crossEntropyLoss(output, target);
    }

    // Update weights after batch
    network.updateWeights(LEARNING_RATE, BATCH_SIZE);
}

// Save trained weights
const weights = network.getWeights();
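
Once trained, the exported weights can be restored into a fresh network for inference using the setWeights, forward, and output methods from the table above. A hedged sketch (testImage is a hypothetical example image, and passing softmax to output() assumes forward() leaves raw logits, which this README doesn't state explicitly):

// Restore the trained weights into a new network of the same architecture.
const inferenceNet = tinynn([784, 64, 64, 10], relu);
inferenceNet.setWeights(weights);

// Forward pass only (no gradient bookkeeping), then read the prediction.
const input = normalizePixels(testImage.pixels); // testImage is hypothetical
inferenceNet.forward(input);
const probs = inferenceNet.output(softmax);
const prediction = probs.indexOf(Math.max(...probs));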

Running the Demo

The repository includes a full MNIST training demo:

# Install dev dependencies (canvas for image processing in demo)
npm install

# Run the MNIST training demo
npm run demo

The demo trains a neural network to recognize handwritten digits from the MNIST dataset.

Architecture

TinyNN implements a feedforward neural network with:

  • He initialization for weights (optimized for ReLU activation; sketched below)
  • Mini-batch gradient descent for training
  • Backpropagation for computing gradients
  • Typed arrays (Float64Array) for performance
  • Softmax + Cross-Entropy loss for classification
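
As an illustration of the He initialization point above, here is a minimal sketch of the standard formula, where weights are drawn with standard deviation sqrt(2 / fanIn) to keep activation variance stable under ReLU (this is the textbook recipe, not necessarily TinyNN's exact internals):

// Sketch of He initialization for one layer's weights.
function heInitLayer(fanIn, fanOut) {
    const weights = new Float64Array(fanIn * fanOut);
    for (let i = 0; i < weights.length; i++) {
        // Box-Muller transform for a standard normal sample
        const u1 = Math.random() || Number.EPSILON; // avoid log(0)
        const u2 = Math.random();
        const normal = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
        weights[i] = normal * Math.sqrt(2 / fanIn);
    }
    return weights;
}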

Performance

Training on MNIST (60,000 images):

  • 1,000+ images/second on modern hardware
  • Reaches 90%+ accuracy within minutes
  • Memory efficient with typed arrays

Why Zero Dependencies?

  • Security: No third-party supply-chain attack surface
  • Reliability: No breaking changes from dependencies
  • Performance: No overhead from external packages
  • Bundle Size: Minimal footprint for browsers/edge computing
  • Maintenance: Easier to maintain and audit

License

MIT License - see LICENSE file for details

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Repository

https://github.com/paulhodel/tinynn