TinyNN
A lightweight, highly optimized neural network library for Node.js with zero dependencies.
Features
- Zero Dependencies - No external packages required
- High Performance - Optimized with typed arrays (Float64Array)
- Lightweight - Minimal footprint, easy to integrate
- Educational - Well-documented code with detailed comments
- Simple API - Easy to use and understand
- Flexible - Support for custom network architectures and activation functions
Installation
npm install tinynn
API Reference
tinynn(architecture, activationFunction)
Creates a new neural network.
Parameters:
- architecture (Array): Array of layer sizes, e.g., [784, 128, 64, 10]
- activationFunction (Function): Activation function for hidden layers (e.g., relu)
Returns: Network object
Network Methods
| Method | Description |
|-----------------------------|----------------------------------------------------------------|
| train(input, target) | Forward pass + backpropagation. Returns softmax probabilities. |
| updateWeights(rate, size) | Updates weights using accumulated gradients from training. |
| forward(input) | Forward pass only, without computing gradients. |
| output(transform) | Get network output, optionally applying transform function. |
| getWeights() | Export all weights as 3D array [layer][neuron][weight]. |
| setWeights(weights) | Load weights into the network. |
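To show how these methods fit together outside of training, here is a minimal inference sketch. It uses only the methods documented above, but the input shape (a 784-element Float64Array), the exact return value of output(), and the assumption that a fresh network of the same architecture accepts weights from setWeights() are assumptions rather than guarantees; verify against your version.

```js
import tinynn from 'tinynn';
import { relu, softmax } from 'tinynn/utils';

// Same architecture as the MNIST example below.
const network = tinynn([784, 64, 64, 10], relu);

// Inference only: forward pass without computing gradients.
const input = new Float64Array(784); // your normalized pixel values
network.forward(input);

// Read the output, optionally converting logits to probabilities.
const probabilities = network.output(softmax);

// Export weights as [layer][neuron][weight] and load them into a fresh network.
const saved = network.getWeights();
const restored = tinynn([784, 64, 64, 10], relu);
restored.setWeights(saved);
```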
Utility Functions
| Function | Description |
|------------------------------------|-------------------------------------------------------------|
| relu(x) | ReLU activation function: max(0, x) |
| softmax(array) | Convert logits to probability distribution. |
| crossEntropyLoss(output, target) | Compute cross-entropy loss between predictions and targets. |
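A quick sketch of how these utilities compose, assuming softmax accepts a plain array (or Float64Array) of logits and crossEntropyLoss takes the predicted probabilities plus a one-hot target, as in the MNIST example below:

```js
import { relu, softmax, crossEntropyLoss } from 'tinynn/utils';

relu(-2.5); // 0
relu(3.1);  // 3.1

// Convert raw logits into a probability distribution that sums to 1.
const probs = softmax([2.0, 1.0, 0.1]);

// Compare predictions against a one-hot encoded target.
const target = [1, 0, 0];
const loss = crossEntropyLoss(probs, target);
console.log(probs, loss);
```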
Example: MNIST Digit Recognition
import tinynn from 'tinynn';
import { relu, crossEntropyLoss } from 'tinynn/utils';
// Create network
const network = tinynn([784, 64, 64, 10], relu);
// Training parameters
const LEARNING_RATE = 0.005;
const BATCH_SIZE = 20;
// Training loop
for (let batch = 0; batch < totalBatches; batch++) {
const images = loadBatch(batch); // Your data loading function
for (const image of images) {
// Normalize input
const normalizedInput = normalizePixels(image.pixels);
// One-hot encode label
const target = new Float64Array(10);
target[image.label] = 1;
// Train
const output = network.train(normalizedInput, target);
// Calculate loss (e.g., accumulate it to monitor training progress)
const loss = crossEntropyLoss(output, target);
}
// Update weights after batch
network.updateWeights(LEARNING_RATE, BATCH_SIZE);
}
// Save trained weights
const weights = network.getWeights();
Running the Demo
The repository includes a full MNIST training demo:
# Install dev dependencies (canvas for image processing in demo)
npm install
# Run the MNIST training demo
npm run demo
The demo trains a neural network to recognize handwritten digits from the MNIST dataset.
Architecture
TinyNN implements a feedforward neural network with:
- He initialization for weights (optimized for ReLU activation; see the sketch after this list)
- Mini-batch gradient descent for training
- Backpropagation for computing gradients
- Typed arrays (Float64Array) for performance
- Softmax + Cross-Entropy loss for classification
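For reference, here is a minimal, self-contained sketch of He initialization for a ReLU layer. It illustrates the scheme named above; it is not TinyNN's internal code, and the layer sizes and helper name are arbitrary.

```js
// He initialization: draw weights from a normal distribution with
// standard deviation sqrt(2 / fanIn), which keeps ReLU activations
// from vanishing or exploding as depth grows.
function heInitLayer(fanIn, fanOut) {
  const weights = new Float64Array(fanIn * fanOut);
  for (let i = 0; i < weights.length; i++) {
    // Box-Muller transform: two uniform samples -> one standard normal sample.
    const u1 = Math.random() || Number.EPSILON;
    const u2 = Math.random();
    const normal = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
    weights[i] = normal * Math.sqrt(2 / fanIn);
  }
  return weights;
}

// Example: weights between a 784-unit input layer and a 64-unit hidden layer.
const hiddenWeights = heInitLayer(784, 64);
```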
Performance
Training on MNIST (60,000 images):
- 1,000+ images/second on modern hardware
- Reaches 90%+ accuracy within minutes
- Memory efficient with typed arrays
Why Zero Dependencies?
- Security: No supply chain vulnerabilities
- Reliability: No breaking changes from dependencies
- Performance: No overhead from external packages
- Bundle Size: Minimal footprint for browsers/edge computing
- Maintenance: Easier to maintain and audit
License
MIT License - see LICENSE file for details
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
