
@datagrok-libraries/sci-comp

v0.1.0

Pure TypeScript library for numerical methods

Sci Comp

Pure TypeScript library of numerical methods for the Datagrok platform.

Installation

npm install @datagrok-libraries/sci-comp

Optimization

import {singleObjective} from '@datagrok-libraries/sci-comp';

const {NelderMead, PSO, applyPenalty, applyPenaltyAsync, boxConstraints, getOptimizer, listOptimizers} = singleObjective;

Single-objective

Four built-in solvers, each supporting synchronous and asynchronous objective functions:

| Method   | Sync                         | Async                             |
|----------|------------------------------|-----------------------------------|
| Minimize | minimize(fn, x0, settings)   | minimizeAsync(fn, x0, settings)   |
| Maximize | maximize(fn, x0, settings)   | maximizeAsync(fn, x0, settings)   |
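The signatures in the table suggest a common optimizer shape. As a rough, hypothetical sketch (the type and class names below are invented for illustration and are not the library's actual exports), a toy optimizer conforming to that shape might look like this:

```typescript
// Hypothetical shapes inferred from the table; names here are invented
// for illustration and are not the library's actual exports.
type Objective = (x: Float64Array) => number;

interface Result { point: Float64Array; value: number; }
interface Settings { maxIterations?: number; tolerance?: number; }

// A deliberately simple optimizer with the same shape: compass (axis-step) search.
class CompassSearch {
  minimize(fn: Objective, x0: Float64Array, settings: Settings): Result {
    const x = Float64Array.from(x0);
    let best = fn(x);
    let step = 1.0;
    const maxIter = settings.maxIterations ?? 1000;
    const tol = settings.tolerance ?? 1e-8;
    for (let it = 0; it < maxIter && step > tol; it++) {
      let improved = false;
      for (let i = 0; i < x.length; i++) {
        for (const d of [step, -step]) {
          x[i] += d;
          const v = fn(x);
          if (v < best) { best = v; improved = true; }
          else x[i] -= d;  // undo a step that didn't help
        }
      }
      if (!improved) step /= 2;  // nothing helped: refine the mesh
    }
    return {point: x, value: best};
  }

  // maximize is just minimize of the negated objective, with the sign flipped back.
  maximize(fn: Objective, x0: Float64Array, settings: Settings): Result {
    const r = this.minimize((x) => -fn(x), x0, settings);
    return {point: r.point, value: -r.value};
  }
}

const sphere: Objective = (x) => x[0] ** 2 + x[1] ** 2;
const res = new CompassSearch().minimize(sphere, new Float64Array([3, -2]), {});
// res.point → [0, 0], res.value → 0 for this integer start
```

Expressing maximize in terms of minimize is a common design; it keeps the two entry points consistent without duplicating solver logic.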

Unconstrained minimize

Minimize the Rosenbrock function:

$$f(x, y) = 100(y - x^2)^2 + (1 - x)^2$$

  • Goal: minimize
  • Start: $x_0 = (-1.2,\; 1.0)$
  • Expected: $\min = 0$ at $(1, 1)$
const rosenbrock = (x: Float64Array): number => {
  let sum = 0;
  for (let i = 0; i < x.length - 1; i++)
    sum += 100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2;
  return sum;
};

const nm = new NelderMead();
const result = nm.minimize(rosenbrock, new Float64Array([-1.2, 1.0]), {
  maxIterations: 5_000,
  tolerance: 1e-12,
});
// result.point ≈ [1, 1], result.value ≈ 0
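For intuition about what the NelderMead solver does internally, here is a self-contained 2-D sketch of the classic simplex loop (reflection, expansion, contraction, shrink, with the standard coefficients 1, 2, 1/2, 1/2). It is illustrative only, not the library's implementation:

```typescript
// Self-contained 2-D Nelder-Mead sketch (illustration only, not the library code).
type Vec = Float64Array;
type Fn = (x: Vec) => number;

const rosen: Fn = (x) => 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2;

function nelderMead2D(fn: Fn, x0: Vec, maxIter: number): {point: Vec; value: number} {
  // Initial simplex: x0 plus a small step along each axis.
  const pts: Vec[] = [Float64Array.from(x0)];
  for (let i = 0; i < 2; i++) {
    const p = Float64Array.from(x0);
    p[i] += 0.1;
    pts.push(p);
  }
  let verts = pts.map((p) => ({p, v: fn(p)}));

  // Point c + t * (p - c): t = -1 reflect, -2 expand, +0.5 contract.
  const along = (c: Vec, p: Vec, t: number): Vec =>
    Float64Array.from(c, (ci, i) => ci + t * (p[i] - ci));

  for (let it = 0; it < maxIter; it++) {
    verts.sort((a, b) => a.v - b.v);  // verts[0] best, verts[2] worst
    const [best, second, worst] = verts;
    // Centroid of all vertices except the worst.
    const c = Float64Array.from(best.p, (_, i) => (best.p[i] + second.p[i]) / 2);

    const xr = along(c, worst.p, -1);  // reflect worst through the centroid
    const vr = fn(xr);
    if (vr < best.v) {
      const xe = along(c, worst.p, -2);  // try expanding further
      const ve = fn(xe);
      verts[2] = ve < vr ? {p: xe, v: ve} : {p: xr, v: vr};
    } else if (vr < second.v) {
      verts[2] = {p: xr, v: vr};  // plain reflection accepted
    } else {
      // Contract toward the better of the worst / reflected point.
      const xc = vr < worst.v ? along(c, xr, 0.5) : along(c, worst.p, 0.5);
      const vc = fn(xc);
      if (vc < Math.min(vr, worst.v)) {
        verts[2] = {p: xc, v: vc};
      } else {
        // Last resort: shrink every vertex halfway toward the best.
        verts = verts.map((vx, i) => {
          if (i === 0) return vx;
          const p = along(best.p, vx.p, 0.5);
          return {p, v: fn(p)};
        });
      }
    }
  }
  verts.sort((a, b) => a.v - b.v);
  return {point: verts[0].p, value: verts[0].v};
}

const res = nelderMead2D(rosen, new Float64Array([-1.2, 1.0]), 5000);
// res.point ≈ [1, 1], res.value ≈ 0
```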

Unconstrained maximize

Maximize the Gaussian function:

$$f(x, y) = e^{-(x^2 + y^2)}$$

  • Goal: maximize
  • Start: $x_0 = (2,\; -3)$
  • Expected: $\max = 1$ at $(0, 0)$
const gaussian = (x: Float64Array): number =>
  Math.exp(-(x[0] ** 2 + x[1] ** 2));

const nm = new NelderMead();
const result = nm.maximize(gaussian, new Float64Array([2, -3]), {
  maxIterations: 5_000,
});
// result.point ≈ [0, 0], result.value ≈ 1

Constrained optimization (penalty method)

Minimize the sphere function with box constraints:

$$f(x, y) = x^2 + y^2$$

  • Goal: minimize
  • Subject to: $2 \le x \le 5,\; 2 \le y \le 5$
  • Method: quadratic penalty, $\mu = 10{,}000$
  • Start: $x_0 = (3, 3)$
  • Expected: $\min = 8$ at $(2, 2)$

Constraints can be passed directly via the settings object; the penalty is applied with the correct sign for both minimize and maximize:

const sphere = (x: Float64Array): number => {
  let sum = 0;
  for (let i = 0; i < x.length; i++) sum += x[i] ** 2;
  return sum;
};

const nm = new NelderMead();
const result = nm.minimize(sphere, new Float64Array([3, 3]), {
  maxIterations: 5_000,
  constraints: boxConstraints(
    new Float64Array([2, 2]),  // lower bounds
    new Float64Array([5, 5]),  // upper bounds
  ),
  penaltyOptions: {mu: 10_000},
});
// result.point ≈ [2, 2], result.value ≈ 8
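To make the penalty method concrete, here is a minimal, self-contained sketch of a quadratic box-constraint penalty: zero inside the box, mu times the squared violation per coordinate outside it. This mirrors the idea behind boxConstraints plus penaltyOptions, not the library's exact code:

```typescript
// Quadratic box-constraint penalty sketch (not the library's exact code):
// zero inside [lo, hi], mu * (violation)^2 per coordinate outside it.
type Fn = (x: Float64Array) => number;

function withBoxPenalty(fn: Fn, lo: Float64Array, hi: Float64Array, mu: number): Fn {
  return (x) => {
    let penalty = 0;
    for (let i = 0; i < x.length; i++) {
      if (x[i] < lo[i]) penalty += (lo[i] - x[i]) ** 2;
      if (x[i] > hi[i]) penalty += (x[i] - hi[i]) ** 2;
    }
    return fn(x) + mu * penalty;
  };
}

const sphere: Fn = (x) => x[0] ** 2 + x[1] ** 2;
const penalized = withBoxPenalty(
  sphere, new Float64Array([2, 2]), new Float64Array([5, 5]), 10_000);

console.log(penalized(new Float64Array([3, 3])));  // 18: inside the box, no penalty
console.log(penalized(new Float64Array([1, 3])));  // 10010: 10 + 10_000 * (2 - 1)^2
```

Note that with any finite $\mu$ the penalized minimum sits slightly outside the feasible box, which is why penalty-method results are reported as approximately, rather than exactly, $(2, 2)$.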

Custom constraints (inequality + equality)

$$f(x, y) = (x - 3)^2 + (y - 3)^2$$

  • Goal: minimize
  • Subject to: $x + y \le 4$ (inequality), $x = y$ (equality)
  • Method: quadratic penalty, $\mu = 100{,}000$
  • Start: $x_0 = (0, 0)$
  • Expected: $\min = 2$ at $(2, 2)$
const fn = (x: Float64Array) => (x[0] - 3) ** 2 + (x[1] - 3) ** 2;

const constraints: singleObjective.Constraint[] = [
  {type: 'ineq', fn: (x) => x[0] + x[1] - 4},  // x + y <= 4
  {type: 'eq',   fn: (x) => x[0] - x[1]},        // x = y
];

const constrained = applyPenalty(fn, constraints, {mu: 100_000});

const nm = new NelderMead();
const result = nm.minimize(constrained, new Float64Array([0, 0]), {
  maxIterations: 10_000,
});
// result.point ≈ [2, 2], result.value ≈ 2
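Assuming the usual quadratic-penalty formulation (the library's exact weighting may differ), the objective assembled by applyPenalty above, for inequality constraints $g_i(x) \le 0$ and equality constraints $h_j(x) = 0$, is:

```latex
$$F(x) = f(x) + \mu \left( \sum_i \max\bigl(0,\; g_i(x)\bigr)^2 + \sum_j h_j(x)^2 \right)$$
```

The inequality term is active only when $g_i(x) > 0$, i.e. when the constraint is violated, which matches the $g(x) = x_0 + x_1 - 4 \le 0$ convention used in the code above.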

PSO with reproducible seed

Minimize the Rosenbrock function with PSO using a fixed seed for deterministic results:

$$f(x, y) = 100(y - x^2)^2 + (1 - x)^2$$

  • Goal: minimize
  • Start: $x_0 = (-1.2,\; 1.0)$
  • Expected: $\min = 0$ at $(1, 1)$
const pso = new PSO();
const result = pso.minimize(rosenbrock, new Float64Array([-1.2, 1.0]), {
  swarmSize: 40,
  maxIterations: 3_000,
  seed: 42,  // deterministic results
});
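A seed makes a stochastic method reproducible because every random draw (initial swarm positions, velocity updates) comes from a deterministic PRNG initialized from that seed. The sketch below uses mulberry32 purely for illustration; the library's internal generator may differ:

```typescript
// Deterministic PRNG sketch: same seed => same sequence => same PSO run.
// mulberry32 is used here for illustration only.
function mulberry32(seed: number): () => number {
  let a = seed | 0;
  return () => {
    a = (a + 0x6d2b79f5) | 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;  // uniform in [0, 1)
  };
}

// Two runs seeded identically draw identical random sequences.
const runA = mulberry32(42);
const runB = mulberry32(42);
const a = Array.from({length: 5}, runA);
const b = Array.from({length: 5}, runB);
// a and b are element-for-element identical
```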

Async objective function

When the objective function involves asynchronous work (API calls, simulations, file I/O), use minimizeAsync / maximizeAsync:

const asyncObjective = async (x: Float64Array): Promise<number> => {
  // e.g. call a remote simulation service
  return rosenbrock(x);
};

const nm = new NelderMead();
const result = await nm.minimizeAsync(asyncObjective, new Float64Array([-1.2, 1.0]), {
  maxIterations: 5_000,
  tolerance: 1e-12,
});

Async penalty wrappers are also available via applyPenaltyAsync.

Iteration callback

Use onIteration to monitor progress or stop early:

const result = nm.minimize(sphere, new Float64Array([5, -3, 7]), {
  maxIterations: 5_000,
  onIteration: (state) => {
    // Log every 100th iteration
    if (state.iteration % 100 === 0)
      console.log(`iter ${state.iteration}: best = ${state.bestValue}`);

    // Return true to stop early
    if (state.bestValue < 1e-6) return true;
  },
});

Registry

Optimizers can be looked up by name:

console.log(listOptimizers()); // ['nelder-mead', 'pso']

const optimizer = getOptimizer('nelder-mead');
const result = optimizer.minimize(sphere, new Float64Array([5, -3, 7]), {
  maxIterations: 5_000,
});

Benchmarks

Run 15 standard test functions (Sphere, Rosenbrock, Beale, Booth, Matyas, Himmelblau, Three-Hump Camel, Rastrigin, Ackley, Lévi N.13, Griewank, Styblinski-Tang, Easom, Goldstein-Price, McCormick) across all 4 optimizers with default hyperparameters:

npx tsx src/optimization/single-objective/benchmarks/unconstrained-benchmarks.ts

The output is a per-problem comparison table showing found value, distance to optimum, convergence status, iteration count, function evaluations, and wall-clock time.

Find the results in this summary.

Running examples

Runnable examples are located in src/optimization/single-objective/examples/:

# Unconstrained minimize & maximize (Rosenbrock, Sphere, Gaussian)
npx tsx src/optimization/single-objective/examples/unconstrained.ts

# Constrained optimization (box constraints, custom ineq/eq, log-barrier)
npx tsx src/optimization/single-objective/examples/constrained.ts

# Registry usage
npx tsx src/optimization/single-objective/examples/registry.ts

# Async objective functions and onIteration callbacks
npx tsx src/optimization/single-objective/examples/async-and-callbacks.ts