vecx
Ultra-light Vector Similarity Engine for Browser & Node.js
Zero dependencies. Tree-shakable. TypeScript native.
Why vecx?
Most vector/math libraries are too heavy for modern web applications. vecx is designed specifically for embedding similarity use cases in the AI era:
- ✨ Zero dependencies - No bloat, just pure math
- 📦 Tree-shakable - Import only what you need
- 🚀 Fast - Optimized for JIT compilation
- 🌐 Universal - Works everywhere: Browser, Node.js, React Native, Deno, Cloudflare Workers
- 📘 TypeScript native - Full type safety out of the box
- 🪶 Ultra-light - < 2KB minified + gzipped
Installation
npm install @thewoowon/vecx
yarn add @thewoowon/vecx
pnpm add @thewoowon/vecx
Quick Start
import {
dot,
l2Norm,
cosineSimilarity,
normalize,
topKByCosine,
} from '@thewoowon/vecx';
// Basic operations
const a = [0.1, 0.2, 0.3];
const b = [0.05, 0.25, 0.35];
dot(a, b); // 0.16
l2Norm(a); // 0.374...
cosineSimilarity(a, b); // 0.987... (between -1 and 1)
normalize(a); // Float32Array [0.267, 0.534, 0.801]
// Top-K search
const query = [0.1, 0.2, 0.3];
const vectors = [
[0.2, 0.1, 0.0],
[0.05, 0.19, 0.3],
[0.3, 0.3, 0.3],
];
const { indices, scores } = topKByCosine(query, vectors, 2);
// indices: [1, 2] - indices of most similar vectors
// scores: [0.991, 0.926] - cosine similarity scores
API Reference
Vector Types
type Vector = number[] | Float32Array | Float64Array;
All functions accept regular arrays or typed arrays for maximum flexibility.
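For example (an illustrative snippet, not taken from the package docs), a Float32Array can be compared directly against a plain array:
import { cosineSimilarity } from '@thewoowon/vecx';
const typed = new Float32Array([0.1, 0.2, 0.3]);
const plain = [0.1, 0.2, 0.3];
cosineSimilarity(typed, plain); // ~1.0 - same values, different array kinds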
Basic Operations
dot(a: Vector, b: Vector): number
Computes the dot product of two vectors.
dot([1, 2, 3], [4, 5, 6]); // 32
l2Norm(a: Vector): number
Computes the L2 (Euclidean) norm of a vector.
l2Norm([3, 4]); // 5
cosineSimilarity(a: Vector, b: Vector): number
Computes cosine similarity between two vectors. Returns a value between -1 and 1.
cosineSimilarity([1, 2, 3], [1, 2, 3]); // 1.0 (identical)
cosineSimilarity([1, 0], [0, 1]); // 0.0 (orthogonal)
cosineSimilarity([1, 2], [-1, -2]); // -1.0 (opposite)
normalize(a: Vector): Float32Array
Normalizes a vector to unit length (L2 norm = 1). Returns a new Float32Array.
const normalized = normalize([3, 4]);
l2Norm(normalized); // 1.0
Search Operations
topKByCosine(query: Vector, corpus: Vector[], k: number): TopKResult
Finds the top-K vectors with highest cosine similarity to the query.
const query = [0.1, 0.2, 0.3];
const corpus = [
[0.2, 0.1, 0.0],
[0.05, 0.19, 0.3],
[0.3, 0.3, 0.3],
];
const { indices, scores } = topKByCosine(query, corpus, 2);
// indices: [1, 2] - corpus indices of top matches
// scores: [0.991, 0.926] - similarity scores
topKByDot(query: Vector, corpus: Vector[], k: number): TopKResult
Finds the top-K vectors with the highest dot product with the query. If the query and corpus vectors are unit-normalized, this ranking is equivalent to cosine similarity but skips the norm computation.
const { indices, scores } = topKByDot(query, corpus, 2);
Real-World Examples
Job Matching by Skills
import { topKByCosine } from '@thewoowon/vecx';
// Job seeker skills (embedded as vectors)
const candidateVector = [0.8, 0.6, 0.3, 0.9, 0.2];
// Job posting embeddings
const jobVectors = [
[0.7, 0.5, 0.4, 0.8, 0.1], // Backend Engineer
[0.9, 0.7, 0.2, 0.9, 0.3], // Full-stack Developer
[0.2, 0.3, 0.9, 0.1, 0.8], // Data Scientist
];
const { indices, scores } = topKByCosine(candidateVector, jobVectors, 3);
console.log('Best job matches:', indices.map((i) => jobVectors[i]));
Recommendation System
import { normalize, topKByDot } from '@thewoowon/vecx';
// User preference vector
const userPrefs = normalize([4, 5, 2, 3, 5]);
// Product embeddings
const products = [
normalize([4, 4, 3, 3, 4]),
normalize([5, 5, 1, 2, 5]),
normalize([2, 3, 5, 4, 2]),
];
const { indices } = topKByDot(userPrefs, products, 2);
console.log('Recommended products:', indices);
Semantic Search
import { cosineSimilarity } from '@thewoowon/vecx';
// Text embeddings (from OpenAI, Cohere, etc.)
// getEmbedding() and docs are placeholders for your embedding provider and document list
const queryEmbedding = getEmbedding('machine learning tutorial');
const docEmbeddings = docs.map(doc => getEmbedding(doc.text));
const similarities = docEmbeddings.map(docEmb =>
cosineSimilarity(queryEmbedding, docEmb)
);
const bestMatch = docs[similarities.indexOf(Math.max(...similarities))];
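For more than a handful of documents, the same search can be written with topKByCosine instead of mapping manually (getEmbedding and docs are the same placeholders as above):
import { topKByCosine } from '@thewoowon/vecx';
const { indices } = topKByCosine(queryEmbedding, docEmbeddings, 3);
const topDocs = indices.map((i) => docs[i]); // three closest documents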
Platform Support
vecx works seamlessly across all JavaScript environments:
| Platform | Status | Notes |
|----------|--------|-------|
| Browser | ✅ | Vanilla JS, React, Vue, Svelte |
| Node.js | ✅ | v14+ |
| React Native | ✅ | Including Hermes engine |
| Deno | ✅ | ESM imports |
| Bun | ✅ | Native ESM |
| Cloudflare Workers | ✅ | Edge computing |
| Electron | ✅ | Desktop apps |
Performance
vecx is optimized for real-world embedding dimensions (768-1536):
Benchmark: 10,000 vectors × 768 dimensions
cosineSimilarity: ~0.05ms per comparison
topKByCosine: ~500ms for k=10 (full scan)
normalize: ~0.02ms per vector
Benchmarks run on Apple M1, Node.js v20
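As a rough sanity check on your own hardware, a micro-benchmark along these lines can be used (a sketch only; it assumes a runtime with a global performance API, such as Node.js 16+ or a browser, and results will vary):
import { cosineSimilarity, topKByCosine } from '@thewoowon/vecx';
// Random 768-dimensional vectors, roughly matching common embedding sizes.
const DIM = 768;
const N = 10_000;
const randomVector = () => Float32Array.from({ length: DIM }, () => Math.random() - 0.5);
const query = randomVector();
const corpus = Array.from({ length: N }, randomVector);
let start = performance.now();
for (const v of corpus) cosineSimilarity(query, v);
console.log(`cosineSimilarity: ${((performance.now() - start) / N).toFixed(3)}ms per comparison`);
start = performance.now();
topKByCosine(query, corpus, 10);
console.log(`topKByCosine (k=10): ${(performance.now() - start).toFixed(1)}ms`);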
Why not mathjs / numeric.js / ml-matrix?
These are excellent libraries, but:
- Too heavy: 100KB+ after minification
- Too generic: Not optimized for embedding similarity
- Tree-shaking issues: Hard to import just one function
vecx is laser-focused on vector similarity for AI applications.
Roadmap
- ✅ v0.1: Core operations + top-K search
- 🚧 v0.2: Batch operations (cosineMatrix, topKBatch)
- 📋 v0.3: Optimized top-K with quickselect (O(n) expected)
- 📋 v0.4: SIMD optimizations for supported platforms
Contributing
Contributions are welcome! Please open an issue or PR.
License
MIT © thewoowon
Links
Built for the AI era 🤖
