`@hajewski/latticedb` v0.9.6
# LatticeDB TypeScript Bindings
TypeScript/Node.js bindings for LatticeDB, an embedded single-file property-graph database with native vector and BM25 full-text search.
## Installation

```sh
npm install @hajewski/latticedb
```

Published package tarballs are expected to bundle the native shared library for supported platforms.
If you are working from a source checkout, stage the native library into the package with:
```sh
export LATTICE_BUNDLE_LIB_DIR=/tmp/lattice-install/lib
npm run bundle:native
```

If `LATTICE_BUNDLE_LIB_DIR` / `LATTICE_BUNDLE_LIB_PATH` is not set, `npm run bundle:native` builds the current platform library with Zig.
At runtime, explicit library discovery overrides still work via `LATTICE_LIB_PATH`, `LATTICE_PREFIX`, and pkg-config.
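For example, a runtime override might look like the following (the paths are illustrative; adjust them for your install location and platform):

```shell
# Point directly at a specific shared library build
export LATTICE_LIB_PATH=/opt/lattice/lib/liblattice.so

# ...or let discovery search an installed prefix instead
export LATTICE_PREFIX=/opt/lattice
```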
Migration note: embedding helpers now live in the dedicated `@hajewski/latticedb/embedding` entrypoint. See `../../docs/client_api_migration.md` for the preferred API names and deprecated compatibility aliases.
Installed-prefix workflow:

```sh
zig build install --prefix /tmp/lattice-install
export LATTICE_PREFIX=/tmp/lattice-install
```

Alternatively, discovery can use pkg-config:

```sh
export PKG_CONFIG_PATH=/tmp/lattice-install/lib/pkgconfig
```

## Quick Start
```typescript
import { Database } from "@hajewski/latticedb";

const db = new Database("knowledge.db", {
  create: true,
  enableVectors: true,
  vectorDimensions: 4,
});
await db.open();

// Create nodes, edges, and index content
await db.write(async (txn) => {
  const alice = await txn.createNode({
    labels: ["Person"],
    properties: { name: "Alice", age: 30 },
  });
  const bob = await txn.createNode({
    labels: ["Person"],
    properties: { name: "Bob", age: 25 },
  });
  await txn.createEdge(alice.id, bob.id, "KNOWS");

  // Index text for full-text search
  await txn.ftsIndex(alice.id, "Alice works on machine learning research");
  await txn.ftsIndex(bob.id, "Bob studies deep learning and neural networks");

  // Store vector embeddings
  await txn.setVector(alice.id, "embedding", new Float32Array([1.0, 0.0, 0.0, 0.0]));
  await txn.setVector(bob.id, "embedding", new Float32Array([0.0, 1.0, 0.0, 0.0]));
});

// Query with Cypher
const result = await db.query(
  "MATCH (n:Person) WHERE n.age > 20 RETURN n.name, n.age"
);
for (const row of result.rows) {
  console.log(row);
}

// Vector similarity search
const results = await db.vectorSearch(new Float32Array([0.9, 0.1, 0.0, 0.0]), { k: 2 });
for (const r of results) {
  console.log(`Node ${r.nodeId}: distance=${r.distance.toFixed(4)}`);
}

// Full-text search
const ftsResults = await db.ftsSearch("machine learning");
for (const r of ftsResults) {
  console.log(`Node ${r.nodeId}: score=${r.score.toFixed(4)}`);
}

// Fuzzy search (typo-tolerant)
const fuzzyResults = await db.ftsSearchFuzzy("machin lerning");
for (const r of fuzzyResults) {
  console.log(`Node ${r.nodeId}: score=${r.score.toFixed(4)}`);
}

await db.close();
```

## Features
- Property Graph - Nodes and edges with labels and properties
- Vector Search - HNSW-based k-NN search for embeddings
- Full-Text Search - BM25-ranked search with tokenization
- Fuzzy Search - Typo-tolerant full-text search with configurable edit distance
- Bulk Vector Insertion - Efficient insertion of vector-bearing nodes
- Embeddings - Built-in hash embeddings and HTTP client for external services
- Cypher Queries - Pattern matching with `<=>` (vector) and `@@` (FTS) extensions
- Transactions - ACID-compliant read/write transactions
- Query Cache - Automatic caching of parsed queries
- TypeScript - Full type definitions included
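Full-text results are ranked with BM25. The exact tokenizer and parameters LatticeDB uses are internal to the library, but the standard Okapi formulation it is based on can be sketched as follows (`k1` and `b` are the conventional defaults; illustrative only):

```typescript
// Sketch of Okapi BM25 ranking over a toy in-memory corpus.
const K1 = 1.2; // term-frequency saturation
const B = 0.75; // document-length normalization

function tokenize(text: string): string[] {
  return text.toLowerCase().split(/\W+/).filter(Boolean);
}

function bm25Score(query: string, docs: string[], docIndex: number): number {
  const tokenized = docs.map(tokenize);
  const avgLen = tokenized.reduce((sum, d) => sum + d.length, 0) / docs.length;
  const doc = tokenized[docIndex];
  let score = 0;
  for (const term of tokenize(query)) {
    const df = tokenized.filter((d) => d.includes(term)).length; // docs containing term
    if (df === 0) continue;
    const idf = Math.log((docs.length - df + 0.5) / (df + 0.5) + 1);
    const tf = doc.filter((t) => t === term).length;
    score += (idf * tf * (K1 + 1)) / (tf + K1 * (1 - B + (B * doc.length) / avgLen));
  }
  return score;
}

const docs = [
  "Alice works on machine learning research",
  "Bob studies deep learning and neural networks",
];
// The first document matches both query terms, so it ranks higher.
const scores = docs.map((_, i) => bm25Score("machine learning", docs, i));
```

Rare terms get a larger IDF weight, repeated terms saturate via `k1`, and long documents are penalized via `b` — which is why `ftsSearch` scores are comparative rather than absolute.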
## API Reference

### Database

```typescript
const db = new Database(path: string, options?: DatabaseOptions);

interface DatabaseOptions {
  create?: boolean;          // Create if not exists (default: false)
  readOnly?: boolean;        // Open read-only (default: false)
  cacheSizeMb?: number;      // Cache size in MB (default: 100)
  enableVectors?: boolean;   // Preferred vector config flag
  enableVector?: boolean;    // Deprecated compatibility alias
  vectorDimensions?: number; // Vector dimensions (default: 128)
}
```

#### Methods
- `await db.open()` - Open the database connection
- `await db.close()` - Close the database connection
- `await db.read(fn)` - Execute a read-only transaction
- `await db.write(fn)` - Execute a read-write transaction
- `await db.query(cypher, params?)` - Execute a Cypher query
- `await db.vectorSearch(vector, options?)` - k-NN vector search
- `await db.ftsSearch(query, options?)` - Full-text search
- `await db.ftsSearchFuzzy(query, options?)` - Fuzzy full-text search
- `await db.readStream(stream, options?)` - Read durable stream records by cursor
- `await db.getStreamOffset(stream, consumer)` - Read a committed consumer offset
- `await db.changes(options?)` - Read the built-in graph changefeed
- `await db.cacheClear()` - Clear the query cache
- `await db.cacheStats()` - Get cache hit/miss statistics
- `db.isOpen()` - Check if database is open
- `db.getPath()` - Get database file path
### Transaction

#### Read Operations

- `await txn.getNode(nodeId)` - Get a node by ID, returns `Node` or `null`
- `await txn.nodeExists(nodeId)` - Check if a node exists
- `await txn.getProperty(nodeId, key)` - Get a property value
- `await txn.getOutgoingEdges(nodeId)` - Get outgoing edges from a node
- `await txn.getIncomingEdges(nodeId)` - Get incoming edges to a node
- `txn.isReadOnly()` / `txn.isActive()` - Transaction state
#### Write Operations

- `await txn.createNode({ labels, properties })` - Create a node
- `await txn.deleteNode(nodeId)` - Delete a node
- `await txn.setProperty(nodeId, key, value)` - Set a property
- `await txn.setVector(nodeId, key, vector)` - Set a vector embedding
- `await txn.batchInsertVectors(label, vectors)` - Insert vector-bearing nodes in one call
- `await txn.batchInsert(label, vectors)` - Deprecated compatibility alias for `batchInsertVectors`
- `await txn.ftsIndex(nodeId, text)` - Index text for full-text search
- `await txn.createEdge(sourceId, targetId, edgeType, options?)` - Create an edge
- `await txn.deleteEdge(sourceId, targetId, edgeType)` - Delete an edge
- `await txn.setEdgeProperty(edgeId, key, value)` - Set an edge property by stable edge ID
- `await txn.getEdgeProperty(edgeId, key)` - Get an edge property by stable edge ID
- `await txn.removeEdgeProperty(edgeId, key)` - Remove an edge property by stable edge ID
- `txn.publishStream(stream, payload, kind?)` - Publish a durable stream record
- `txn.setStreamOffset(stream, consumer, sequence)` - Commit a durable consumer offset
- `txn.trimStream(stream, throughSequence)` - Delete stream records through a sequence
- `txn.commit()` / `txn.rollback()` - Commit or rollback
## Bulk Vector Insertion
Insert many nodes with vectors in a single efficient call:
```typescript
import { Database } from "@hajewski/latticedb";

const db = new Database("vectors.db", {
  create: true,
  enableVectors: true,
  vectorDimensions: 128,
});
await db.open();

await db.write(async (txn) => {
  const vectors = Array.from({ length: 1000 }, () =>
    Float32Array.from({ length: 128 }, () => Math.random())
  );
  const nodeIds = await txn.batchInsertVectors("Document", vectors);
  console.log(`Created ${nodeIds.length} nodes`);
});

await db.close();
```

## Full-Text Search
### Exact Search
```typescript
const results = await db.ftsSearch("machine learning", { limit: 10 });
for (const r of results) {
  console.log(`Node ${r.nodeId}: score=${r.score.toFixed(4)}`);
}
```

### Fuzzy Search (Typo-Tolerant)
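Fuzzy matching tolerates misspellings up to a bounded edit distance. As background for the `maxDistance` option, here is a minimal Levenshtein-distance sketch (illustrative of the metric only, not the library's internal matcher):

```typescript
// Levenshtein edit distance: minimum number of single-character
// insertions, deletions, and substitutions to turn `a` into `b`.
function levenshtein(a: string, b: string): number {
  // prev[j] holds the distance between a.slice(0, i - 1) and b.slice(0, j).
  let prev = Array.from({ length: b.length + 1 }, (_, j) => j);
  for (let i = 1; i <= a.length; i++) {
    const curr = [i];
    for (let j = 1; j <= b.length; j++) {
      const substitutionCost = a[i - 1] === b[j - 1] ? 0 : 1;
      curr[j] = Math.min(
        prev[j] + 1, // deletion
        curr[j - 1] + 1, // insertion
        prev[j - 1] + substitutionCost // substitution
      );
    }
    prev = curr;
  }
  return prev[b.length];
}
```

Roughly speaking, a misspelled query term can match an indexed term when this distance is within `maxDistance`, so `"machne"` (distance 1 from `"machine"`) still finds documents about machine learning.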
```typescript
// Finds "machine learning" even with typos
const results = await db.ftsSearchFuzzy("machne lerning", { limit: 10 });

// Control fuzzy matching sensitivity
const precise = await db.ftsSearchFuzzy("machne", {
  limit: 10,
  maxDistance: 2,   // Max edit distance (default: 0 = auto)
  minTermLength: 4, // Min term length for fuzzy matching (default: 0 = auto)
});
```

## Embeddings
LatticeDB includes a built-in hash embedding function and an HTTP client for external embedding services. For new code, prefer the dedicated `@hajewski/latticedb/embedding` entrypoint. The package root still exposes deprecated compatibility aliases.
### Hash Embeddings (Built-in)
Deterministic, no external service needed. Useful for testing or simple keyword-based similarity:
```typescript
import { hashEmbed } from "@hajewski/latticedb/embedding";

const vec = hashEmbed("hello world", 128);
console.log(vec.length); // 128
```

### HTTP Embedding Client
Connect to Ollama, OpenAI, or compatible APIs:
```typescript
import { EmbeddingClient, EmbeddingApiFormat } from "@hajewski/latticedb/embedding";

// Ollama (default)
const client = new EmbeddingClient({
  endpoint: "http://localhost:11434",
});
const vec = client.embed("hello world");
client.close();

// OpenAI-compatible API
const openaiClient = new EmbeddingClient({
  endpoint: "https://api.openai.com/v1",
  model: "text-embedding-3-small",
  apiFormat: EmbeddingApiFormat.OpenAI,
  apiKey: "sk-...",
});
const embedding = openaiClient.embed("hello world");
openaiClient.close();
```

## Edge Traversal
```typescript
await db.read(async (txn) => {
  const outgoing = await txn.getOutgoingEdges(nodeId);
  for (const edge of outgoing) {
    console.log(`${edge.sourceId} --[${edge.type}]--> ${edge.targetId}`);
  }

  const incoming = await txn.getIncomingEdges(nodeId);
  for (const edge of incoming) {
    console.log(`${edge.sourceId} --[${edge.type}]--> ${edge.targetId}`);
  }
});
```

## Cypher Queries
```typescript
// Pattern matching
const names = await db.query("MATCH (n:Person) RETURN n.name");

// With parameters
const alice = await db.query(
  "MATCH (n:Person) WHERE n.name = $name RETURN n",
  { name: "Alice" }
);

// Vector similarity in Cypher
const near = await db.query(
  "MATCH (n:Document) WHERE n.embedding <=> $vec < 0.5 RETURN n.title",
  { vec: new Float32Array([0.1, 0.2, 0.3, 0.4]) }
);

// Full-text search in Cypher
const matches = await db.query(
  'MATCH (n:Document) WHERE n.content @@ "machine learning" RETURN n.title'
);

// Data mutation
await db.query('CREATE (n:Person {name: "Charlie", age: 35})');
await db.query('MATCH (n:Person {name: "Charlie"}) SET n.age = 36');
await db.query('MATCH (n:Person {name: "Charlie"}) DETACH DELETE n');
```

## Query Cache
```typescript
// Get cache statistics
const stats = await db.cacheStats();
console.log(
  `Entries: ${stats.entries}, Hits: ${stats.hits}, Misses: ${stats.misses}`
);

// Clear the cache
await db.cacheClear();
```

## Durable Streams and Changefeeds
Streams are durable named event logs stored inside the database file. Records are published in write transactions, sequence numbers are per stream, and reads use an explicit cursor. Reads do not acknowledge records; commit offsets separately when your consumer has processed a batch.
```typescript
const db = new Database("events.db", { create: true });
await db.open();

await db.write(async (txn) => {
  txn.publishStream("jobs", { id: 1, status: "queued" }, "job.queued");
});

const records = await db.readStream("jobs", {
  afterSequence: 0n,
  limit: 100,
  timeoutMs: 0,
});

await db.write(async (txn) => {
  txn.setStreamOffset("jobs", "worker-a", records.at(-1)!.sequence);
  txn.trimStream("jobs", records.at(-1)!.sequence - 1n);
});
```

`db.changes()` reads the reserved `__lattice_changes` stream. It emits semantic graph events such as `node.insert`, `node.property_set`, `edge.delete`, and `edge.property_remove`, with payloads represented as normal TypeScript values.
## Supported Property Types
- `null` - Null value
- `boolean` - Boolean
- `number` - Integer or float
- `string` - UTF-8 string
- `Uint8Array` - Binary data
- `Float32Array` - Vector embeddings
Nested arrays/objects are not currently exposed by the public bindings/C API.
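For reference, the supported values map onto a narrow TypeScript union. The union name and guard below are hypothetical helpers, not exports of the package; they show how to validate a value before handing it to a `setProperty`-style API:

```typescript
// Hypothetical union of the property value types listed above.
type PropertyValue =
  | null
  | boolean
  | number
  | string
  | Uint8Array // binary data
  | Float32Array; // vector embeddings

// Narrow an unknown value to the supported property types.
// Note: plain arrays and objects fail the check, since nested
// collections are not exposed by the public bindings.
function isPropertyValue(v: unknown): v is PropertyValue {
  return (
    v === null ||
    typeof v === "boolean" ||
    typeof v === "number" ||
    typeof v === "string" ||
    v instanceof Uint8Array ||
    v instanceof Float32Array
  );
}
```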
## Error Handling
```typescript
import { Database, isLibraryAvailable } from "@hajewski/latticedb";

// Check if the native library is available
if (!isLibraryAvailable()) {
  console.error("LatticeDB native library not found");
  process.exit(1);
}

try {
  const db = new Database("test.db", { create: true });
  await db.open();
  // ...
  await db.close();
} catch (error) {
  console.error("Database error:", error);
}
```

## Building from Source
Requires Node.js 18+ and the LatticeDB native library.
```sh
# From the latticedb root directory
zig build shared

# Build the TypeScript bindings
cd bindings/typescript
npm install
npm run build

# Run tests
npm test
```

## Requirements
- Node.js 18+
- The native LatticeDB library (`liblattice.dylib` / `liblattice.so`)
## License
MIT
