Cryptiq
High-performance native cryptography library for Node.js, powered by Rust via NAPI-RS.
Features
- Hashing — SHA-256, SHA-512, BLAKE3, MD5 (legacy), Argon2id
- Streaming Hashing — Process large files chunk by chunk without loading them fully into memory
- AES-256-GCM — Authenticated encryption with automatic nonce management
- ChaCha20-Poly1305 — Authenticated encryption (software-optimized alternative to AES-GCM)
- AES-256-CTR — Stream cipher encryption (no authentication — use with external HMAC)
- Streaming Encryption — Encrypt/decrypt large files chunk by chunk over gRPC streams (all 3 algorithms)
- Async variants — Every function has an async counterpart that runs on the libuv thread pool (non-blocking)
- BLAKE3 Rayon parallelism — Async BLAKE3 hashing uses Rayon for multi-core parallel computation
- Key Derivation — Argon2id password-based key derivation (OWASP recommended parameters)
- Secure Random — Cryptographically secure random bytes, keys, nonces, IVs, and salts
Installation
npm install @abdo30004/cryptiq
# or
yarn add @abdo30004/cryptiq
Prebuilt native binaries are provided for Windows (x64), macOS (x64 & ARM64), and Linux (x64). No Rust toolchain is required for consumers.
Quick Start
import { aesGcm, chacha, aesCtr, hash, utils } from '@abdo30004/cryptiq'
// --- Hashing ---
const digest = hash.sha256(Buffer.from('hello world'))
const blake = hash.blake3(Buffer.from('hello world'))
// Async hashing (non-blocking)
const asyncDigest = await hash.sha256Async(Buffer.from('hello world'))
const asyncBlake = await hash.blake3Async(Buffer.from('large data...')) // uses Rayon multi-threading
// --- Password hashing ---
const phc = hash.argon2('my-password')
const valid = hash.verifyArgon2('my-password', phc)
// Async (non-blocking — ideal for servers)
const phcAsync = await hash.argon2Async('my-password')
const validAsync = await hash.verifyArgon2Async('my-password', phcAsync)
// --- Encryption ---
const key = utils.generateKey() // 32 random bytes
// AES-256-GCM (sync)
const encrypted = aesGcm.encrypt(Buffer.from('secret data'), key)
const decrypted = aesGcm.decrypt(encrypted, key)
// AES-256-GCM (async — runs on libuv thread pool)
const encryptedAsync = await aesGcm.encryptAsync(Buffer.from('secret data'), key)
const decryptedAsync = await aesGcm.decryptAsync(encryptedAsync, key)
// ChaCha20-Poly1305
const chachaEnc = chacha.encrypt(Buffer.from('secret'), key)
const chachaDec = chacha.decrypt(chachaEnc, key)
// AES-256-CTR (no authentication!)
const ctrEnc = aesCtr.encrypt(Buffer.from('secret'), key)
const ctrDec = aesCtr.decrypt(ctrEnc, key)
// --- Key derivation ---
const { key: derivedKey, salt } = utils.deriveKey('user-password')
const { key: sameKey } = utils.deriveKey('user-password', salt)
// Async key derivation
const { key: asyncKey, salt: asyncSalt } = await utils.deriveKeyAsync('user-password')
Namespaced API
All functions are organized into five namespaces:
| Namespace | Contents |
| --------- | ---------------------------------------------------------------------- |
| aesGcm | AES-256-GCM encrypt/decrypt (sync, async, streaming) |
| chacha | ChaCha20-Poly1305 encrypt/decrypt (sync, async, streaming) |
| aesCtr | AES-256-CTR encrypt/decrypt (sync, async, streaming) |
| hash | SHA-256, SHA-512, BLAKE3, MD5, Argon2 (sync, async, streaming hashers) |
| utils | Key generation, nonce/IV/salt generation, key derivation |
// Named imports
import { aesGcm, chacha, aesCtr, hash, utils } from '@abdo30004/cryptiq'
// Default import
import cryptiq from '@abdo30004/cryptiq'
cryptiq.aesGcm.encrypt(data, key)
API Reference
Hashing — hash
Sync
| Method | Returns |
| ----------------------------------- | ------------------------------------- |
| hash.sha256(data: Buffer) | string — 64-char hex |
| hash.sha512(data: Buffer) | string — 128-char hex |
| hash.blake3(data: Buffer) | string — 64-char hex |
| hash.md5(data: Buffer) | string — 32-char hex ⚠️ legacy only |
| hash.argon2(password, salt?) | string — PHC format |
| hash.verifyArgon2(password, hash) | boolean |
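Since these digests are the standard algorithms, one quick sanity check is to compare against Node's built-in crypto module. The sketch below uses `node:crypto` (not cryptiq itself) and assumes, as the table states, that `hash.sha256` implements standard SHA-256:

```typescript
// Sanity check with Node's built-in crypto: cryptiq's hash.sha256 should
// produce the same 64-character lowercase hex digest for the same input.
import { createHash } from 'node:crypto'

const digest = createHash('sha256').update(Buffer.from('hello world')).digest('hex')
// digest is a 64-character hex string, identical to hash.sha256(Buffer.from('hello world'))
```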
Async
All async variants run on the libuv thread pool and return a Promise. BLAKE3 async uses Rayon for multi-core parallelism.
| Method | Returns |
| ---------------------------------------- | ---------------------------------- |
| hash.sha256Async(data) | Promise<string> |
| hash.sha512Async(data) | Promise<string> |
| hash.blake3Async(data) | Promise<string> — Rayon parallel |
| hash.md5Async(data) | Promise<string> |
| hash.argon2Async(password, salt?) | Promise<string> |
| hash.verifyArgon2Async(password, hash) | Promise<boolean> |
Streaming Hashers
Process large files chunk by chunk. Available classes: hash.Sha256Hasher, hash.Sha512Hasher, hash.Blake3Hasher, hash.Md5Hasher.
| Method | Description |
| ----------------------------- | -------------------------------------------------- |
| constructor() | Creates a new hasher instance |
| update(chunk: Buffer): void | Feeds a chunk of data |
| digest(): string | Finalizes and returns hex hash (auto-resets) |
| digestBytes(): Buffer | Finalizes and returns raw hash bytes (auto-resets) |
| reset(): void | Discards state and resets |
import { createReadStream } from 'fs'
import { hash } from '@abdo30004/cryptiq'
const hasher = new hash.Sha256Hasher()
const stream = createReadStream('/path/to/large-file', { highWaterMark: 65536 })
for await (const chunk of stream) {
hasher.update(chunk)
}
const fileHash = hasher.digest()
Encryption Algorithms
Three algorithms, all sharing a 32-byte key:
| Algorithm         | Auth    | Overhead/msg       | Best For                                   |
| ----------------- | ------- | ------------------ | ------------------------------------------ |
| AES-256-GCM       | ✅ AEAD | 28 B (nonce + tag) | Hardware-accelerated environments (AES-NI) |
| ChaCha20-Poly1305 | ✅ AEAD | 28 B (nonce + tag) | Software-only environments, mobile, ARM    |
| AES-256-CTR       | ❌ None | 16 B (IV)          | When external HMAC is applied separately   |
Recommendation: Use AES-256-GCM or ChaCha20-Poly1305 unless you specifically need unauthenticated encryption with external integrity verification.
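If you do use the CTR mode, the "external HMAC" pattern is encrypt-then-MAC: compute an HMAC over the IV and ciphertext, and verify it before decrypting. The sketch below illustrates the idea with Node's built-in `node:crypto` rather than cryptiq's API; the separate MAC key is an assumption of the pattern, not something cryptiq manages for you.

```typescript
// Encrypt-then-MAC sketch with node:crypto: AES-256-CTR provides confidentiality
// only, so an HMAC over iv || ciphertext supplies the integrity check that
// GCM/Poly1305 would otherwise give you.
import { createCipheriv, createDecipheriv, createHmac, randomBytes, timingSafeEqual } from 'node:crypto'

const encKey = randomBytes(32) // encryption key
const macKey = randomBytes(32) // separate MAC key — never reuse the encryption key

// Encrypt, then MAC the iv || ciphertext
const iv = randomBytes(16)
const cipher = createCipheriv('aes-256-ctr', encKey, iv)
const ciphertext = Buffer.concat([cipher.update('secret data'), cipher.final()])
const tag = createHmac('sha256', macKey).update(Buffer.concat([iv, ciphertext])).digest()

// Verify the MAC in constant time *before* decrypting
const expected = createHmac('sha256', macKey).update(Buffer.concat([iv, ciphertext])).digest()
if (!timingSafeEqual(tag, expected)) throw new Error('MAC mismatch — reject ciphertext')

const decipher = createDecipheriv('aes-256-ctr', encKey, iv)
const plaintext = Buffer.concat([decipher.update(ciphertext), decipher.final()])
```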
AES-256-GCM — aesGcm
Output format: nonce (12 bytes) || ciphertext || auth tag (16 bytes)
| Method | Returns |
| ------------------------------------------------------ | ----------------- |
| aesGcm.encrypt(plaintext, key) | Buffer |
| aesGcm.decrypt(ciphertext, key) | Buffer |
| aesGcm.encryptWithNonce(plaintext, key, nonce) | Buffer |
| aesGcm.decryptWithNonce(ciphertext, key, nonce) | Buffer |
| aesGcm.encryptAsync(plaintext, key) | Promise<Buffer> |
| aesGcm.decryptAsync(ciphertext, key) | Promise<Buffer> |
| aesGcm.encryptWithNonceAsync(plaintext, key, nonce) | Promise<Buffer> |
| aesGcm.decryptWithNonceAsync(ciphertext, key, nonce) | Promise<Buffer> |
| new aesGcm.StreamEncryptor(key) | Stream encryptor |
| new aesGcm.StreamDecryptor(key) | Stream decryptor |
const key = utils.generateKey()
const encrypted = aesGcm.encrypt(Buffer.from('secret'), key)
const decrypted = aesGcm.decrypt(encrypted, key)
// Async (non-blocking)
const enc = await aesGcm.encryptAsync(Buffer.from('secret'), key)
const dec = await aesGcm.decryptAsync(enc, key)
ChaCha20-Poly1305 — chacha
Output format: nonce (12 bytes) || ciphertext || auth tag (16 bytes)
| Method | Returns |
| ----------------------------------------------------- | ----------------- |
| chacha.encrypt(data, key) | Buffer |
| chacha.decrypt(encrypted, key) | Buffer |
| chacha.encryptWithNonce(data, key, nonce) | Buffer |
| chacha.decryptWithNonce(encrypted, key, nonce) | Buffer |
| chacha.encryptAsync(data, key) | Promise<Buffer> |
| chacha.decryptAsync(encrypted, key) | Promise<Buffer> |
| chacha.encryptWithNonceAsync(data, key, nonce) | Promise<Buffer> |
| chacha.decryptWithNonceAsync(encrypted, key, nonce) | Promise<Buffer> |
| new chacha.StreamEncryptor(key) | Stream encryptor |
| new chacha.StreamDecryptor(key) | Stream decryptor |
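Because the output format above is the common nonce || ciphertext || tag layout, it can be illustrated with Node's built-in ChaCha20-Poly1305 as well. This is a sketch with `node:crypto`, not cryptiq's internals; note that Node requires `authTagLength` to be passed explicitly for this cipher.

```typescript
// Build and parse the nonce || ciphertext || tag layout with node:crypto's
// chacha20-poly1305 (12-byte nonce, 16-byte Poly1305 tag).
import { createCipheriv, createDecipheriv, randomBytes } from 'node:crypto'

const key = randomBytes(32)
const nonce = randomBytes(12)
const cipher = createCipheriv('chacha20-poly1305', key, nonce, { authTagLength: 16 })
const ct = Buffer.concat([cipher.update('secret'), cipher.final()])
const wire = Buffer.concat([nonce, ct, cipher.getAuthTag()]) // nonce || ciphertext || tag

// Peel the 12-byte nonce off the front and the 16-byte tag off the end
const decipher = createDecipheriv('chacha20-poly1305', key, wire.subarray(0, 12), { authTagLength: 16 })
decipher.setAuthTag(wire.subarray(wire.length - 16))
const plain = Buffer.concat([decipher.update(wire.subarray(12, wire.length - 16)), decipher.final()])
```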
AES-256-CTR — aesCtr
⚠️ No authentication. Ciphertext can be modified without detection.
Output format: iv (16 bytes) || ciphertext
| Method | Returns |
| ----------------------------------------------- | ----------------- |
| aesCtr.encrypt(data, key) | Buffer |
| aesCtr.decrypt(encrypted, key) | Buffer |
| aesCtr.encryptWithIv(data, key, iv) | Buffer |
| aesCtr.decryptWithIv(encrypted, key, iv) | Buffer |
| aesCtr.encryptAsync(data, key) | Promise<Buffer> |
| aesCtr.decryptAsync(encrypted, key) | Promise<Buffer> |
| aesCtr.encryptWithIvAsync(data, key, iv) | Promise<Buffer> |
| aesCtr.decryptWithIvAsync(encrypted, key, iv) | Promise<Buffer> |
| new aesCtr.StreamEncryptor(key) | Stream encryptor |
| new aesCtr.StreamDecryptor(key) | Stream decryptor |
Streaming Encryption
Designed for gRPC bidirectional streaming of large files. Each chunk is independently encrypted with counter-based nonces.
All streaming classes share the same API:
// Encrypt
const encryptor = new aesGcm.StreamEncryptor(key) // 32-byte Buffer
const encrypted = encryptor.encryptChunk(chunk) // Buffer in, Buffer out
encryptor.getChunkIndex() // number of chunks processed
// Decrypt
const decryptor = new aesGcm.StreamDecryptor(key)
const plaintext = decryptor.decryptChunk(encryptedChunk)
decryptor.getChunkIndex()
Properties:
- Counter-based nonces/IVs (deterministic — no nonce reuse risk)
- Each output chunk is self-contained
- Supports up to 2³² chunks per stream (~281 TB at 64KB chunks)
- Authenticated streams (GCM/Poly1305) verify nonce sequence — detect reordering/replay
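One common way to build such counter nonces is to fix a per-stream prefix and append the chunk index; cryptiq's exact internal layout is not documented here, so the sketch below is illustrative only. It shows why the scheme is deterministic, reuse-free within a stream, and capped at 2³² chunks:

```typescript
// Illustrative counter-nonce construction (NOT necessarily cryptiq's exact layout):
// an 8-byte per-stream prefix plus a 4-byte big-endian chunk counter yields a
// unique, deterministic 12-byte nonce per chunk, with a hard cap of 2^32 chunks.
import { randomBytes } from 'node:crypto'

function chunkNonce(prefix: Buffer, index: number): Buffer {
  const counter = Buffer.alloc(4)
  counter.writeUInt32BE(index) // chunk sequence number, 0 .. 2^32 - 1
  return Buffer.concat([prefix, counter]) // 8-byte prefix || 4-byte counter = 12 bytes
}

const prefix = randomBytes(8) // fixed for the lifetime of one stream
const n0 = chunkNonce(prefix, 0)
const n1 = chunkNonce(prefix, 1)
```

A decryptor that derives the same sequence can detect reordering or replay: a chunk encrypted under nonce 3 simply fails authentication when verified against the expected nonce 2.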
Utilities — utils
| Method | Returns | Description |
| --------------------------------------- | --------------------- | ----------------------------------- |
| utils.generateKey() | Buffer (32 bytes) | Random encryption key |
| utils.generateNonce() | Buffer (12 bytes) | Random nonce for AES-GCM / ChaCha20 |
| utils.generateIv() | Buffer (16 bytes) | Random IV for AES-CTR |
| utils.generateSalt() | Buffer (16 bytes) | Random salt for Argon2 |
| utils.secureRandomBytes(n) | Buffer (n bytes) | General-purpose random bytes |
| utils.deriveKey(password, salt?) | DerivedKey | Argon2id key derivation |
| utils.deriveKeyAsync(password, salt?) | Promise<DerivedKey> | Async key derivation |
interface DerivedKey {
key: Buffer // 32-byte derived key
salt: Buffer // 16-byte salt (store alongside encrypted data)
}
Async & Parallelism
Every sync function has an async counterpart (suffixed with Async) that offloads work to the libuv thread pool, keeping the Node.js event loop free.
| Feature | Implementation |
| ------------------------------- | ----------------------------------------------------- |
| Async encryption/decryption | napi-rs AsyncTask on libuv worker threads |
| Async hashing | napi-rs AsyncTask on libuv worker threads |
| BLAKE3 async | Rayon multi-threaded parallelism via update_rayon() |
| Async Argon2 | Offloaded to worker thread — ideal for login flows |
// Non-blocking encryption in an Express/NestJS handler
app.post('/encrypt', async (req, res) => {
const encrypted = await aesGcm.encryptAsync(req.body.data, key)
res.send(encrypted)
})
// Parallel operations with Promise.all
const [enc1, enc2, enc3, enc4] = await Promise.all([
aesGcm.encryptAsync(chunk1, key),
aesGcm.encryptAsync(chunk2, key),
aesGcm.encryptAsync(chunk3, key),
aesGcm.encryptAsync(chunk4, key),
])
NestJS gRPC Streaming Example
Proto Definition
service FileTransfer {
rpc Upload(stream FileChunk) returns (UploadResponse);
rpc Download(DownloadRequest) returns (stream FileChunk);
}
message FileChunk {
bytes data = 1; // encrypted chunk
uint32 index = 2; // chunk sequence number
}
Upload (Client → Server)
import { createReadStream } from 'fs'
import { aesGcm, hash, utils } from '@abdo30004/cryptiq'
async function uploadFile(client: FileTransferClient, filePath: string) {
const key = utils.generateKey()
const encryptor = new aesGcm.StreamEncryptor(key)
const hasher = new hash.Sha256Hasher()
const stream = createReadStream(filePath, { highWaterMark: 65536 })
const call = client.upload()
for await (const chunk of stream) {
hasher.update(chunk)
const encrypted = encryptor.encryptChunk(chunk)
call.write({ data: encrypted, index: encryptor.getChunkIndex() - 1 })
}
call.end()
const fileHash = hasher.digest()
console.log(`Uploaded ${encryptor.getChunkIndex()} chunks, hash: ${fileHash}`)
}
Download (Server → Client)
import { createWriteStream } from 'fs'
import { aesGcm, hash } from '@abdo30004/cryptiq'
async function downloadFile(client: FileTransferClient, fileId: string, key: Buffer, outputPath: string) {
const decryptor = new aesGcm.StreamDecryptor(key)
const hasher = new hash.Sha256Hasher()
const writeStream = createWriteStream(outputPath)
const call = client.download({ fileId })
for await (const message of call) {
const plaintext = decryptor.decryptChunk(message.data)
hasher.update(plaintext)
writeStream.write(plaintext)
}
writeStream.end()
const fileHash = hasher.digest()
console.log(`Downloaded ${decryptor.getChunkIndex()} chunks, hash: ${fileHash}`)
}
Chunk Wire Formats
AES-256-GCM / ChaCha20-Poly1305
┌──────────────┬────────────────────────┬──────────────────┐
│ Nonce (12B) │ Ciphertext (variable) │ Auth Tag (16B) │
└──────────────┴────────────────────────┴──────────────────┘
Overhead: 28 bytes per chunk (0.04% at 64KB chunks).
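This layout can be assembled and parsed with Node's built-in AES-256-GCM as well; the sketch below uses `node:crypto` rather than cryptiq, assuming only the byte layout diagrammed above (12-byte nonce, 16-byte tag):

```typescript
// Assemble the nonce || ciphertext || tag wire format with node:crypto AES-256-GCM.
import { createCipheriv, createDecipheriv, randomBytes } from 'node:crypto'

const key = randomBytes(32)
const nonce = randomBytes(12)
const cipher = createCipheriv('aes-256-gcm', key, nonce)
const ct = Buffer.concat([cipher.update('secret'), cipher.final()])
const wire = Buffer.concat([nonce, ct, cipher.getAuthTag()]) // 12B nonce || ct || 16B tag

// Parse it back: the nonce is the first 12 bytes, the tag the last 16
const n = wire.subarray(0, 12)
const tag = wire.subarray(wire.length - 16)
const body = wire.subarray(12, wire.length - 16)
const decipher = createDecipheriv('aes-256-gcm', key, n)
decipher.setAuthTag(tag)
const plain = Buffer.concat([decipher.update(body), decipher.final()])
```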
AES-256-CTR
┌────────────┬────────────────────────┐
│ IV (16B) │ Ciphertext (variable) │
└────────────┴────────────────────────┘
Overhead: 16 bytes per chunk. No authentication tag.
Security Notes
| Topic | Detail |
| --------------------- | --------------------------------------------------------------------------------------------------------- |
| AES-256-GCM | NIST-approved AEAD. Confidentiality + integrity. Hardware-accelerated via AES-NI. |
| ChaCha20-Poly1305 | IETF RFC 8439 AEAD. Constant-time in software. Preferred when AES-NI is unavailable. |
| AES-256-CTR | ⚠️ No authentication. Ciphertext is malleable. Use only with external HMAC. |
| Counter nonces | Streaming uses deterministic counter nonces — no nonce reuse risk. Max 2³² chunks per key. |
| Argon2id | OWASP-recommended password hashing. Resistant to GPU + side-channel attacks. 19 MiB memory, 2 iterations. |
| MD5 | ⚠️ Cryptographically broken. Legacy checksum compatibility only. |
| Random generation | OS CSPRNG (OsRng) — cryptographically secure on all platforms. |
| Memory safety | Written in Rust — no buffer overflows, use-after-free, or data races. |
Build from Source
yarn install # Install dependencies
yarn build # Build release binary
yarn test # Run test suite (75 tests)
yarn bench # Run benchmarks
yarn lint # Run linter
cargo fmt -- --check # Check Rust formatting
Supported Platforms
| Platform | Architecture | Package |
| -------- | --------------------- | ----------------------------------- |
| Windows | x64 | @abdo30004/cryptiq-win32-x64-msvc |
| macOS | x64 | @abdo30004/cryptiq-darwin-x64 |
| macOS | ARM64 (Apple Silicon) | @abdo30004/cryptiq-darwin-arm64 |
| Linux | x64 (glibc) | @abdo30004/cryptiq-linux-x64-gnu |
License
MIT
