encrypted-s3-store
A TypeScript library that provides transparent end-to-end encryption for AWS S3 objects using AES-256-GCM encryption with scrypt key derivation.
Features
- AES-256-GCM Encryption - Authenticated encryption for all stored objects
- Custom Encryption Support - Bring your own encryption/decryption functions
- Configurable Modes - Choose between encrypt (with encryption) or passthrough (without encryption)
- Scrypt Key Derivation - Secure memory-hard key derivation using Node.js built-in crypto
- Transparent Encryption/Decryption - Automatic handling during upload and download
- Full AWS SDK Compatibility - Wraps the official AWS S3 client
- TypeScript Support - Complete type definitions with strict mode
- Zero External Dependencies - Uses only Node.js built-in crypto modules for encryption
Installation
npm install @techbulls/encrypted-s3-store
or
pnpm install @techbulls/encrypted-s3-store
or
yarn add @techbulls/encrypted-s3-store
Quick Start
import { S3ObjectStore, S3Client } from '@techbulls/encrypted-s3-store';
// Create an AWS S3 client
const s3Client = new S3Client({ region: 'us-east-1' });
// Create the encrypted store wrapper
const store = new S3ObjectStore(s3Client, 'your-secret-encryption-key');
// Upload encrypted data
await store.upload({
Bucket: 'my-secure-bucket',
Key: 'secret-file.txt',
Body: 'This content will be encrypted automatically',
ContentType: 'text/plain',
});
// Download and decrypt data
const result = await store.download({
Bucket: 'my-secure-bucket',
Key: 'secret-file.txt',
});
// result.Body is a readable stream with decrypted content
Configuration
Constructor Parameters
The S3ObjectStore constructor accepts the following parameters:
| Parameter | Type | Required | Description |
| ------------------ | ------------------ | -------- | --------------------------------------------------------------------- |
| client | S3Client | Yes | An initialized AWS S3Client instance |
| key | string | Yes* | Encryption key for AES-256-GCM or custom encryption |
| mode | ModeType | No | Operation mode ('encrypt' or 'passthrough'). Default: 'encrypt' |
| customEncryption | CustomEncryption | No | Custom encrypt/decrypt functions to replace default AES-256-GCM |
*Required when using encrypt mode. Can also be provided via ENCRYPTION_KEY environment variable.
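For example, a store created with an explicit mode (the constructor arguments are positional, as in the Custom Encryption example later in this README; the region and key below are placeholders):
import { S3ObjectStore, S3Client } from '@techbulls/encrypted-s3-store';
const s3Client = new S3Client({ region: 'us-east-1' });
// client, key, mode (optional), customEncryption (optional), passed positionally
const store = new S3ObjectStore(
  s3Client,
  process.env.ENCRYPTION_KEY!, // or a literal key string
  'encrypt'                    // defaults to 'encrypt' when omitted
);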
Modes
- encrypt (default): Data is encrypted before upload and decrypted on download using AES-256-GCM.
- passthrough: Data is uploaded and downloaded without any encryption. Useful for non-sensitive data or when using S3's built-in encryption.
Environment Variables
The library supports the following environment variables:
| Variable | Description |
| ---------------- | ------------------------------------------------- |
| ENCRYPTION_KEY | Fallback encryption key if not provided in config |
API Reference
upload(input)
Encrypts and uploads data to S3.
await store.upload({
Bucket: 'my-bucket',
Key: 'path/to/file.txt',
Body: 'content to encrypt', // string, Buffer, or Uint8Array
ContentType: 'text/plain',
});
// Or with per-operation mode override
await store.upload({
Bucket: 'my-bucket',
Key: 'file.txt',
Body: 'content',
mode: 'passthrough', // Skip encryption for this upload
});
Parameters:
- input - IUpload object (extends PutObjectCommandInput)
  - Bucket (required) - S3 bucket name
  - Key (required) - S3 object key
  - Body (required) - Content to encrypt and upload (string, Buffer, or Uint8Array)
  - mode (optional) - Per-operation mode override ('encrypt' or 'passthrough')
  - All other PutObjectCommandInput options are supported
Returns: PutObjectCommandOutput from AWS SDK
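Because the return value is the standard PutObjectCommandOutput, the usual AWS response fields are available on it, for example:
const output = await store.upload({
  Bucket: 'my-bucket',
  Key: 'file.txt',
  Body: 'content',
});
console.log(output.ETag); // standard AWS SDK response field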
download(input)
Downloads and decrypts data from S3.
const result = await store.download({
Bucket: 'my-bucket',
Key: 'path/to/file.txt',
});
// Or with per-operation mode override
const result = await store.download({
Bucket: 'my-bucket',
Key: 'file.txt',
mode: 'passthrough', // Skip decryption for this download
});
// result.Body is a Readable stream with decrypted content
Parameters:
- input - IDownload object (extends GetObjectCommandInput)
  - Bucket (required) - S3 bucket name
  - Key (required) - S3 object key
  - mode (optional) - Per-operation mode override ('encrypt' or 'passthrough')
  - All other GetObjectCommandInput options are supported
Returns:
- Body - Decrypted readable stream
- ContentType - Original content type
- ContentLength - Content length
- Metadata - S3 object metadata
Examples
Basic Usage
import { S3ObjectStore, S3Client } from '@techbulls/encrypted-s3-store';
const s3Client = new S3Client({ region: 'us-west-2' });
const store = new S3ObjectStore(s3Client, process.env.ENCRYPTION_KEY!);
// Upload a JSON object
const data = { userId: 123, sensitiveInfo: 'secret' };
await store.upload({
Bucket: 'my-bucket',
Key: 'users/123/data.json',
Body: JSON.stringify(data),
ContentType: 'application/json',
});
Reading Downloaded Content
import { Readable } from 'stream';
const result = await store.download({
Bucket: 'my-bucket',
Key: 'users/123/data.json',
});
// Convert stream to string
const chunks: Buffer[] = [];
for await (const chunk of result.Body as Readable) {
chunks.push(Buffer.from(chunk));
}
const content = Buffer.concat(chunks).toString('utf-8');
const data = JSON.parse(content);
Using with Buffer
import { readFileSync } from 'fs';
// Upload binary data
const imageBuffer = readFileSync('photo.jpg');
await store.upload({
Bucket: 'my-bucket',
Key: 'images/photo.jpg',
Body: imageBuffer,
ContentType: 'image/jpeg',
});
Passthrough Mode (No Encryption)
// Use passthrough mode for non-sensitive data
const s3Client = new S3Client({ region: 'us-west-2' });
const store = new S3ObjectStore(s3Client, 'unused-key', 'passthrough');
// Data is uploaded without encryption
await store.upload({
Bucket: 'my-public-bucket',
Key: 'public/readme.txt',
Body: 'This content is not encrypted',
ContentType: 'text/plain',
});
Per-Operation Mode Override
const store = new S3ObjectStore(s3Client, 'my-secret-key'); // Default: encrypt
// This upload will skip encryption
await store.upload({
Bucket: 'my-bucket',
Key: 'public-file.txt',
Body: 'Not sensitive',
mode: 'passthrough',
});
// This upload will use encryption (default)
await store.upload({
Bucket: 'my-bucket',
Key: 'secret-file.txt',
Body: 'Sensitive data',
});
// Download without decryption
const result = await store.download({
Bucket: 'my-bucket',
Key: 'unencrypted-file.txt',
mode: 'passthrough',
});
Custom Encryption
You can provide your own encryption/decryption functions to replace the default AES-256-GCM implementation:
import { S3ObjectStore, CustomEncryption } from '@techbulls/encrypted-s3-store';
import { createCipheriv, createDecipheriv, randomBytes } from 'crypto';
const customEncryption: CustomEncryption = {
encrypt: (data: Buffer, key: string) => {
// Your custom encryption logic
const iv = randomBytes(16);
const cipher = createCipheriv('aes-256-cbc', Buffer.from(key.padEnd(32)), iv);
const encrypted = Buffer.concat([cipher.update(data), cipher.final()]);
return {
data: encrypted,
metadata: {
iv: iv.toString('base64'),
algorithm: 'aes-256-cbc',
},
};
},
decrypt: (data: Buffer, key: string, metadata) => {
// Your custom decryption logic using the metadata
const iv = Buffer.from(metadata.iv, 'base64');
const decipher = createDecipheriv('aes-256-cbc', Buffer.from(key.padEnd(32)), iv);
return Buffer.concat([decipher.update(data), decipher.final()]);
},
};
const store = new S3ObjectStore(s3Client, 'my-secret-key', 'encrypt', customEncryption);
// Upload using custom encryption
await store.upload({
Bucket: 'my-bucket',
Key: 'file.txt',
Body: 'Sensitive data',
});
// Download using custom decryption
const result = await store.download({
Bucket: 'my-bucket',
Key: 'file.txt',
});
Custom Encryption Interface:
interface CustomEncryption {
encrypt: (data: Buffer, key: string) => EncryptResult | Promise<EncryptResult>;
decrypt: (data: Buffer, key: string, metadata: EncryptionMetadata) => Buffer | Promise<Buffer>;
}
interface EncryptResult {
data: Buffer; // The encrypted data
metadata: EncryptionMetadata; // Metadata to store for decryption
}
interface EncryptionMetadata {
[key: string]: string; // Key-value pairs stored in S3 object metadata
}
Notes:
- Both encrypt and decrypt functions must be provided together
- Metadata keys are automatically prefixed with x-amz-custom- when stored in S3
- Custom encryption uses x-amz-encryption: 'custom' as the marker in S3 metadata
- Functions can be synchronous or asynchronous (return a Promise); see the sketch below
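As an illustration of the asynchronous form, here is a minimal sketch that wraps the same AES-256-CBC logic from above in async functions and derives the key with scrypt instead of padding the passphrase (names and parameter choices are illustrative, not part of the library):
import { CustomEncryption } from '@techbulls/encrypted-s3-store';
import { createCipheriv, createDecipheriv, randomBytes, scrypt } from 'crypto';
import { promisify } from 'util';
const scryptAsync = promisify(scrypt) as (password: string, salt: Buffer, keylen: number) => Promise<Buffer>;
const asyncEncryption: CustomEncryption = {
  encrypt: async (data: Buffer, key: string) => {
    const salt = randomBytes(16);
    const derivedKey = await scryptAsync(key, salt, 32); // derive a 32-byte AES key
    const iv = randomBytes(16);
    const cipher = createCipheriv('aes-256-cbc', derivedKey, iv);
    const encrypted = Buffer.concat([cipher.update(data), cipher.final()]);
    return {
      data: encrypted,
      metadata: {
        iv: iv.toString('base64'),
        salt: salt.toString('base64'),
      },
    };
  },
  decrypt: async (data: Buffer, key: string, metadata) => {
    const salt = Buffer.from(metadata.salt, 'base64');
    const derivedKey = await scryptAsync(key, salt, 32);
    const iv = Buffer.from(metadata.iv, 'base64');
    const decipher = createDecipheriv('aes-256-cbc', derivedKey, iv);
    return Buffer.concat([decipher.update(data), decipher.final()]);
  },
};
// Pass it to the constructor exactly like the synchronous version:
// new S3ObjectStore(s3Client, 'my-secret-key', 'encrypt', asyncEncryption);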
Environment-based Configuration
// Set environment variable:
// ENCRYPTION_KEY=your-secret-key
const s3Client = new S3Client({ region: 'us-east-1' });
const store = new S3ObjectStore(s3Client);
// encryptionKey will be read from the ENCRYPTION_KEY env var
Security
Encryption Details
- Algorithm: AES-256-GCM (Advanced Encryption Standard, 256-bit key, Galois/Counter Mode)
- IV: Random 12-byte initialization vector per upload
- Authentication: GCM authentication tag for integrity verification
Key Derivation
Uses scrypt with the following parameters:
- N (cost): 16384 (2^14)
- r (blockSize): 8
- p (parallelization): 1
- Output: 32 bytes (256-bit key)
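Conceptually, the default encryption pipeline is equivalent to the following Node.js crypto calls (a simplified sketch of the parameters listed above, not the library's exact internals; the salt length shown is illustrative):
import { createCipheriv, randomBytes, scryptSync } from 'crypto';
const salt = randomBytes(16);                                  // stored as x-amz-salt
const key = scryptSync('your-secret-encryption-key', salt, 32, { N: 16384, r: 8, p: 1 });
const iv = randomBytes(12);                                    // stored as x-amz-iv
const cipher = createCipheriv('aes-256-gcm', key, iv);
const ciphertext = Buffer.concat([cipher.update('plaintext'), cipher.final()]);
const authTag = cipher.getAuthTag();                           // stored as x-amz-auth-tag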
Metadata Storage
Encryption metadata is stored in S3 object metadata:
Default AES-256-GCM:
- x-amz-iv - Base64-encoded initialization vector
- x-amz-salt - Base64-encoded salt for key derivation
- x-amz-auth-tag - Base64-encoded authentication tag
- x-amz-encryption - Set to aes-256-gcm
Custom Encryption:
- x-amz-custom-* - Custom metadata keys (prefixed automatically)
- x-amz-encryption - Set to custom
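To inspect these keys on a stored object, you can head it with the underlying client. Note that HeadObjectCommand is imported from the AWS SDK here because this README does not list it among the package's re-exports, and the exact key casing in the returned Metadata map may vary:
import { HeadObjectCommand } from '@aws-sdk/client-s3';
const head = await store.client.send(
  new HeadObjectCommand({ Bucket: 'my-secure-bucket', Key: 'secret-file.txt' })
);
console.log(head.Metadata); // e.g. keys like x-amz-iv, x-amz-salt, x-amz-auth-tag, x-amz-encryption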
Error Handling
The library throws descriptive errors for common issues:
try {
await store.upload({ Bucket: 'my-bucket', Key: 'test.txt', Body: 'content' });
} catch (error) {
// Handle errors:
// - "Encryption key not provided."
// - "Body is required for upload"
// - "No body returned from S3"
}
TypeScript
The library is written in TypeScript and exports all types:
import { S3ObjectStore } from '@techbulls/encrypted-s3-store';
import type {
ModeType,
IUpload,
IDownload,
CustomEncryption,
EncryptFunction,
DecryptFunction,
EncryptResult,
EncryptionMetadata,
} from '@techbulls/encrypted-s3-store';
import type { PutObjectCommandInput, GetObjectCommandInput } from '@techbulls/encrypted-s3-store';
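These types can annotate your own wrappers. A small illustrative helper (not part of the library), assuming ModeType is the 'encrypt' | 'passthrough' union described above and store is an existing S3ObjectStore instance:
import type { IUpload, ModeType } from '@techbulls/encrypted-s3-store';
// Hypothetical helper that uploads JSON with an explicit mode
async function uploadJson(key: string, value: unknown, mode: ModeType = 'encrypt') {
  const input: IUpload = {
    Bucket: 'my-bucket',
    Key: key,
    Body: JSON.stringify(value),
    ContentType: 'application/json',
    mode,
  };
  return store.upload(input);
}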
Using the Underlying S3 Client
The S3ObjectStore wraps an S3 client, which you can access for standard S3 operations:
import { S3Client, ListObjectsV2Command, DeleteObjectCommand } from '@techbulls/encrypted-s3-store';
const s3Client = new S3Client({ region: 'us-east-1' });
const store = new S3ObjectStore(s3Client, 'my-key');
// List objects (use the underlying client directly)
const listResult = await store.client.send(
new ListObjectsV2Command({
Bucket: 'my-bucket',
Prefix: 'users/',
})
);
// Delete object
await store.client.send(
new DeleteObjectCommand({
Bucket: 'my-bucket',
Key: 'old-file.txt',
})
);
Requirements
- Node.js 18+
- AWS credentials with S3 access
Author
Aditya Chaphekar - [email protected]
License
Licensed under the MIT License. See LICENSE.md for details.
