@akadenia/azure-storage
v2.6.0
Microsoft Azure storage helper methods
A comprehensive TypeScript library providing simplified helper methods for Microsoft Azure Storage services including Blob Storage, Table Storage, and Queue Storage.
Documentation • GitHub • Issues
Features
- Blob Storage: Upload, download, list, and manage blobs with SAS URL support
- Table Storage: Complete CRUD operations for Azure Table Storage
- Queue Storage: Send, receive, and manage queue messages
- Managed Identity Support: Passwordless authentication using Azure Managed Identity
- TypeScript Support: Full type definitions included
- SAS URL Support: Generate and use Shared Access Signature URLs for secure access
- Error Handling: Built-in error handling and validation
Table of Contents
- Installation
- Quick Start
- Blob Storage
- Table Storage
- Queue Storage
- Configuration
- Error Handling
- Best Practices
- API Reference
- Contributing
- License
Installation
npm install @akadenia/azure-storage --save
Quick Start
Prerequisites
You can authenticate using either:
- Connection String: Get this from the Azure Portal under your storage account's "Access keys" section
- Managed Identity: For applications running in Azure (recommended for production)
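Connection strings are semicolon-separated Key=Value pairs. As a quick illustration (the helper below is hypothetical and not part of @akadenia/azure-storage), the account name can be pulled out for logging or diagnostics:

```typescript
// Hypothetical helper (not part of @akadenia/azure-storage): extract the
// account name from a connection string for logging or diagnostics.
function parseAccountName(connectionString: string): string | undefined {
  // Connection strings are semicolon-separated Key=Value pairs; values
  // (like AccountKey) may themselves contain '=' characters.
  for (const pair of connectionString.split(';')) {
    const eq = pair.indexOf('=');
    if (eq === -1) continue;
    if (pair.slice(0, eq) === 'AccountName') {
      return pair.slice(eq + 1);
    }
  }
  return undefined;
}
```

Never log the full connection string itself, since it contains the account key.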
import { BlobStorage, TableStorage, QueueStorage } from '@akadenia/azure-storage';
// Option 1: Using connection string
const connectionString = "DefaultEndpointsProtocol=https;AccountName=yourstorageaccount;AccountKey=yourkey;EndpointSuffix=core.windows.net";
const blobStorage = new BlobStorage(connectionString);
const tableStorage = new TableStorage(connectionString, 'tableName');
const queueStorage = new QueueStorage(connectionString);
// Option 2: Using managed identity (recommended for production)
const blobStorage = new BlobStorage({
accountName: 'yourstorageaccount'
});
const tableStorage = new TableStorage({
accountName: 'yourstorageaccount',
tableName: 'tableName'
});
const queueStorage = new QueueStorage({
accountName: 'yourstorageaccount'
});
Blob Storage
The BlobStorage class provides methods to interact with Azure Blob Storage.
Basic Setup
import { BlobStorage } from '@akadenia/azure-storage';
const blobStorage = new BlobStorage(connectionString);
Container Operations
// Create a container
const containerCreated = await blobStorage.createContainer('my-container');
console.log('Container created:', containerCreated);
// Delete a container
const containerDeleted = await blobStorage.deleteContainer('my-container');
console.log('Container deleted:', containerDeleted);
Upload Operations
// Upload data from Buffer
const data = Buffer.from('Hello, Azure Blob Storage!');
const uploaded = await blobStorage.uploadData(
'my-container',
'my-blob.txt',
data,
{ blobContentType: 'text/plain' }
);
console.log('Upload successful:', uploaded);
// Upload from stream
import { Readable } from 'stream';
const stream = Readable.from(['Stream data content']);
const streamUploaded = await blobStorage.uploadStream(
'my-container',
'stream-blob.txt',
stream,
{ blobContentType: 'text/plain' }
);
// Upload with custom headers
const uploadedWithHeaders = await blobStorage.uploadData(
'my-container',
'document.pdf',
pdfBuffer,
{
blobContentType: 'application/pdf',
blobCacheControl: 'max-age=3600',
blobContentEncoding: 'gzip'
}
);
Download Operations
// Download blob as Buffer
const blobData = await blobStorage.downloadBlob('my-container', 'my-blob.txt');
console.log('Downloaded content:', blobData.toString());
// Check if blob exists
const exists = await blobStorage.blobExists('my-container', 'my-blob.txt');
console.log('Blob exists:', exists);
List Operations
// List all blobs with a prefix
const blobs = await blobStorage.listBlobs('my-container', 'documents/');
blobs.forEach(blob => {
console.log(`Blob: ${blob.name}, Size: ${blob.properties.contentLength}`);
});
Delete Operations
// Delete a blob
const deleted = await blobStorage.deleteBlob('my-container', 'my-blob.txt');
console.log('Blob deleted:', deleted);
SAS URL Generation
The generateSASUrl method supports two types of SAS tokens:
- User Delegation SAS (recommended): Used when authenticating with managed identity - more secure, no account keys required
- Service SAS: Used when authenticating with connection string - uses account key
import { BlobPermissions } from '@akadenia/azure-storage';
// Generate SAS URL for read access
const sasOptions = {
startsOn: new Date(),
expiresOn: new Date(Date.now() + 3600 * 1000), // 1 hour
permissions: [BlobPermissions.READ]
};
const sasUrl = await blobStorage.generateSASUrl('my-container', 'my-blob.txt', sasOptions);
console.log('SAS URL:', sasUrl.fullUrlWithSAS);
// Use SAS URL for secure access
const sasBlobStorage = new BlobStorage(sasUrl.fullUrlWithSAS);
const secureData = await sasBlobStorage.downloadBlob('my-container', 'my-blob.txt');
Advanced SAS Examples
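SAS validity windows are plain Date arithmetic. As a general Azure practice (not specific to this library), backdating startsOn by a few minutes tolerates clock skew between the client and the storage service; a sketch:

```typescript
// Sketch: build a SAS validity window that starts slightly in the past
// to tolerate clock skew (a general Azure recommendation, not a
// requirement of this library).
function makeSasWindow(lifetimeMinutes: number, skewMinutes = 5) {
  const now = Date.now();
  return {
    startsOn: new Date(now - skewMinutes * 60 * 1000),
    expiresOn: new Date(now + lifetimeMinutes * 60 * 1000),
  };
}

const window = makeSasWindow(60); // valid for 1 hour, backdated 5 minutes
```

The result can be spread into the options object, e.g. `{ ...makeSasWindow(60), permissions: [BlobPermissions.READ] }`.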
// Generate container-level SAS with write permissions
const containerSasOptions = {
permissions: [BlobPermissions.ADD, BlobPermissions.WRITE],
expiresOn: new Date(Date.now() + 24 * 3600 * 1000) // 24 hours
};
const containerSas = await blobStorage.generateSASUrl('my-container', undefined, containerSasOptions);
const sasClient = new BlobStorage(containerSas.fullUrlWithSAS);
// Upload using SAS URL
await sasClient.uploadData('my-container', 'secure-upload.txt', Buffer.from('Secure content'));
User Delegation SAS with Managed Identity
When using managed identity, generateSASUrl automatically uses User Delegation SAS (more secure):
// Using managed identity - generates User Delegation SAS
const blobStorage = new BlobStorage({
accountName: 'mystorageaccount'
});
const sasUrl = await blobStorage.generateSASUrl('my-container', 'my-blob.txt', {
permissions: [BlobPermissions.WRITE],
expiresOn: new Date(Date.now() + 3600 * 1000)
});
Table Storage
The TableStorage class provides methods to interact with Azure Table Storage.
Basic Setup
import { TableStorage, ITableEntity } from '@akadenia/azure-storage';
const tableStorage = new TableStorage(connectionString, 'MyTable');
Table Management
// Create a table
const tableCreated = await tableStorage.createTable();
console.log('Table created:', tableCreated);
// Delete a table
const tableDeleted = await tableStorage.deleteTable();
console.log('Table deleted:', tableDeleted);
Entity Operations
// Define an entity
interface UserEntity extends ITableEntity {
partitionKey: string;
rowKey: string;
name: string;
email: string;
age: number;
isActive: boolean;
}
// Insert an entity
const user: UserEntity = {
partitionKey: 'users',
rowKey: 'user-123',
name: 'John Doe',
email: '[email protected]',
age: 30,
isActive: true
};
const inserted = await tableStorage.insert(user);
console.log('User inserted:', inserted);
// Get an entity
const retrievedUser = await tableStorage.get('users', 'user-123');
console.log('Retrieved user:', retrievedUser);
// Update an entity
user.age = 31;
user.name = 'John Smith';
const updated = await tableStorage.update(user);
console.log('User updated:', updated);
// Upsert an entity (insert or update)
const newUser: UserEntity = {
partitionKey: 'users',
rowKey: 'user-456',
name: 'Jane Doe',
email: '[email protected]',
age: 25,
isActive: true
};
const upserted = await tableStorage.upsert(newUser);
console.log('User upserted:', upserted);
// Delete an entity
const deleted = await tableStorage.delete('users', 'user-123');
console.log('User deleted:', deleted);
List Operations
// List all entities
const allUsers = await tableStorage.list<UserEntity>();
console.log('All users:', allUsers);
// List with options
const usersWithOptions = await tableStorage.list<UserEntity>({
queryOptions: { filter: "age gt 25" }
});
console.log('Users over 25:', usersWithOptions);
Advanced Table Examples
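The filter strings above use OData syntax. When interpolating string values, single quotes must be doubled (an OData escaping rule, independent of this library); a hypothetical helper:

```typescript
// Hypothetical helper: build an OData equality filter, doubling single
// quotes inside the value (the OData escaping rule for string literals).
function odataEquals(field: string, value: string): string {
  return `${field} eq '${value.replace(/'/g, "''")}'`;
}

odataEquals('name', "O'Brien"); // "name eq 'O''Brien'"
```

Used as, e.g., `tableStorage.list({ queryOptions: { filter: odataEquals('name', "O'Brien") } })`.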
// Batch operations example
const users = [
{ partitionKey: 'users', rowKey: 'user-1', name: 'Alice', age: 28 },
{ partitionKey: 'users', rowKey: 'user-2', name: 'Bob', age: 32 },
{ partitionKey: 'users', rowKey: 'user-3', name: 'Charlie', age: 24 }
];
// Insert multiple users
for (const user of users) {
await tableStorage.insert(user);
}
// Query with filters
const youngUsers = await tableStorage.list<UserEntity>({
queryOptions: { filter: "age lt 30" }
});
// Get table client for advanced operations
const tableClient = tableStorage.getTableClient();
// Use tableClient for more complex operations
Queue Storage
The QueueStorage class provides methods to interact with Azure Queue Storage.
Basic Setup
import { QueueStorage } from '@akadenia/azure-storage';
// Using connection string
const queueStorage = new QueueStorage(connectionString);
// Using managed identity (recommended for production)
const queueStorage = new QueueStorage({
accountName: 'yourstorageaccount'
});
Message Operations
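Messages are base64 encoded by default. The hypothetical helpers below mirror that default (JSON stringify, then base64) and its reverse, the same two steps undone in the receive loop further down:

```typescript
// Hypothetical helpers mirroring the default encoding: objects are JSON
// stringified, then base64 encoded; receivers reverse the two steps.
function encodeQueueMessage(payload: unknown): string {
  return Buffer.from(JSON.stringify(payload)).toString('base64');
}

function decodeQueueMessage<T>(messageText: string): T {
  return JSON.parse(Buffer.from(messageText, 'base64').toString('utf-8')) as T;
}
```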
// Send a string message (auto base64 encoded by default)
const response = await queueStorage.sendMessage('my-queue', 'Hello, Queue Storage!');
console.log('Message ID:', response.messageId);
// Send an object message (auto JSON stringified and base64 encoded)
const orderMessage = {
orderNumber: '12345',
action: 'process',
timestamp: new Date().toISOString()
};
await queueStorage.sendMessage('my-queue', orderMessage);
// Send message without base64 encoding (if needed)
await queueStorage.sendMessage('my-queue', 'Plain text message', false);
// Receive messages from queue
const receiveResponse = await queueStorage.receiveMessages('my-queue', 5, 30);
console.log('Received:', receiveResponse.receivedMessageItems.length, 'messages');
// Process and delete messages
for (const message of receiveResponse.receivedMessageItems) {
// Decode base64 message
const decodedText = Buffer.from(message.messageText, 'base64').toString('utf-8');
const messageData = JSON.parse(decodedText);
console.log('Processing order:', messageData.orderNumber);
// Delete message after processing
await queueStorage.deleteMessage('my-queue', message.messageId, message.popReceipt);
}
Queue Management
// Create a queue if it doesn't exist
await queueStorage.createQueue('my-new-queue');
// Check if queue exists
const exists = await queueStorage.queueExists('my-queue');
console.log('Queue exists:', exists);
// Get approximate message count
const messageCount = await queueStorage.getMessageCount('my-queue');
console.log('Approximate messages in queue:', messageCount);
// Clear all messages from queue
await queueStorage.clearMessages('my-queue');
// Delete a queue
await queueStorage.deleteQueue('old-queue');
Peek Messages (Without Removing)
// Peek at messages without removing them from the queue
const peekResponse = await queueStorage.peekMessages('my-queue', 5);
for (const message of peekResponse.peekedMessageItems) {
console.log('Peeked message:', message.messageText);
}
Advanced Queue Examples
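A message that repeatedly fails processing keeps reappearing after its visibility timeout. Received messages expose a dequeueCount property (standard in the Azure queue SDK), which consumers commonly use to cap retries; a sketch of that decision:

```typescript
// Sketch: received queue messages carry dequeueCount (how many times the
// message has been delivered). A common pattern treats messages past a
// retry cap as "poison" and routes them aside instead of retrying forever.
function classifyMessage(dequeueCount: number, maxAttempts = 5): 'process' | 'poison' {
  return dequeueCount > maxAttempts ? 'poison' : 'process';
}
```

A poison message would typically be copied to a separate queue or log before being deleted from the main queue.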
// Get underlying QueueClient for advanced operations
const queueClient = queueStorage.getQueueClient('my-queue');
// Use QueueClient directly for custom operations
await queueClient.sendMessage('Custom message', {
visibilityTimeout: 30, // Hide for 30 seconds
messageTimeToLive: 3600 // Expire after 1 hour
});
// Integration example: Order processing pipeline
class OrderProcessor {
private queueStorage: QueueStorage;
constructor(connectionString: string) {
this.queueStorage = new QueueStorage(connectionString);
}
async queueOrder(orderNumber: string): Promise<void> {
const message = { orderNumber, timestamp: Date.now() };
await this.queueStorage.sendMessage('orders-to-process', message);
console.log(`Order ${orderNumber} queued for processing`);
}
async processOrders(): Promise<void> {
const response = await this.queueStorage.receiveMessages('orders-to-process', 10);
for (const message of response.receivedMessageItems) {
try {
// Decode and parse message
const decoded = Buffer.from(message.messageText, 'base64').toString('utf-8');
const order = JSON.parse(decoded);
// Process order...
console.log(`Processing order ${order.orderNumber}`);
// Delete message after successful processing
await this.queueStorage.deleteMessage(
'orders-to-process',
message.messageId,
message.popReceipt
);
} catch (error) {
console.error('Failed to process message:', error);
// Message will become visible again after visibility timeout
}
}
}
}
Configuration
Environment Variables
// Use environment variables for connection string
const connectionString = process.env.AZURE_STORAGE_CONNECTION_STRING;
if (!connectionString) {
throw new Error('AZURE_STORAGE_CONNECTION_STRING environment variable is required');
}
Local Development with Azurite
For local development, you can use Azurite (Azure Storage Emulator):
// Azurite connection string
const localConnectionString = "UseDevelopmentStorage=true";
const blobStorage = new BlobStorage(localConnectionString);
const tableStorage = new TableStorage(localConnectionString, 'MyTable');
const queueStorage = new QueueStorage(localConnectionString);
Managed Identity Authentication
For production environments, you can use Azure Managed Identity for passwordless authentication. This is the recommended approach for applications running in Azure.
System-Assigned Managed Identity (Most Common)
System-assigned managed identity is automatically created when you enable managed identity on your Azure resource (App Service, Function App, VM, etc.). You don't need to provide a client ID - Azure handles it automatically.
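With managed identity there is no connection string; clients are built against the account's service endpoints, which follow a standard Azure URL pattern. The helper below is illustrative only, not part of the library:

```typescript
// The public Azure endpoint pattern per service (standard Azure URL
// format; this helper is illustrative, not part of @akadenia/azure-storage).
function accountEndpoint(accountName: string, service: 'blob' | 'table' | 'queue'): string {
  return `https://${accountName}.${service}.core.windows.net`;
}

accountEndpoint('yourstorageaccount', 'blob');
// "https://yourstorageaccount.blob.core.windows.net"
```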
import { BlobStorage, TableStorage, QueueStorage } from '@akadenia/azure-storage';
// Blob Storage with system-assigned managed identity
// No client ID needed - Azure automatically uses the identity assigned to your resource
const blobStorage = new BlobStorage({
accountName: 'yourstorageaccount'
});
// Table Storage with system-assigned managed identity
const tableStorage = new TableStorage({
accountName: 'yourstorageaccount',
tableName: 'MyTable'
});
// Queue Storage
const queueStorage = new QueueStorage({
accountName: 'yourstorageaccount'
});
User-Assigned Managed Identity (Advanced)
User-assigned managed identity is a standalone Azure resource that you create separately. Use this when you want to share one identity across multiple Azure resources. You must provide the client ID of the user-assigned managed identity.
// Blob Storage with user-assigned managed identity
// Requires the client ID of the user-assigned managed identity
const blobStorage = new BlobStorage({
accountName: 'yourstorageaccount',
managedIdentityClientId: 'your-client-id' // Required for user-assigned only
});
// Table Storage with user-assigned managed identity
const tableStorage = new TableStorage({
accountName: 'yourstorageaccount',
tableName: 'MyTable',
managedIdentityClientId: 'your-client-id' // Required for user-assigned only
});
// Queue Storage with user-assigned managed identity
const queueStorage = new QueueStorage({
accountName: 'yourstorageaccount',
managedIdentityClientId: 'your-client-id' // Required for user-assigned only
});
Environment-Based Configuration
// Use connection string in development, managed identity in production
const accountName = process.env.AZURE_STORAGE_ACCOUNT_NAME;
let blobStorage: BlobStorage;
if (process.env.NODE_ENV === 'production') {
// Production: Use system-assigned managed identity (most common)
// Only provide managedIdentityClientId if using user-assigned managed identity
blobStorage = new BlobStorage({
accountName: accountName!,
// managedIdentityClientId: process.env.AZURE_CLIENT_ID // Only if using user-assigned
});
} else {
// Development: Use connection string
blobStorage = new BlobStorage(process.env.AZURE_STORAGE_CONNECTION_STRING!);
}
Note: When using managed identity, ensure your Azure resource (App Service, Function App, VM, etc.) has the appropriate role assignments:
- Storage Blob Data Contributor for Blob Storage operations
- Storage Table Data Contributor for Table Storage operations
- Storage Queue Data Contributor for Queue Storage operations
Important Notes:
- SAS URL generation (generateSASUrl) is now async and supports both:
  - User Delegation SAS when using managed identity (recommended for production)
  - Service SAS when using connection string authentication
Error Handling
The library includes built-in error handling:
try {
const blobData = await blobStorage.downloadBlob('container', 'non-existent-blob.txt');
} catch (error) {
if (error.statusCode === 404) {
console.log('Blob not found');
} else {
console.error('Error downloading blob:', error);
}
}
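The statusCode checks in these examples can be wrapped in a small type guard so catch blocks narrow `unknown` errors safely under strict TypeScript; a sketch:

```typescript
// Sketch: type guard for the statusCode checks used in the catch blocks,
// so an `unknown` error can be inspected without unchecked property access.
function isNotFound(error: unknown): boolean {
  return (
    typeof error === 'object' &&
    error !== null &&
    (error as { statusCode?: number }).statusCode === 404
  );
}
```

Inside a catch block: `if (isNotFound(error)) { /* handle missing blob or entity */ }`.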
try {
const user = await tableStorage.get('users', 'non-existent-user');
} catch (error) {
if (error.statusCode === 404) {
console.log('User not found');
} else {
console.error('Error retrieving user:', error);
}
}
Best Practices
1. Connection String Security
// Store connection strings securely
const connectionString = process.env.AZURE_STORAGE_CONNECTION_STRING;
// Use SAS URLs for limited access
const sasUrl = await blobStorage.generateSASUrl('container', 'blob', {
permissions: [BlobPermissions.READ],
expiresOn: new Date(Date.now() + 3600 * 1000) // 1 hour
});
2. Error Handling
// Always handle errors appropriately
async function safeBlobOperation() {
try {
const result = await blobStorage.uploadData('container', 'blob', data);
return result;
} catch (error) {
console.error('Blob operation failed:', error);
throw error;
}
}
3. Resource Management
// Clean up resources when done
async function cleanup() {
try {
await blobStorage.deleteContainer('temp-container');
await tableStorage.deleteTable();
} catch (error) {
console.error('Cleanup failed:', error);
}
}
4. Performance Optimization
// Use appropriate buffer sizes for large files
await blobStorage.uploadStream('container', 'large-file.zip', stream, {
blobContentType: 'application/zip'
}, 4 * 1024 * 1024); // 4MB buffer size
// Batch table operations when possible
const entities = [/* multiple entities */];
for (const entity of entities) {
await tableStorage.upsert(entity);
}
API Reference
BlobStorage
| Method | Description | Returns |
|--------|-------------|---------|
| createContainer(containerName) | Creates a container if it doesn't exist | Promise<boolean> |
| deleteContainer(containerName) | Deletes a container if it exists | Promise<boolean> |
| uploadData(containerName, blobName, data, headers?) | Uploads data to a blob | Promise<boolean> |
| uploadStream(containerName, blobName, stream, headers?, bufferSize?) | Uploads a stream to a blob | Promise<boolean> |
| downloadBlob(containerName, blobName) | Downloads a blob as Buffer | Promise<Buffer> |
| blobExists(containerName, blobName) | Checks if a blob exists | Promise<boolean> |
| listBlobs(containerName, prefix) | Lists blobs with a prefix | Promise<BlobItem[]> |
| deleteBlob(containerName, blobName) | Deletes a blob | Promise<boolean> |
| generateSASUrl(containerName, blobName?, options?) | Generates a SAS URL (User Delegation SAS with managed identity, Service SAS with connection string) | Promise<SASUrlComponents> |
TableStorage
| Method | Description | Returns |
|--------|-------------|---------|
| createTable() | Creates a table | Promise<boolean> |
| deleteTable() | Deletes a table | Promise<boolean> |
| insert(entity) | Inserts an entity | Promise<boolean> |
| update(entity) | Updates an entity | Promise<boolean> |
| upsert(entity) | Inserts or updates an entity | Promise<boolean> |
| get(partitionKey, rowKey) | Gets an entity | Promise<GetTableEntityResponse> |
| delete(partitionKey, rowKey) | Deletes an entity | Promise<boolean> |
| list(options?) | Lists entities | Promise<T[]> |
QueueStorage
| Method | Description | Returns |
|--------|-------------|---------|
| getQueueClient(queueName) | Gets a QueueClient for a specific queue | QueueClient |
| sendMessage(queueName, message, base64Encode?) | Sends a message to the queue | Promise<QueueSendMessageResponse> |
| receiveMessages(queueName, maxMessages?, visibilityTimeout?) | Receives messages from the queue | Promise<any> |
| deleteMessage(queueName, messageId, popReceipt) | Deletes a message from the queue | Promise<void> |
| peekMessages(queueName, maxMessages?) | Peeks messages without removing them | Promise<any> |
| clearMessages(queueName) | Clears all messages from the queue | Promise<void> |
| createQueue(queueName) | Creates a queue if it doesn't exist | Promise<void> |
| deleteQueue(queueName) | Deletes a queue | Promise<void> |
| queueExists(queueName) | Checks if a queue exists | Promise<boolean> |
| getMessageCount(queueName) | Gets approximate message count | Promise<number> |
License
Contributing
We welcome contributions! Please feel free to submit a Pull Request.
Development Setup
git clone https://github.com/akadenia/AkadeniaAzureStorage.git
cd AkadeniaAzureStorage
npm install
npm run build
npm test
Commit Message Guidelines
We follow Conventional Commits for our semantic release process. We prefer commit messages to include a scope in parentheses for better categorization and changelog generation.
Preferred Format
type(scope): description
[optional body]
[optional footer]
Examples
## ✅ Preferred - with scope
feat(blob): add new SAS URL generation options
fix(table): resolve entity deletion issue
docs(readme): add troubleshooting section
chore(deps): update azure dependencies
## ❌ Less preferred - without scope
feat: add new SAS URL generation options
fix: resolve entity deletion issue
docs: add troubleshooting section
chore: update azure dependencies
Common Scopes
- blob: Blob Storage functionality
- table: Table Storage functionality
- queue: Queue Storage functionality
- docs: Documentation updates
- deps: Dependency updates
- test: Test-related changes
- build: Build and build tooling
- ci: CI/CD configuration
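The preferred `type(scope): description` shape can be checked mechanically, for example in a commit hook. A hedged sketch (the type list below follows common Conventional Commits usage, not a list mandated by this project):

```typescript
// Sketch: check a commit subject against the preferred
// `type(scope): description` shape. The accepted types follow common
// Conventional Commits usage and are an assumption, not project policy.
const COMMIT_RE = /^(feat|fix|docs|chore|refactor|test|build|ci|perf|style)\(([a-z0-9-]+)\): .+/;

function hasPreferredFormat(subject: string): boolean {
  return COMMIT_RE.test(subject);
}

hasPreferredFormat('feat(blob): add new SAS URL generation options'); // true
hasPreferredFormat('feat: add new SAS URL generation options'); // false (no scope)
```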
Support
For support, please open an issue on GitHub.
