piper-utils v1.1.68
Piper Utils
Utility library for building AWS Lambda microservices with Cognito authentication, Sequelize ORM queries, S3 file processing pipelines, and standardized API Gateway responses.
Table of Contents
- Installation
- Exports
- Request / Response
- Authentication & Access Control
- JWT Claims Reference
- Database Query Helpers
- Query String DSL
- Event Manager (S3 Pipeline)
- Database Migrations
- Built-in Error Codes
- Examples
- Peer Dependencies
- Testing
Installation
npm install piper-utils
Exports
Every public function is re-exported from the package root:
import {
// Request / Response
success, successHtml, failure, parseBody, parseEvent,
getCurrentUser, getCurrentUserNameFromCognitoEvent,
// Authentication & Access Control
accessRightsUtils, checkWriteAccess, checkModule, checkIsSuper,
isSystemUser, isSuperUser,
isPartnerUser, getBelongsToPartnerId, getEffectivePartnerId,
enrichEventWithPartnerAccess,
userDefaultBid, getBusinessesInfo,
getAccessRightsInfo, getDefaultBusinessIDInfo, getModuleInfo,
getCompanySettings,
// Database Query Helpers (Sequelize)
defaultFilters, createFilters, createSort, createIncludes, findAll,
// Event Manager (S3 Pipeline)
watchBucket, handleFile, publishEvents,
// Database Migrations
runMigrations
} from 'piper-utils';
Note: S3Utils, SNSUtils, and dynamoUtil are internal modules used by the event manager. They are not exported from the package root.
Request / Response
Functions for formatting API Gateway Lambda proxy responses with CORS and security headers.
All JSON responses include these security headers:
- Strict-Transport-Security (HSTS)
- X-Content-Type-Options: nosniff
- X-Frame-Options: DENY
- Content-Security-Policy: default-src 'none'; frame-ancestors 'none'
- Referrer-Policy: strict-origin-when-cross-origin
- Permissions-Policy (camera, microphone, geolocation disabled)
- Cache-Control: no-store
success(body, options?)
Format a 200 JSON response.
return success({ id: 1, name: 'Widget' });
// => { statusCode: 200, headers: {...}, body: '{"id":1,"name":"Widget"}' }
| Param | Type | Description |
|-------|------|-------------|
| body | any | Response data (will be JSON.stringify'd) |
| options.dbClose | function | Optional callback invoked before returning (e.g. close DB connection) |
successHtml(html, options?)
Format a 200 HTML response with a relaxed CSP that allows payment provider scripts (TokenEx, NMI, Apple Pay, Sentry).
return successHtml('<html>...</html>');
// => { statusCode: 200, headers: { 'Content-Type': 'text/html', ... }, body: '<html>...</html>' }
| Param | Type | Description |
|-------|------|-------------|
| html | string | Raw HTML content |
| options.dbClose | function | Optional callback invoked before returning |
failure(body?, options?)
Format an error response. Automatically detects and normalizes Joi, Sequelize, and Dynamoose errors.
// Throw a known error
throw { statusCode: 404, errorCode: '4004', message: 'ITEM NOT FOUND' };
// failure() catches and formats it
return failure(err);
// => { statusCode: 404, headers: {...}, body: '{"statusCode":404,"errorCode":"4004","message":"ITEM NOT FOUND"}' }
Auto-detection logic:
| Error type | Detection | Resulting statusCode / errorCode |
|------------|-----------|----------------------------------|
| Joi validation | body.details exists | 400 / 4000 |
| Sequelize ForeignKeyConstraint | body.name === 'SequelizeForeignKeyConstraintError' | 409 / 4090 |
| Sequelize UniqueConstraint | body.name === 'SequelizeUniqueConstraintError' | 409 / 4091 |
| Sequelize ValidationError | body.name === 'SequelizeValidationError' | 400 / 4001 |
| Unknown | fallback | 500 / 5XX |
| Param | Type | Description |
|-------|------|-------------|
| body | object/Error | Error object. If it has statusCode and errorCode, used as-is. |
| options.dbClose | function | Optional callback invoked before returning |
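The detection rules above can be sketched as a plain function. This is an illustration of the table, not the library's actual source; the fallback errorCode '5000' is a placeholder for the 5XX family.

```javascript
// Sketch of failure()'s error normalization (illustration only).
function normalizeError(body) {
  // Pre-formatted errors pass through unchanged.
  if (body && body.statusCode && body.errorCode) {
    return { statusCode: body.statusCode, errorCode: body.errorCode };
  }
  // Joi validation errors carry a `details` array.
  if (body && body.details) {
    return { statusCode: 400, errorCode: '4000' };
  }
  // Sequelize errors are detected by their `name` property.
  const byName = {
    SequelizeForeignKeyConstraintError: { statusCode: 409, errorCode: '4090' },
    SequelizeUniqueConstraintError: { statusCode: 409, errorCode: '4091' },
    SequelizeValidationError: { statusCode: 400, errorCode: '4001' }
  };
  if (body && byName[body.name]) return byName[body.name];
  return { statusCode: 500, errorCode: '5000' }; // unknown fallback
}
```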
parseBody(event)
Parse the JSON body from an API Gateway Lambda proxy event. Throws errorList.invalidJson (400) on malformed JSON.
const body = parseBody(event);
// body is the parsed JSON object, or {} if event.body is falsy
Returns event.body as-is if it's already an object (e.g. from serverless-offline).
parseEvent(event, callback?)
Like parseBody but for parsing an entire event string. Optionally calls callback(error) on failure instead of throwing.
const parsed = parseEvent(eventString);
getCurrentUser(event)
Extract user info from Cognito JWT authorizer claims.
const user = getCurrentUser(event);
// => { username: '[email protected]', id: 42 }
| Return field | Source claim | Default |
|-------------|-------------|---------|
| id | custom:UID (JSON-parsed) | 0 |
| username | email | '[email protected]' |
getCurrentUserNameFromCognitoEvent(event)
Extract email from a Cognito User Pool trigger event (different structure than API Gateway authorizer events).
// Inside a Cognito Pre-Sign-Up or Post-Confirmation trigger
const email = getCurrentUserNameFromCognitoEvent(event);
// Reads from event.request.userAttributes['cognito:email_alias'] or ['email']
Authentication & Access Control
All auth functions read JWT claims from event.requestContext.authorizer.claims (or event.requestContext.authorizer for custom authorizers). See JWT Claims Reference for the full claim structure.
Local bypass: When BUILD_ENV=local, most auth checks are relaxed to ease development.
accessRightsUtils(event, options?)
Get the list of business IDs the current user is authorized to access. Compares requested IDs (from query string or body) against allowed IDs (from JWT claims).
const businessIds = accessRightsUtils(event, { useCognitoBid: true });
// => ['1', '5', '12']
| Param | Type | Description |
|-------|------|-------------|
| event | object | Lambda event |
| options.useCognitoBid | boolean | If true, prefer Cognito businessId; skip local fallback of BID '1' |
Returns: string[] - business IDs user can access.
Behavior by user type:
- Super user: Gets all requested IDs; if none requested, gets all allowed IDs
- System user: Gets all requested IDs (no filtering)
- Regular user: Gets the intersection of requested and allowed IDs
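The three behaviors can be sketched as follows. resolveBusinessIds is an illustrative helper, not a piper-utils export.

```javascript
// Sketch of the access-resolution rules above (illustration only).
function resolveBusinessIds({ requested, allowed, isSuper, isSystem }) {
  if (isSystem) return requested;                             // system: no filtering
  if (isSuper) return requested.length ? requested : allowed; // super: requested, else all allowed
  return requested.filter((id) => allowed.includes(id));      // regular: intersection
}
```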
checkWriteAccess(event, options?)
Verify the user has write (W) or admin (A) role for the businessId in the request body. Throws errorList.unauthorized (401) if denied.
const businessId = checkWriteAccess(event);
// Returns the authorized businessId string
Bypassed for: super users, system users, BUILD_ENV=local.
checkModule(moduleName, event)
Verify the user has access to a named module. Throws errorList.unauthorized (401) if denied.
checkModule('customer', event); // throws if no access
checkModule('inventory', event); // throws if no access
Reads module permissions from custom:MOD (falls back to custom:AR). Bypassed for: super users, system users, BUILD_ENV=local.
checkIsSuper(event)
Throws errorList.unauthorized (401) if the user is not a super or system user. Bypassed for BUILD_ENV=local.
checkIsSuper(event); // throws if not super/system
isSuperUser(event) / isSystemUser(event)
Boolean checks. No throwing, no bypass.
if (isSuperUser(event)) { /* full access */ }
if (isSystemUser(event)) { /* machine-to-machine */ }
isPartnerUser(event)
Returns the partner ID from custom:PID, or false.
getBelongsToPartnerId(event)
Returns the partner ID from custom:BPID (user's business belongs to a partner), or false.
getEffectivePartnerId(event)
Returns isPartnerUser(event) || getBelongsToPartnerId(event) - the partner ID from either claim, or false.
enrichEventWithPartnerAccess(event, partnerBusinessIds, role?)
Mutate the event's custom:AR claim in-memory to add partner business IDs. Call this before accessRightsUtils() or getBusinessesInfo() so partner admins get access to their partner's merchants.
// After looking up partner's business IDs from your DB:
enrichEventWithPartnerAccess(event, ['101', '102', '103'], 'R');
// Now accessRightsUtils(event) will include 101, 102, 103
| Param | Type | Default | Description |
|-------|------|---------|-------------|
| event | object | | Lambda event (mutated in-place) |
| partnerBusinessIds | string[] | | Business IDs to add |
| role | string | 'R' | Access role: 'R' (read), 'W' (write), 'A' (admin) |
userDefaultBid(event)
Get the user's default business ID from custom:DBI.
const defaultBid = userDefaultBid(event);
// => '5' (or '1' as fallback)
getBusinessesInfo(event, useCognitoBid?)
Get the raw business ID to role mapping from custom:AR.
const businesses = getBusinessesInfo(event);
// => { '1': 'A', '5': 'W', '12': 'R' }
When BUILD_ENV=local, automatically injects { '1': 'A' } and any businessId from the request body.
Low-level Claim Helpers
These extract raw claim values without business logic:
- getAccessRightsInfo(event) - Returns custom:AR.businessIds as an object (e.g. { '1': 'A', '5': 'R' })
- getDefaultBusinessIDInfo(event) - Returns custom:DBI.defaultBid as a string (default '1')
- getModuleInfo(event) - Returns custom:MOD.module (or custom:AR.module) as an object
getCompanySettings(event)
Get company-level cached settings from custom:SET.
const settings = getCompanySettings(event);
// => { auditEnabled: true, ... }
Returns {} on parse error.
JWT Claims Reference
The library expects these custom Cognito attributes on event.requestContext.authorizer.claims (or event.requestContext.authorizer for custom authorizers):
| Claim | Type | Example | Used by |
|-------|------|---------|---------|
| custom:UID | JSON number | "42" | getCurrentUser |
| email | string | "[email protected]" | getCurrentUser |
| custom:SYSTEM | JSON boolean | "true" | isSystemUser |
| custom:SUPER | JSON boolean | "true" | isSuperUser |
| custom:AR | JSON object | {"businessIds":{"1":"A","5":"R"}} | accessRightsUtils, checkWriteAccess, getBusinessesInfo |
| custom:DBI | JSON object | {"defaultBid":"5"} | userDefaultBid |
| custom:MOD | JSON object | {"module":{"customer":true}} | checkModule |
| custom:PID | string | "PARTNER_123" | isPartnerUser |
| custom:BPID | string | "PARTNER_123" | getBelongsToPartnerId |
| custom:SET | JSON object | {"auditEnabled":true} | getCompanySettings |
Role values in custom:AR.businessIds: A (admin), W (write), R (read).
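Because Cognito delivers every custom attribute as a string, the JSON-typed claims above must be parsed before use. A minimal sketch of reading the role map — readRoleMap is an illustrative helper, not a piper-utils export:

```javascript
// Cognito claims arrive as strings; JSON-typed claims need JSON.parse.
function readRoleMap(claims) {
  try {
    return JSON.parse(claims['custom:AR']).businessIds || {};
  } catch (e) {
    return {}; // malformed or missing claim -> no access
  }
}

const claims = { 'custom:AR': '{"businessIds":{"1":"A","5":"R"}}' };
// readRoleMap(claims) -> { '1': 'A', '5': 'R' }
```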
Database Query Helpers
Utilities for translating API query string parameters into Sequelize queries. These handle filtering, sorting, pagination, and relation includes automatically.
defaultFilters(schema, subSchemas?)
Create a filter configuration object from a Sequelize model schema. This maps each column's data type to the appropriate Sequelize operator.
import DB from 'sequelize';
const orderSchema = {
orderNumber: { type: new DB.STRING() },
status: { type: new DB.STRING() },
total: { type: new DB.DECIMAL() },
active: { type: new DB.BOOLEAN() },
metadata: { type: DB.JSONB },
createdAt: { type: new DB.DATE() }
};
const orderFilter = defaultFilters(orderSchema, {
customer: customerSchema // enable customer.* filtering
});
Type-to-operator mapping:
| Sequelize type | Operator | Behavior |
|---------------|----------|----------|
| STRING, CHAR, TEXT | Op.iLike | Case-insensitive LOWER(col) LIKE '%value%' |
| INTEGER, DECIMAL, BIGINT | Op.or | Matches +value or -value |
| BOOLEAN | Op.eq | Exact match with string-to-boolean coercion |
| JSONB | (special) | Case-insensitive search via jsonb_extract_path_text |
| DATE | Op.between | Range filter |
| DATEONLY | Op.eq | Exact match |
Auto-added fields: id, createdAt, updatedAt are always included.
createFilters(event, objectFilters)
Convert query string parameters into a Sequelize WHERE clause. Automatically applies:
- Business ID scoping via accessRightsUtils(event)
- Active record filtering (defaults active: true if the schema has an active column)
- Date range filtering when both startDate and endDate are provided
- Full-text search when searchString is provided
// URL: /orders?status=Shipped&searchString=john&startDate=2024-01-01&endDate=2024-12-31&sort=-createdAt
const where = createFilters(event, orderFilter);
See Query String DSL for all supported operators.
createSort(event, defaultFilter)
Convert the sort query parameter into a Sequelize ORDER BY array.
// URL: /orders?sort=-createdAt,status
const order = createSort(event, orderFilter);
// => [['createdAt', 'DESC'], ['status', 'ASC']]
- Prefix with - for DESC, no prefix for ASC
- Comma-separated for multiple fields
- Default: -updatedAt (DESC)
- Only allows sorting on fields defined in defaultFilter (prevents injection)
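The parsing rules above can be sketched as a plain function — an illustration of the behavior, not the library's source:

```javascript
// Sketch of createSort's rules: '-' prefix means DESC, fields are
// comma-separated, and only whitelisted fields survive (injection guard).
function parseSort(sortParam, allowedFields) {
  return (sortParam || '-updatedAt')
    .split(',')
    .map((f) => (f.startsWith('-') ? [f.slice(1), 'DESC'] : [f, 'ASC']))
    .filter(([field]) => allowedFields.includes(field));
}
```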
createIncludes(event, objectFilters)
Build a Sequelize includes array by detecting which relations are referenced in filter or sort parameters (via dot-notation).
// URL: /[email protected]&sort=-customer.name
const includes = createIncludes(event, orderFilter);
// => ['customer']
findAll(model, options)
Paginated wrapper around Model.findAll(). Fetches limit + 1 records to detect hasMore without a COUNT query.
const result = await findAll(Order, {
where,
order,
limit: 20,
offset: 0,
includes: [{ model: Customer, as: 'customer' }]
});
// => { offset: 0, limit: 20, rows: [...], hasMore: true }
| Param | Type | Default | Description |
|-------|------|---------|-------------|
| model | Sequelize.Model | | Sequelize model class |
| options.where | object | | WHERE clause (from createFilters) |
| options.order | array | | ORDER BY (from createSort) |
| options.includes | array | { all: true, nested: true } | Relations to include |
| options.limit | number | 0 (no limit) | Page size |
| options.offset | number | 0 | Page offset |
Returns: { offset, limit, rows, hasMore }
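The limit + 1 trick can be sketched as follows (illustrative; findAll does this internally after running the query):

```javascript
// Sketch of hasMore detection: the query runs with limit + 1, and the
// presence of the extra row signals another page without a COUNT query.
function pageFromRows(rowsFetchedWithLimitPlusOne, limit, offset) {
  const hasMore = rowsFetchedWithLimitPlusOne.length > limit;
  return {
    offset,
    limit,
    rows: rowsFetchedWithLimitPlusOne.slice(0, limit), // drop the sentinel row
    hasMore
  };
}
```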
Query String DSL
Supported query string parameters for createFilters and createSort:
| Query | Behavior |
|-------|----------|
| ?field=value | Match by type: iLike for strings, eq for booleans, or for numbers |
| ?searchString=john | OR search across all filterable string/numeric fields |
| ?startDate=2024-01-01&endDate=2024-12-31 | BETWEEN on createdAt |
| ?sort=-field | Sort DESC (prefix -) |
| ?sort=field1,-field2 | Multi-field sort, comma-separated |
| ?token.cardHolderName=gregory | JSONB dot-notation search (case-insensitive) |
| ?token={"cardType":"visa"} | JSONB object search (all key-value pairs, case-insensitive) |
| ?customer.email=john | Relation field filter (auto-includes the relation) |
Event Manager (S3 Pipeline)
A set of utilities for building S3 file processing pipelines. The typical flow is:
- Cron job triggers watchBucket -> calls publishEvents to scan S3 and publish file batches to SNS
- SNS trigger invokes Lambda -> watchBucket routes to handleEvents, which processes files in parallel
- Direct S3 trigger invokes Lambda -> watchBucket routes to handleDirectS3WriteEvent
watchBucket(params)
Entry point for the S3 pipeline. Routes incoming events to the appropriate handler based on event source.
import { watchBucket } from 'piper-utils';
export async function handler(event) {
return watchBucket({
event,
dynamoConfigTable: 'Config-Dev',
dynamoConfigKey: 'importConfig',
s3Bucket: 'my-data-bucket',
snsTopic: 'my-import-topic',
transformer: async (parsedJson) => {
// Process each file's parsed JSON content
await saveToDatabase(parsedJson);
},
errorHandlerPerFile: (err, filePath) => {
console.error(`Failed: ${filePath}`, err);
return false; // return true to suppress error, false to fail
},
shouldSkipFailedFolders: false,
userImportTypes: { orders: 'orders', customers: 'customers' }
});
}
| Param | Type | Required | Description |
|-------|------|----------|-------------|
| event | object | Yes | Lambda event (SNS, S3, or CloudWatch cron) |
| dynamoConfigTable | string | Yes | DynamoDB table name for pipeline config |
| dynamoConfigKey | string | Yes | Key in DynamoDB table (holds snsChunkSize and snsMaxMessages) |
| s3Bucket | string | Yes | S3 bucket name to watch |
| snsTopic | string | Yes | SNS topic name for publishing file batches |
| transformer | function | Yes | async (parsedJson) => result - processes each file's content |
| errorHandlerPerFile | function | No | (error, filePath) => boolean - return true to suppress, false to fail batch |
| shouldSkipFailedFolders | boolean | No | If true, skip retry folders and move directly to error/ |
| userImportTypes | object | No | Map of import type subdirectories |
Routing logic:
- EventSource === 'aws:sns' -> handleEvents()
- EventSource === 'aws:s3' -> handleDirectS3WriteEvent() (skips files in failed-once/, failed-twice/, error/)
- EventSource === 'aws:s3' with key prefix DIRECT -> handleDirectS3WriteEvent()
- Otherwise (cron job, no Records) -> publishEvents()
handleFile(path, s3Bucket, transformer, options?)
Process a single file from S3. Downloads the file, parses it as JSON, passes it to the transformer, then deletes the original. On error, moves the file through the retry folder strategy.
import { handleFile } from 'piper-utils';
const result = await handleFile(
'orders/order-123.json',
'my-data-bucket',
async (parsedJson) => {
// parsedJson is the JSON.parse'd file content
await Order.create(parsedJson);
return { processed: true };
},
{ shouldSkipFailedFolders: false }
);
| Param | Type | Description |
|-------|------|-------------|
| path | string | S3 object key |
| s3Bucket | string | S3 bucket name |
| transformer | function | async (parsedJson) => result |
| options.shouldSkipFailedFolders | boolean | Skip retry folders, move straight to error/ |
| options.userImportTypes | object | Import type subdirectories |
Returns: The transformer's return value, or undefined if the file was empty/missing.
Behavior:
- Returns silently if the file is not found (NoSuchKey)
- Returns silently (and deletes the file) if the body is empty, {}, or []
- On success: deletes the original file
- On error: moves file through retry folders, then re-throws
handleEvents(event, transformer, errorHandlerPerFile?, shouldSkipFailedFolders?, userImportTypes?)
Process a batch of files from SNS-relayed S3 events. Processes files in parallel using Bluebird.Promise.map.
// event.Records[].Sns.Message = JSON.stringify({ bucket: '...', files: ['file1.json', 'file2.json'] })
await handleEvents(event, transformer, errorHandlerPerFile, false, userImportTypes);
| Param | Type | Description |
|-------|------|-------------|
| event | object | SNS event with Records[].Sns.Message containing { bucket, files } |
| transformer | function | async (parsedJson) => result |
| errorHandlerPerFile | function | (error, filePath) => boolean - return true to suppress |
| shouldSkipFailedFolders | boolean | Default false |
| userImportTypes | object | Import subdirectories |
Throws 'ERROR: HANDLE-FILE' if any file fails and errorHandlerPerFile returns false (or is not provided).
publishEvents(configTable, tableKey, bucket, snsTopic, userImportTypes?)
Scan an S3 bucket and publish file batches to SNS. Typically called by a cron job via watchBucket.
await publishEvents('Config-Dev', 'importConfig', 'my-data-bucket', 'my-import-topic');
| Param | Type | Description |
|-------|------|-------------|
| configTable | string | DynamoDB table with pipeline config |
| tableKey | string | Config key (item must have snsChunkSize and snsMaxMessages) |
| bucket | string | S3 bucket to scan |
| snsTopic | string | SNS topic name |
| userImportTypes | object | Import subdirectories (optional) |
Config lookup: Reads Item.snsChunkSize (default: 2) and Item.snsMaxMessages (default: 10) from DynamoDB.
File filtering: Skips files in error/, failed-once/, failed-twice/. Supports delayUntil/ folder (files processed only after the time encoded in the path, format YYYYMMDDHHmm).
Publishing: Lists all eligible files, chunks them by snsChunkSize, publishes up to snsMaxMessages SNS messages. Each message body: { bucket, files: [...keys] }.
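The delayUntil/ gate can be sketched as follows. The docs above say the timestamp is encoded in the path as YYYYMMDDHHmm; the exact segment layout and UTC interpretation here are assumptions for illustration.

```javascript
// Sketch of the delayUntil/ eligibility check (path layout assumed).
function isEligible(key, now = new Date()) {
  const m = key.match(/delayUntil\/(\d{12})/);
  if (!m) return true; // not a delayed file
  const t = m[1]; // YYYYMMDDHHmm
  const due = new Date(Date.UTC(
    +t.slice(0, 4), +t.slice(4, 6) - 1, +t.slice(6, 8),
    +t.slice(8, 10), +t.slice(10, 12)
  ));
  return now >= due;
}
```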
Retry / Error Folder Strategy
When handleFile encounters an error processing a file, it moves the file through progressive retry folders before giving up:
file.json --(error)--> failed-once/file.json
--(error)--> failed-twice/file.json
--(error)--> error/file.json (terminal)
If shouldSkipFailedFolders: true, files go directly to error/ on first failure.
With userImportTypes, the same pattern applies within each import type's subfolder:
orders/file.json --> orders/failed-once/file.json --> orders/failed-twice/file.json --> orders/error/file.json
Files in error/ are never reprocessed. A nesting guard prevents double-nesting (e.g. failed-once/failed-once/file.json).
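The progression can be sketched as a pure function over the S3 key — an illustrative helper, not a piper-utils export:

```javascript
// Sketch of the retry-folder progression. Checking the current folder first
// doubles as the nesting guard: a file already in a retry folder advances
// instead of gaining a second failed-once/ prefix.
function nextRetryPath(key) {
  if (key.includes('error/')) return key; // terminal, never reprocessed
  if (key.includes('failed-twice/')) return key.replace('failed-twice/', 'error/');
  if (key.includes('failed-once/')) return key.replace('failed-once/', 'failed-twice/');
  const i = key.lastIndexOf('/') + 1; // first failure: insert before the filename
  return key.slice(0, i) + 'failed-once/' + key.slice(i);
}
```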
Database Migrations
runMigrations(databaseName, sequelizeInstance, initializeModels, pathToMigrationFolder?)
Execute pending database migrations using Umzug. On failure, rolls back pending migrations before throwing.
import { runMigrations } from 'piper-utils';
await runMigrations(
'my-database',
sequelizeInstance,
async () => { await sequelizeInstance.sync(); },
`${__dirname}/migrations/*.js` // optional, defaults to ${PWD}/migrations/*.js
);
| Param | Type | Default | Description |
|-------|------|---------|-------------|
| databaseName | string | | Database name (for logging) |
| sequelizeInstance | Sequelize | | Initialized Sequelize instance |
| initializeModels | function | | Async function to define/sync models (called after migrations) |
| pathToMigrationFolder | string | ${PWD}/migrations/*.js | Glob pattern for migration files |
Built-in Error Codes
The library includes a pre-defined errorList used internally by auth checks and parseBody. These are the error objects that get thrown/returned:
| Key | errorCode | statusCode | Message |
|-----|-----------|------------|---------|
| unauthorized | 4111 | 401 | UNAUTHORIZED |
| notFound | 4004 | 404 | ITEM NOT FOUND |
| invalidJson | 4005 | 400 | INVALID JSON |
| invalidRequest | 4016 | 400 | INVALID REQUEST DATA |
| invalidID | 4014 | 400 | ID is invalid |
| invalidFilter | 4026 | 400 | INVALID FILTER |
| invalidStartDate | 4020 | 400 | INVALID START DATE |
| invalidEndDate | 4021 | 400 | INVALID END DATE |
| invalidDateFormat | 4019 | 400 | INVALID DATE FORMAT |
| invalidAPIKey | 4017 | 400 | INVALID REQUEST - API KEY MAY BE INVALID |
| invalidUserNameUpdate | 4027 | 400 | UNABLE TO UPDATE USERNAME |
| imageSizeLimit | 4028 | 400 | IMAGE SIZE LIMIT 100KB EXCEEDED |
| emailRequired | 4004 | 404 | NO EMAIL PROVIDED, CHECK CUSTOMER EMAIL |
| mobilePhoneRequired | 4004 | 404 | NO MOBILE PHONE PROVIDED, CHECK CUSTOMER CONTACTS |
| invalidCadenceType | 5001 | 500 | INVALID CADENCE TYPE |
Consuming services typically define their own errorList that extends or mirrors this pattern:
const errorList = {
paymentFailed: { statusCode: 400, errorCode: '4030', message: 'PAYMENT FAILED' }
};
// Throw it — failure() will format it correctly
throw errorList.paymentFailed;
Examples
Complete Lambda Handler (Read)
import {
accessRightsUtils, checkModule,
createFilters, createSort, findAll,
success, failure
} from 'piper-utils';
export async function getOrders(event) {
try {
checkModule('orders', event);
const where = createFilters(event, orderFilter);
const order = createSort(event, orderFilter);
const query = event.queryStringParameters || {};
const result = await findAll(Order, {
where,
order,
limit: parseInt(query.limit || '20'),
offset: parseInt(query.offset || '0')
});
return success(result);
} catch (err) {
return failure(err);
}
}
Note: createFilters automatically scopes the query to the user's authorized business IDs via accessRightsUtils(event). You do not need to add where.businessId manually.
Complete Lambda Handler (Write)
import {
parseBody, getCurrentUser, checkWriteAccess,
success, failure
} from 'piper-utils';
export async function updateOrder(event) {
try {
const user = getCurrentUser(event);
// => { username: '[email protected]', id: 42 }
const businessId = checkWriteAccess(event);
// Throws 401 if user lacks write/admin role for the businessId in body
const body = parseBody(event);
const order = await Order.findOne({
where: { id: event.pathParameters.id, businessId }
});
if (!order) throw { statusCode: 404, errorCode: '4004', message: 'Order not found' };
order.set({ ...body, updatedBy: user.id });
await order.save();
return success(order);
} catch (e) {
return failure(e);
}
}
S3 File Processing Pipeline
import { watchBucket, success, failure } from 'piper-utils';
export async function importHandler(event) {
try {
return await watchBucket({
event,
dynamoConfigTable: 'Config-Dev',
dynamoConfigKey: 'orderImportConfig',
s3Bucket: 'order-imports-dev',
snsTopic: 'order-import-topic-dev',
transformer: async (parsedJson) => {
// Each file contains a JSON order object
await Order.create(parsedJson);
},
errorHandlerPerFile: (err, filePath) => {
console.error(`Import failed for ${filePath}:`, err);
return false; // don't suppress — let retry folders handle it
}
});
} catch (e) {
return failure(e);
}
}
Partner Access Enrichment
import {
getEffectivePartnerId, enrichEventWithPartnerAccess,
accessRightsUtils, success, failure
} from 'piper-utils';
export async function getPartnerOrders(event) {
try {
const partnerId = getEffectivePartnerId(event);
if (partnerId) {
// Look up which businesses belong to this partner
const partnerBusinessIds = await getPartnerBusinessIds(partnerId);
// Inject them into the event so accessRightsUtils includes them
enrichEventWithPartnerAccess(event, partnerBusinessIds, 'R');
}
const businessIds = accessRightsUtils(event);
// Now includes both user's own businesses AND partner businesses
// ... query with businessIds
return success(results);
} catch (e) {
return failure(e);
}
}
Peer Dependencies
These must be installed in the consuming project:
| Package | Version |
|---------|---------|
| bluebird | >= 3.7.0 |
| dayjs | ^1.11.13 |
| lodash | >= 4.17.15 |
| sequelize | >= 6.6.2 |
| umzug | >= 3.2.1 |
Testing
npm test # unit tests (BUILD_ENV=test)
npm run itest    # integration tests (BUILD_ENV=development)
Tests use Jasmine 5 with NYC coverage. Coverage thresholds: branches >= 70%, functions >= 70%, statements >= 85%, lines >= 80%.
License
Private - Copyright (c) Piper
