@pavanigadda.work/core v0.1.2
@pinaka/core
Shared internal npm package for Pinaka TypeScript microservices.
Eliminates CRUD/repository/query boilerplate. Backend engineers write business logic, not Prisma where-clauses.
What it does
| Concern | What the package provides |
|---|---|
| Database access | Prisma singleton, transaction wrapper, health check |
| Logging | Pino child logger with RequestContext binding, redaction |
| Auth | JWT sign/verify (jose), context extraction, bcrypt password helpers |
| Validation | Zod schemas for pagination, sort, filter, search, UUIDs, PAN/GSTIN |
| Query safety | Registry-driven whitelist for sort/filter/include — no arbitrary Prisma injection |
| Sensitive fields | Centralized registry — add once, redacted in logs + audit + responses automatically |
| Repository | Generic engine: createOne, getOne, listMany, updateOne, archiveOne, count, exists |
| Pagination | Offset (page/limit) + cursor-based (2x faster, no COUNT query) — dual mode |
| Multi-column sort | Sort by up to 3 fields: sortBy: ['status', 'createdAt'] |
| Branch scoping | Automatic branchId injection on every query — prevents cross-branch leakage |
| Soft deletes | Archive strategies: status enum, is_active boolean, closed_at timestamp |
| Audit logging | Structured audit entries (V1 to Pino, upgradable to DB) |
| Errors | Typed error classes with errorCode + statusCode + express middleware |
| Utilities | PAN/GSTIN/email normalization, object pick/omit, masking, date helpers |
| Response helpers | successResponse, paginatedResponse, emptyResponse |
| Domain wrappers | Thin client/task/engagement/document repository wrappers |
Why it exists
Without a shared package, every microservice re-implements:
- Prisma connection management
- Pagination/sorting/filtering logic
- Branch scope injection
- Audit logging
- Error classes and Express error handler
- Logger setup
@pinaka/core provides the 80% generic engine. Service code only needs the 20% business logic.
Installation
```shell
npm install @pinaka/core

# peer dep — consumer must have @prisma/client
npm install @prisma/client
```

The package is marked private: true — publish it to your internal npm registry or link it via a workspace/monorepo.
Quick start
```typescript
import {
  // Core
  createRequestContext,
  createModuleLogger,
  // Repository
  listMany,
  createOne,
  getOne,
  registerModel,
  validateRegisteredModels,
  // Auth
  hashPassword,
  verifyToken,
  // Errors
  NotFoundError,
  expressErrorHandler,
  // Sensitive fields
  SENSITIVE_FIELD_REGISTRY,
  getAllSensitiveFields,
} from '@pinaka/core'
```

RequestContext
RequestContext is the lightweight per-request identity carrier. Build it once in Express middleware and pass it throughout.
```typescript
import { createRequestContext, extractContextFromToken, type RequestContext } from '@pinaka/core'
import { randomUUID } from 'node:crypto'

// In requestId middleware:
app.use((req, res, next) => {
  const requestId = (req.headers['x-request-id'] as string) ?? randomUUID()
  res.locals.requestId = requestId
  next()
})

// In auth middleware (after token verification):
app.use(async (req, res, next) => {
  const token = req.headers.authorization?.split(' ')[1]
  if (!token) return next()
  try {
    const ctxFields = await extractContextFromToken(token, process.env.JWT_SECRET!)
    res.locals.ctx = createRequestContext({
      requestId: res.locals.requestId,
      ...ctxFields,
      ipAddress: req.ip,
      userAgent: req.headers['user-agent'],
    })
  } catch (e) {
    return next(e)
  }
  next()
})
```

Fields on RequestContext:
- `requestId` — required, UUID per request
- `userId` — from the JWT `sub` claim
- `roleId` — from JWT
- `roleCodes` — from JWT, e.g. `['CA', 'MANAGER']`
- `branchId` — from JWT, drives branch scoping
- `ipAddress`, `userAgent` — for audit/logging
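Based on the field list above, the context shape is roughly as follows. This is a sketch inferred from the README; the exact optionality and types of the exported RequestContext may differ:

```typescript
// Sketch of the RequestContext shape inferred from the field list above.
// Optionality and exact types are assumptions; check the exported type.
interface RequestContextSketch {
  requestId: string    // required, UUID per request
  userId?: string      // from the JWT `sub` claim
  roleId?: string      // from JWT
  roleCodes?: string[] // e.g. ['CA', 'MANAGER']
  branchId?: string    // drives branch scoping
  ipAddress?: string   // for audit/logging
  userAgent?: string   // for audit/logging
}

const ctx: RequestContextSketch = {
  requestId: '3f1d2e4c-0000-4000-8000-000000000000',
  userId: 'user-1',
  roleCodes: ['CA'],
  branchId: 'branch-7',
}

console.log(ctx.requestId)
```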
Logger
All logs are structured JSON (Pino). In development, output is piped through pino-pretty for readability.
createModuleLogger
```typescript
import { createModuleLogger } from '@pinaka/core'

// Pass ctx + logical module name + source file name
const log = createModuleLogger(ctx, 'client-service', 'client.service.ts')

log.info({ action: 'listClients', durationMs: 14 }, 'Fetched clients')
log.error({ action: 'createClient', errorCode: 'CONFLICT', err }, 'Failed to create client')
```

Do not use stack trace inference for file names — always pass fileName explicitly.
Structured log fields
Every log line can carry: service, env, requestId, userId, roleId, roleCodes, branchId, module, file, action, model, durationMs, success, errorCode, errorMessage.
Sensitive field redaction
Sensitive fields are managed via a central registry (SENSITIVE_FIELD_REGISTRY) and automatically redacted from all logs. See Sensitive Field Registry for details.
Environment variables
| Variable | Default | Description |
|---|---|---|
| LOG_LEVEL | info (prod), debug (dev) | Pino log level |
| SERVICE_NAME | @pinaka/core | service field on every log line |
| NODE_ENV | development | production disables pino-pretty |
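These defaults can be sketched as a small resolution function. This is hypothetical: the actual logger factory inside @pinaka/core may wire things differently, but the precedence shown (explicit env var, then an environment-dependent default) matches the table:

```typescript
// Hypothetical sketch of how LOG_LEVEL / SERVICE_NAME / NODE_ENV plausibly
// resolve into logger options; the actual factory in @pinaka/core may differ.
interface LoggerOptions {
  level: string
  serviceName: string
  pretty: boolean // whether the pino-pretty transport is enabled
}

function resolveLoggerOptions(env: Record<string, string | undefined>): LoggerOptions {
  const isProd = (env.NODE_ENV ?? 'development') === 'production'
  return {
    level: env.LOG_LEVEL ?? (isProd ? 'info' : 'debug'),
    serviceName: env.SERVICE_NAME ?? '@pinaka/core',
    pretty: !isProd, // production disables pino-pretty
  }
}

console.log(resolveLoggerOptions({ NODE_ENV: 'production' }).level) // → info
```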
Morgan + Pino
Morgan handles HTTP access logs. Pino handles application logs. To avoid duplicating request logs, pipe Morgan output through a Pino stream:
```typescript
import morgan from 'morgan'
import { logger } from '@pinaka/core'

// Send Morgan output to Pino as info-level entries
app.use(morgan('combined', {
  stream: { write: (msg) => logger.info({ action: 'httpAccess' }, msg.trim()) },
}))
```

Prisma singleton
```typescript
import { getPrisma, withTransaction, healthCheck } from '@pinaka/core'

// Use the singleton
const prisma = getPrisma()

// Transaction
const result = await withTransaction(async (tx) => {
  const client = await tx.client.create({ data: { ... } })
  await tx.auditLog.create({ data: { ... } })
  return client
}, ctx)

// Health check (for /health endpoints)
const health = await healthCheck()
// { ok: true, durationMs: 3 }
```

The Prisma singleton is protected against duplicate instantiation during dev hot-reload via a globalThis._prisma guard.
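The hot-reload guard pattern can be sketched generically: module-level state is wiped when a dev server reloads the module, but globalThis survives the reload. The client class below is a stand-in, not the real PrismaClient; the `_prisma` property name follows the README:

```typescript
// Generic sketch of the globalThis singleton guard: module state is reset on
// dev hot-reload, globalThis is not, so the instance is reused across reloads.
// FakePrismaClient is a stand-in for the real PrismaClient.
class FakePrismaClient {
  readonly createdAt = Date.now()
}

type GlobalWithPrisma = typeof globalThis & { _prisma?: FakePrismaClient }

function getClient(): FakePrismaClient {
  const g = globalThis as GlobalWithPrisma
  if (!g._prisma) {
    g._prisma = new FakePrismaClient() // constructed at most once per process
  }
  return g._prisma
}

console.log(getClient() === getClient()) // → true
```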
Environment variables
| Variable | Default | Description |
|---|---|---|
| DATABASE_URL | required | Prisma connection string |
| SLOW_QUERY_MS | 200 | Queries above this threshold are logged as warnings |
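Conceptually, the SLOW_QUERY_MS behavior is a timing check around each query. How @pinaka/core actually hooks into Prisma is not shown in this README, so the wrapper below is a stand-in that only illustrates the threshold logic:

```typescript
// Conceptual sketch of the SLOW_QUERY_MS check: time a query and emit a
// warning when it exceeds the threshold. The real hook inside @pinaka/core
// may attach to Prisma events instead; this stand-in shows only the logic.
async function timedQuery<T>(
  run: () => Promise<T>,
  warn: (durationMs: number) => void,
  thresholdMs = 200, // mirrors the SLOW_QUERY_MS default
): Promise<T> {
  const start = Date.now()
  const result = await run()
  const durationMs = Date.now() - start
  if (durationMs > thresholdMs) warn(durationMs)
  return result
}

// Usage: a 25 ms "query" with a 1 ms threshold triggers the warning
const warnings: number[] = []
timedQuery(
  () => new Promise<string>((resolve) => setTimeout(() => resolve('rows'), 25)),
  (d) => warnings.push(d),
  1,
).then((rows) => console.log(rows, warnings.length)) // → rows 1
```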
Model registry
The registry is the source of truth for safe query behavior. Register your models at service startup.
```typescript
import { registerModel } from '@pinaka/core'

registerModel('invoice', {
  prismaModel: 'invoice',
  defaultSort: { field: 'issuedAt', order: 'desc' },
  sortableFields: ['issuedAt', 'amount', 'status'],
  filterableFields: ['status', 'clientId', 'branchId'],
  searchableFields: ['invoiceNumber'],
  includableRelations: ['client', 'lineItems'],
  selectableFields: ['id', 'invoiceNumber', 'amount', 'status', 'issuedAt', 'clientId'],
  hiddenFields: [],
  defaultLimit: 20,
  maxLimit: 100,
  isBranchScoped: true,
  archiveStrategy: { type: 'status', field: 'status', value: 'CANCELLED' },
  auditEnabled: true,
})
```

Pre-registered models: client, client_contact, client_engagement, task, subtask, compliance_item, case_record, document.
Startup validation
Call validateRegisteredModels() once at application startup (after Prisma is ready) to fail fast if any registered model doesn't exist in the consumer's Prisma schema:
```typescript
import { validateRegisteredModels, getPrisma } from '@pinaka/core'

const prisma = getPrisma()
await prisma.$connect()

// Fails immediately, listing ALL invalid models at once
validateRegisteredModels(prisma)
// ✓ Logs: "All registered models validated against Prisma schema" with model count
```

Reverse model lookup
```typescript
import { getRegistryKeyForPrismaModel } from '@pinaka/core'

getRegistryKeyForPrismaModel('clientContact') // → 'client_contact'
getRegistryKeyForPrismaModel('task')          // → 'task'
// Throws BadQueryError for unregistered Prisma models
```

Repository engine
```typescript
import {
  createOne, getOne, listMany, updateOne, archiveOne,
  count, exists, upsertOne, updateManyByIds, archiveManyByIds
} from '@pinaka/core'

// List with pagination/sort/filter/search — safe, registry-driven
// Offset pagination (default)
const result = await listMany('client', req.query, ctx)
// result.data: Client[]
// result.meta: { type: 'offset', page, limit, total, totalPages, hasNextPage, hasPrevPage }

// Cursor pagination — 2x faster (no COUNT query)
const cursorResult = await listMany('client', { cursor: 'last-seen-id', limit: 20 }, ctx)
// cursorResult.meta: { type: 'cursor', limit, nextCursor: 'abc123', hasMore: true }

// Create
const created = await createOne('client', { displayName: 'Acme Ltd', branchId: ctx.branchId }, ctx)

// Get by any where clause
const client = await getOne('client', { id: req.params.id }, ctx)
if (!client) throw new NotFoundError('Client not found')

// Update
const updated = await updateOne('client', id, { displayName: 'New Name' }, ctx)

// Archive (soft delete per model's archiveStrategy)
const archived = await archiveOne('client', id, ctx)

// Count / exists
const total = await count('task', { status: 'OPEN' }, ctx)
const alreadyExists = await exists('client', { pan: 'ABCDE1234F' }, ctx)
```

Branch scoping
Automatic. If a model has isBranchScoped: true, ctx.branchId is injected into every where clause. Cross-branch reads are blocked unless the caller explicitly passes adminOverrideBranch: true:

```typescript
// Admin-only: bypass branch scope
const allItems = await listMany('client', req.query, ctx, { adminOverrideBranch: true })
```

Cursor pagination
Pass a cursor (record ID) to switch from offset to cursor mode. Cursor mode skips the expensive COUNT(*) query — ideal for infinite scroll, large datasets, and real-time feeds.
```typescript
// First page — no cursor needed, use offset
const page1 = await listMany('task', { limit: 20, sortBy: 'createdAt', sortOrder: 'desc' }, ctx)

// Next page — pass the last item's ID as cursor
const page2 = await listMany('task', {
  cursor: page1.meta.type === 'offset' ? page1.data[page1.data.length - 1].id : page1.meta.nextCursor,
  limit: 20,
  sortBy: 'createdAt',
  sortOrder: 'desc',
}, ctx)
// page2.meta.type === 'cursor'
// page2.meta.hasMore === true/false
// page2.meta.nextCursor === 'id-of-last-row' or null

// Backward pagination
const prevPage = await listMany('task', {
  cursor: 'some-id',
  cursorDirection: 'backward',
  limit: 20,
}, ctx)
```

When to use which:
| Mode | Use case | Cost |
|------|----------|------|
| Offset (page/limit) | Admin tables, dashboards with page numbers | findMany + COUNT(*) |
| Cursor (cursor) | Infinite scroll, mobile feeds, large datasets | findMany only (2x faster) |
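For infinite scroll, the consuming loop keeps passing nextCursor until hasMore is false. The sketch below is self-contained: fetchPage stands in for listMany('task', { cursor, limit }, ctx) over an in-memory array, but the meta shape mirrors the cursor meta documented above:

```typescript
// Self-contained sketch of consuming cursor pagination to exhaustion.
// fetchPage is a stand-in for listMany; the meta shape follows the README.
interface CursorPage<T> {
  data: T[]
  meta: { type: 'cursor'; limit: number; nextCursor: string | null; hasMore: boolean }
}

const rows = Array.from({ length: 45 }, (_, i) => ({ id: `id-${i}` }))

function fetchPage(cursor: string | null, limit: number): CursorPage<{ id: string }> {
  const start = cursor ? rows.findIndex((r) => r.id === cursor) + 1 : 0
  const data = rows.slice(start, start + limit)
  const hasMore = start + limit < rows.length
  return {
    data,
    meta: { type: 'cursor', limit, nextCursor: hasMore ? data[data.length - 1].id : null, hasMore },
  }
}

// Collect every page by following nextCursor
const all: { id: string }[] = []
let cursor: string | null = null
do {
  const page = fetchPage(cursor, 20)
  all.push(...page.data)
  cursor = page.meta.nextCursor
} while (cursor !== null)

console.log(all.length) // → 45
```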
Multi-column sort
Sort by up to 3 fields. Missing sort orders default to 'asc':
```typescript
// Single field (unchanged)
const byName = await listMany('client', { sortBy: 'displayName', sortOrder: 'asc' }, ctx)

// Multi-column: primary + secondary sort
const byStatus = await listMany('client', {
  sortBy: ['status', 'createdAt'],
  sortOrder: ['asc', 'desc'],
}, ctx)
// → ORDER BY status ASC, createdAt DESC

// Missing orders pad with 'asc'
const tasks = await listMany('task', {
  sortBy: ['priority', 'dueDate', 'createdAt'],
  sortOrder: ['desc'], // only the first is specified
}, ctx)
// → ORDER BY priority DESC, dueDate ASC, createdAt ASC
```

Archive strategies
Configure per model in ModelConfig.archiveStrategy:
```typescript
// Status field: sets the status column to the specified value
{ type: 'status', field: 'status', value: 'CHURNED' }

// Boolean: sets isActive to false
{ type: 'is_active', field: 'isActive' }

// Timestamp: sets closedAt to now()
{ type: 'closed_at', field: 'closedAt' }
```

Lifecycle hooks
```typescript
import { registerHooks, normalizePAN, normalizeGSTIN } from '@pinaka/core'

registerHooks('client', {
  beforeCreate: async (data) => ({
    ...data,
    pan: normalizePAN(data.pan as string),
    gstin: normalizeGSTIN(data.gstin as string),
  }),
  afterCreate: async (client, ctx) => {
    // e.g. create the initial compliance calendar
  },
  beforeUpdate: async (id, data, ctx) => {
    // e.g. set closedAt when status → COMPLETED
    if (data.status === 'COMPLETED' && !data.closedAt) {
      return { ...data, closedAt: new Date() }
    }
    return data
  },
})
```

Error handling
```typescript
import {
  AppError, ValidationError, NotFoundError, ConflictError,
  expressErrorHandler
} from '@pinaka/core'

// Throw typed errors
throw new NotFoundError('Client not found', { clientId: id })
throw new ConflictError('PAN already registered')
throw new ValidationError('Invalid GSTIN format', { field: 'gstin' })

// Mount the error handler last
app.use(expressErrorHandler)

// Response shape:
// { success: false, errorCode: 'NOT_FOUND', message: 'Client not found', meta: { clientId: '...' } }
```

Auth helpers
```typescript
import {
  signToken, verifyToken, extractBearerToken,
  extractContextFromToken, hashPassword, comparePassword,
} from '@pinaka/core'

// Sign
const token = await signToken(
  { sub: user.id, roleId: user.roleId, roleCodes: ['CA'], branchId: user.branchId },
  process.env.JWT_SECRET!,
  '1h',
)

// Verify — throws UnauthorizedError on an invalid/expired token
const payload = await verifyToken(token, process.env.JWT_SECRET!)

// In auth middleware
const bearer = extractBearerToken(req.headers.authorization)
const ctxFields = await extractContextFromToken(bearer, process.env.JWT_SECRET!)

// Password hashing
const hash = await hashPassword(plainPassword) // bcrypt, 12 rounds
const valid = await comparePassword(plain, hash)
```

Audit logging
V1: audit events are written as structured Pino log lines with action: "AUDIT".
```typescript
import { logAuditEvent, buildAuditDiff } from '@pinaka/core'

// Automatic for create/update/archive when auditEnabled: true in ModelConfig
// Manual usage:
await logAuditEvent({
  entityType: 'invoice',
  entityId: invoice.id,
  action: 'CREATE',
  newValues: invoice,
  userId: ctx.userId,
  branchId: ctx.branchId,
  requestId: ctx.requestId,
})

// Diff helper for UPDATE events
const { old, new: next } = buildAuditDiff(before, after, ['passwordHash'])
```

To upgrade to DB-backed audit (V2): implement the logAuditEvent body to write to your audit_log table.
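A hedged sketch of what that V2 body could look like. The audit_log column names and the injected InsertRow indirection are assumptions made so the sketch stays self-contained; with Prisma the insert would be something like prisma.auditLog.create({ data: row }):

```typescript
// Hypothetical V2 body for logAuditEvent: write to an audit_log table instead
// of Pino. Column names and the InsertRow indirection are assumptions.
interface AuditEvent {
  entityType: string
  entityId: string
  action: 'CREATE' | 'UPDATE' | 'ARCHIVE'
  oldValues?: unknown
  newValues?: unknown
  userId?: string
  branchId?: string
  requestId: string
}

type InsertRow = (table: string, row: Record<string, unknown>) => Promise<void>

async function logAuditEventV2(event: AuditEvent, insert: InsertRow): Promise<void> {
  await insert('audit_log', {
    ...event,
    // Value snapshots are serialized; sensitive fields should already have
    // been stripped upstream (see getAuditExcludeFields).
    oldValues: JSON.stringify(event.oldValues ?? null),
    newValues: JSON.stringify(event.newValues ?? null),
    createdAt: new Date().toISOString(),
  })
}
```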
Utilities
```typescript
import { normalizePAN, normalizeGSTIN, normalizeEmail, omitHidden, maskValue } from '@pinaka/core'

normalizePAN(' abcde1234f ')        // → 'ABCDE1234F'
normalizeGSTIN(' 29ABCDE1234F1Z5 ') // → '29ABCDE1234F1Z5'
normalizeEmail('[email protected]')  // → '[email protected]'
omitHidden(record, ['passwordHash', 'storagePath'])
maskValue('ABCDE1234F')             // → 'AB******4F'
```

Response helpers
```typescript
import { successResponse, paginatedResponse, emptyResponse } from '@pinaka/core'

// Single result
res.json(successResponse(client))

// Paginated list (works with both offset and cursor modes)
res.json(paginatedResponse(await listMany('client', req.query, ctx)))
// Offset: { success: true, data: [...], meta: { type: 'offset', page, limit, total, totalPages, ... } }
// Cursor: { success: true, data: [...], meta: { type: 'cursor', limit, nextCursor, hasMore } }

// Empty (delete/archive)
res.status(204).json(emptyResponse())
```

Domain repository wrappers
```typescript
import { clientRepository, taskRepository, engagementRepository } from '@pinaka/core'

// Pre-built wrappers with injected filters
const activeClients = await clientRepository.listActiveClients(req.query, ctx)
const myTasks = await taskRepository.listMyOpenTasks(req.query, ctx)
const engagements = await engagementRepository.listClientEngagements(clientId, req.query, ctx)
```

Testing
```shell
# Unit tests (no database)
npm test

# Integration tests (requires database setup)
npx prisma generate --schema=tests/prisma/schema.prisma
npx prisma db push --schema=tests/prisma/schema.prisma
npm run test:integration

# All tests
npm run test:all
```

Sensitive field registry
Single source of truth for all sensitive field definitions. Add a field once — it propagates to logger redaction, audit exclusion, and response hiding automatically.
```typescript
import {
  SENSITIVE_FIELD_REGISTRY,
  getAllSensitiveFields,
  getAuditExcludeFields,
  getLogRedactionPaths,
  getSensitiveFieldsByCategory,
} from '@pinaka/core'

// View all categories
console.log(SENSITIVE_FIELD_REGISTRY)
// {
//   auth: ['password', 'passwordHash', 'token', 'accessToken', 'refreshToken', 'secret', ...],
//   identity: ['pan', 'gstin', 'aadhaar'],
//   storage: ['storagePath', 'storageKey'],
//   http: ['authorization', 'cookie'],
// }

// Flat list of every sensitive field
getAllSensitiveFields() // → ['password', 'passwordHash', ..., 'cookie']

// Fields excluded from audit diffs (auth + identity)
getAuditExcludeFields() // → ['password', ..., 'aadhaar']

// Pino redaction paths (top-level + wildcards + HTTP headers)
getLogRedactionPaths() // → ['password', '*.password', 'headers.authorization', ...]

// By category
getSensitiveFieldsByCategory('auth')     // → ['password', 'passwordHash', ...]
getSensitiveFieldsByCategory('identity') // → ['pan', 'gstin', 'aadhaar']
```

Adding a new sensitive field
Edit src/security/sensitive-fields.ts and add the field to the appropriate category. All consumers (logger, audit, model registry) automatically pick it up — no other changes needed.
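For example, adding a hypothetical bankAccountNumber field to the identity category would look like the snippet below. The registry literal mirrors the categories shown earlier, and the flatten helper imitates what getAllSensitiveFields() returns; both are illustrative stand-ins, not the package source:

```typescript
// Hypothetical edit to src/security/sensitive-fields.ts: add the field once,
// in its category. bankAccountNumber is an illustrative field name, and the
// helper below imitates getAllSensitiveFields() for demonstration.
const SENSITIVE_FIELD_REGISTRY = {
  auth: ['password', 'passwordHash', 'token', 'accessToken', 'refreshToken', 'secret'],
  identity: ['pan', 'gstin', 'aadhaar', 'bankAccountNumber'], // ← new field, added once
  storage: ['storagePath', 'storageKey'],
  http: ['authorization', 'cookie'],
}

function getAllSensitiveFields(): string[] {
  // Every consumer (logger redaction, audit exclusion, response hiding)
  // derives its list from this single registry.
  return Object.values(SENSITIVE_FIELD_REGISTRY).flat()
}

console.log(getAllSensitiveFields().includes('bankAccountNumber')) // → true
```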
Environment variables reference
See .env.example for a complete reference.
| Variable | Required | Default | Description |
|---|---|---|---|
| DATABASE_URL | yes | — | PostgreSQL connection string |
| NODE_ENV | no | development | Controls log format |
| LOG_LEVEL | no | debug/info | Pino log level |
| SERVICE_NAME | no | @pinaka/core | Log service field |
| SLOW_QUERY_MS | no | 200 | Slow query log threshold (ms) |
| JWT_SECRET | yes (for auth) | — | HS256 JWT signing secret |
| JWT_EXPIRES_IN | no | 1h | Default token expiry |
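A plausible local-development .env assembled from the table above. All values are placeholders (the SERVICE_NAME shown is a hypothetical consumer service); the authoritative reference remains .env.example:

```shell
# Placeholder values for local development — see .env.example for the real reference
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/pinaka_dev
NODE_ENV=development
LOG_LEVEL=debug
SERVICE_NAME=client-service
SLOW_QUERY_MS=200
JWT_SECRET=change-me-in-production
JWT_EXPIRES_IN=1h
```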
