@datatechsolutions/windsock v1.5.21 (3,208 downloads)
Auth platform: browser SDK, types/errors/constants. Auth UI lives in @datatechsolutions/ui/platform/*.
@datatechsolutions/auth
Full-stack authentication library for the Datatech Solutions platform. Handles JWT signing/verification, session management, MFA, OAuth, RBAC, Lambda authorization, and Next.js integration.
Install
npm install @datatechsolutions/auth

Peer dependency: @datatechsolutions/shared-domain
Features
- RS256 JWT — RSA token signing/verification via jose
- Session Management — Cookie-based sessions with configurable expiry
- MFA — TOTP (Google Authenticator) + backup codes
- OAuth — Connector-kit pattern with Google, GitHub, Discord, and Microsoft Entra providers via arctic
- RBAC — Role definitions, permission checker, middleware guards
- Lambda Authorizer — API Gateway v2 JWT validation
- Next.js Integration — Middleware, route handlers, React hooks
- Client SDK — Browser AuthClient, React context provider, PKCE utilities
- Token Mappers — OIDC claim mapper types: hardcoded, user-property, audience, realm-role, client-role, group-membership, pairwise-subject, address, script
- Federation — Registry for external identity providers
- Account Security — Lockout policies, rate limiting, CSRF protection
- TypeORM Adapter — Database adapter for PostgreSQL via TypeORM
- UMA 2.0 — Resource RBAC, policy engine (7 evaluator types × 3 strategies), hierarchical delegation, UMA protocol endpoints (discovery, resource registration, permission tickets, RPT grant)
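The core entry point signs and verifies RS256 JWTs via jose. What that amounts to can be sketched with Node's built-in crypto module; this is an illustration of the mechanics only, not the package's actual API:

```typescript
// Minimal RS256 JWT round-trip with node:crypto. Illustrates what
// signJwt/verifyJwt do conceptually; the library itself uses jose.
import { createSign, createVerify, generateKeyPairSync } from 'node:crypto';

const b64url = (data: Buffer | string): string =>
  Buffer.from(data).toString('base64url');

function signJwtRs256(payload: object, privateKey: string): string {
  const header = b64url(JSON.stringify({ alg: 'RS256', typ: 'JWT' }));
  const body = b64url(JSON.stringify(payload));
  const signer = createSign('RSA-SHA256').update(`${header}.${body}`);
  return `${header}.${body}.${signer.sign(privateKey, 'base64url')}`;
}

function verifyJwtRs256(token: string, publicKey: string): object | null {
  const [header, body, sig] = token.split('.');
  const ok = createVerify('RSA-SHA256')
    .update(`${header}.${body}`)
    .verify(publicKey, sig, 'base64url');
  return ok ? JSON.parse(Buffer.from(body, 'base64url').toString()) : null;
}

const { publicKey, privateKey } = generateKeyPairSync('rsa', {
  modulusLength: 2048,
  publicKeyEncoding: { type: 'spki', format: 'pem' },
  privateKeyEncoding: { type: 'pkcs8', format: 'pem' },
});

const token = signJwtRs256({ sub: 'user-1', iss: 'datatech-auth' }, privateKey);
console.log(verifyJwtRs256(token, publicKey)); // → { sub: 'user-1', iss: 'datatech-auth' }
```

In the real library the private key stays server-side (or in KMS, see below) and only the public key is distributed to verifiers such as the Lambda authorizer.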
Passwordless Configuration
Passwordless defaults can be overridden in two ways:
- AuthConfig.passwordless (highest priority)
- Environment variables (fallback)
Environment variables:
- AUTH_PASSWORDLESS_ENABLED
- AUTH_PASSWORDLESS_AUTO_REGISTER
- AUTH_PASSWORDLESS_MAGIC_LINK_TTL_SECONDS
- AUTH_PASSWORDLESS_OTP_TTL_SECONDS
- AUTH_PASSWORDLESS_SEND_RATE_LIMIT_MAX
- AUTH_PASSWORDLESS_SEND_RATE_LIMIT_WINDOW_MS
- AUTH_PASSWORDLESS_VERIFY_ATTEMPT_MAX
- AUTH_PASSWORDLESS_VERIFY_ATTEMPT_WINDOW_MS
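The precedence rule can be sketched as follows. Field names and default values here are illustrative, not the actual AuthConfig shape:

```typescript
// Hypothetical sketch of the override order: explicit AuthConfig.passwordless
// values win, environment variables fill the gaps. Names/defaults illustrative.
type Env = Record<string, string | undefined>;

interface PasswordlessSettings {
  enabled: boolean;
  magicLinkTtlSeconds: number;
}

function resolvePasswordless(
  overrides: Partial<PasswordlessSettings>,
  env: Env,
): PasswordlessSettings {
  return {
    enabled: overrides.enabled ?? env.AUTH_PASSWORDLESS_ENABLED === 'true',
    magicLinkTtlSeconds:
      overrides.magicLinkTtlSeconds ??
      Number(env.AUTH_PASSWORDLESS_MAGIC_LINK_TTL_SECONDS ?? 900),
  };
}

console.log(
  resolvePasswordless(
    { enabled: true }, // AuthConfig.passwordless wins over the env var
    { AUTH_PASSWORDLESS_ENABLED: 'false', AUTH_PASSWORDLESS_MAGIC_LINK_TTL_SECONDS: '600' },
  ),
); // → { enabled: true, magicLinkTtlSeconds: 600 }
```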
OIDC Provider Core (Migration Mode)
The package now includes a feature-flagged oidc-provider runtime bridge for protocol-core migration.
Environment variables:
- AUTH_OIDC_PROVIDER_ENABLED=true — delegate OIDC endpoints to oidc-provider
- AUTH_DYNAMODB_TABLE — required for persistent provider model storage
- AUTH_DYNAMODB_REGION / AUTH_DYNAMODB_ENDPOINT — optional DynamoDB client overrides
Delegated paths when enabled:
- /authorize
- /token
- /revoke
- /introspect
- /userinfo
- /jwks
- /par
- /device/authorize
- /.well-known/openid-configuration
- /interaction/*
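When the flag is on, requests for these paths are routed to the oidc-provider bridge. A hypothetical gate (helper name illustrative; the path list is the one documented above):

```typescript
// Hypothetical helper deciding whether a request path is delegated to the
// oidc-provider bridge when AUTH_OIDC_PROVIDER_ENABLED=true.
const DELEGATED_EXACT = [
  '/authorize', '/token', '/revoke', '/introspect', '/userinfo',
  '/jwks', '/par', '/device/authorize', '/.well-known/openid-configuration',
];

function isDelegatedPath(path: string, enabled: boolean): boolean {
  if (!enabled) return false;
  return DELEGATED_EXACT.includes(path) || path.startsWith('/interaction/');
}

console.log(isDelegatedPath('/token', true));            // → true
console.log(isDelegatedPath('/interaction/login', true)); // → true
console.log(isDelegatedPath('/token', false));            // → false
```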
Entry Points
| Import Path | Description |
|-------------|-------------|
| @datatechsolutions/auth | Core — crypto, session, RBAC, MFA, OAuth, security, errors, types, constants |
| @datatechsolutions/auth/nextjs | Next.js — createAuthMiddleware(), createAuthHandlers(), getSession(), useSession(), useAuth() |
| @datatechsolutions/auth/lambda | Lambda — createAuthorizer(), handler factories (token, OAuth, MFA, account, admin) |
| @datatechsolutions/auth/client | Browser — AuthClient, AuthProvider, useAuth(), usePermission(), useRole(), PKCE |
| @datatechsolutions/windsock/adapters/typeorm | TypeORM adapter for all auth stores |
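The client entry point ships PKCE utilities. The S256 transform at their core is just a base64url-encoded SHA-256 of the verifier; a sketch (the function name is illustrative, not the exported API):

```typescript
// PKCE S256: code_challenge = BASE64URL(SHA256(code_verifier)), per RFC 7636.
// Helper name is illustrative, not the package's exported API.
import { createHash } from 'node:crypto';

function codeChallengeS256(verifier: string): string {
  return createHash('sha256').update(verifier).digest('base64url');
}

// RFC 7636 Appendix B test vector:
console.log(codeChallengeS256('dBjftJeZ4CVP-mB92K27uhbUJU1p1r_wW1gFWFOEjXk'));
// → E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM
```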
Architecture Overview
```
Client (Browser)
│
├─ Login ──→ Auth handler ──→ signJwt(RS256) ──→ Set cookies
│
├─ API call ──→ Authorization: Bearer <jwt>
│                    │
│              API Gateway v2
│                    │
│        Lambda Authorizer (createAuthorizer)
│                    │
│        verifyJwt(RS256) → extract claims
│                    │
│        Inject into event context
│
├─ MFA ──→ TOTP challenge ──→ verify code ──→ issue full token
│
└─ OAuth ──→ Google/GitHub ──→ callback ──→ link account ──→ issue token
```

Lambda Handlers
| Factory | Purpose |
|---------|---------|
| createAuthTokenHandler() | Login, token refresh |
| createAuthOAuthHandler() | OAuth callbacks |
| createAuthMFAHandler() | MFA challenge/verify |
| createAuthAccountHandler() | Account management |
| createAuthAdminHandler() | Admin operations |
| createAuthAdminOrganizationsHandler() | Organization management |
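The authorizer side of these handlers ultimately answers API Gateway v2 in the "simple response" shape, with verified claims injected as context for downstream Lambdas. A sketch of that mapping (claim fields and flattening are illustrative, not this package's internals):

```typescript
// API Gateway HTTP API (v2) Lambda authorizer simple response: return
// isAuthorized plus a context object that downstream handlers can read.
interface Claims { sub: string; org?: string; roles?: string[] }

function toAuthorizerResponse(
  claims: Claims | null,
): { isAuthorized: boolean; context?: Record<string, string> } {
  if (!claims) return { isAuthorized: false };
  return {
    isAuthorized: true,
    context: {
      // arrays joined into a comma-separated string for easy downstream use
      sub: claims.sub,
      org: claims.org ?? '',
      roles: (claims.roles ?? []).join(','),
    },
  };
}

console.log(toAuthorizerResponse({ sub: 'u1', roles: ['admin'] }));
// → { isAuthorized: true, context: { sub: 'u1', org: '', roles: 'admin' } }
```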
Authorizer Bootstrap With PAT
To enable Personal Access Token (PAT) authorization in the API Gateway authorizer, bootstrap it with adapter wiring:

```ts
import { createAuthorizerWithAdapter } from '@datatechsolutions/auth/lambda'
import { createTypeOrmAdapter } from '@datatechsolutions/windsock/adapters/typeorm'

const adapter = createTypeOrmAdapter(db)

export const handler = createAuthorizerWithAdapter({
  publicKey: process.env.AUTH_PUBLIC_KEY_PEM!,
  issuer: 'datatech-auth',
  audience: 'fuel-price-ai',
  adapter,
})
```

CDK Infrastructure
The cdk/ directory contains AWS CDK stacks for the auth service:
- CoreStack — Aurora PostgreSQL, RDS Proxy, DynamoDB runtime state
- NetworkStack — VPC, subnets, security groups
- ApiStack — API Gateway, Lambda functions
- Custom resources: Ed25519 key generation, CSRF secret, RLS setup
Scripts
| Script | Command | Description |
|--------|---------|-------------|
| build | tsc && tsc-alias | Compile TypeScript |
| watch | tsc -w | Watch mode |
| clean | rm -rf dist | Remove build output |
| test | vitest run | Run all tests |
| test:e2e | Vitest targeted suites | End-to-end auth protocol flows (OIDC/SAML/SCIM critical paths) |
| test:security | Vitest + coverage + JSON report | Security-focused suite with coverage output (coverage/security) |
| test:contracts | vitest run __tests__/contracts | API contract/versioning tests |
| test:synthetic:auth | node scripts/run-auth-synthetic-checks.mjs | Synthetic auth endpoint checks against AUTH_SYNTHETIC_BASE_URL |
| report:security | Node script | Generate security gate summary + enforce coverage thresholds |
| test:coverage:unit | Vitest + coverage | Unit/lambda coverage report (coverage/unit) |
| test:lambda | vitest run --dir __tests__/lambda | Test Lambda handlers |
| test:integration | Migrate + test | Integration tests with real DB |
| test:integration:kms | vitest run --config vitest.integration.config.ts __tests__/integration/kms-jwt.integration.test.ts | KMS RS256 integration simulation (rotation/rollback/failure modes) |
| test:integration:coverage | Docker Compose + Vitest coverage | Integration tests with coverage report |
| coverage:merge | Node script | Merge unit + integration coverage reports (coverage/merged) |
| test:coverage:all | chained script | Run unit coverage + integration coverage + merged summary |
| type-check | tsc --noEmit | Type check only |
| validate:release-policy | node scripts/validate-release-policy.mjs | Enforce versioning policy/release checklist artifacts |
| validate:observability | node scripts/validate-observability-baseline.mjs | Enforce observability baseline artifacts |
| lambda:build | node esbuild.lambda.mjs | Bundle Lambda for deployment |
| dev:db | Docker Compose | Local PostgreSQL + DynamoDB |
| dynamodb:local:init | node scripts/init-dynamodb-local.mjs | Create local DynamoDB table + TTL |
| local:smoke:compose | chained script | Run smoke check against Compose auth-api |
| dev:compose | Docker Compose | Start full local stack including auth-api container on :3002 |
| dev:compose:logs | Docker Compose | Tail auth-api container logs |
| dev:compose:down | Docker Compose | Stop and remove local Compose stack |
| local:integration:compose | chained script | Run full integration checks against Compose auth-api (http://127.0.0.1:3002) |
CI Gate Policy
CI enforces the following suites on pull requests to main/release/**, pushes to main/release/**, and release tags (v*):
| Suite | Command | Gate |
|---|---|---|
| Unit | npm test | Required |
| Integration | npx vitest run --config vitest.integration.config.ts | Required |
| E2E | npm run test:e2e | Required |
| Contracts | npm run test:contracts | Required |
| Release Policy | npm run validate:release-policy | Required |
| Observability | npm run validate:observability + npm run test:synthetic:auth | Required |
| Security | npm run test:security + npm run report:security | Required |
Security gate minimum coverage thresholds:
- Lines: >= 40%
- Functions: >= 40%
- Statements: >= 40%
- Branches: >= 35%
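These thresholds map onto Vitest's coverage.thresholds option; a sketch of a matching config (file name and exact wiring in this repo may differ):

```typescript
// vitest.security.config.ts (illustrative; the repo's actual config may differ)
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    coverage: {
      reportsDirectory: 'coverage/security',
      thresholds: { lines: 40, functions: 40, statements: 40, branches: 35 },
    },
  },
});
```

With thresholds set, `vitest run --coverage` exits nonzero when any metric falls below its floor, which is what lets CI gate on it.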
Release publish (Publish to npm) is blocked unless all suites pass and the security threshold check passes. CI uploads security-tests.json, coverage/security/coverage-summary.json, and security-summary.md as artifacts and writes the security status to the GitHub job summary.
Security governance documents:
- SECURITY_THREAT_MODEL.md
- SECURITY_TEST_MATRIX.md
- KMS_RS256_ROTATION_RUNBOOK.md
- JWT_RS256_ROLLOUT.md
- API_VERSIONING_POLICY.md
- RELEASE_CHECKLIST.md
- API_MIGRATION_TEMPLATE.md
- OBSERVABILITY_SLO_SLI.md
- ALERT_TRIAGE_RUNBOOK.md
Local Lambda + DynamoDB smoke test
This package includes a local HTTP server setup to verify DynamoDB-backed auth components (OIDC adapter, OAuth transaction store, WebAuthn challenge store, and rate limiter) against DynamoDB Local, without SAM.
Prerequisites:
- Docker running
Run step-by-step:

```sh
npm run build
npm run dev:db
npm run dynamodb:local:init
```

Run the full local stack in Docker Compose (including the auth API container):

```sh
npm run dev:compose
curl http://127.0.0.1:3002/local/dynamodb/smoke
```

Run the smoke check against the Compose API container:

```sh
npm run local:smoke:compose
```

Run full integration checks against the Compose API container:

```sh
npm run local:integration:compose
```

Run integration tests with code coverage:

```sh
npm run test:integration:coverage
```

Run KMS RS256 integration tests:

```sh
npm run test:integration:kms
```

Safe port overrides (optional):

```sh
AUTH_DEV_API_PORT=3302 AUTH_DEV_DYNAMODB_PORT=8100 AUTH_DEV_POSTGRES_PORT=55433 npm run dev:compose
```

KMS integration env/setup (staging/manual):

- AUTH_JWT_SIGNING_PROVIDER=kms
- AUTH_JWT_KMS_KEY_ID=<rsa-kms-key-id>
- Runtime must provide jwtKmsSigner using AWS KMS Sign with algorithm=RSASSA_PKCS1_V1_5_SHA_256 and messageType=RAW
- Optional rotation overlap: AUTH_JWT_PREVIOUS_PUBLIC_KEYS='["<old-public-key-pem>"]'
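With @aws-sdk/client-kms, the signer settings above correspond to a SignCommand input roughly like the following. The helper name is illustrative; the real jwtKmsSigner wiring may differ:

```typescript
// Illustrative: build the KMS Sign input for RS256 JWT signing. A real
// jwtKmsSigner would pass this to SignCommand from @aws-sdk/client-kms
// and base64url-encode the returned Signature as the JWT's third segment.
function buildKmsSignInput(keyId: string, signingInput: string) {
  return {
    KeyId: keyId,
    Message: Buffer.from(signingInput),          // "<header>.<payload>" bytes
    MessageType: 'RAW' as const,                 // per the setup above
    SigningAlgorithm: 'RSASSA_PKCS1_V1_5_SHA_256' as const,
  };
}

const input = buildKmsSignInput('alias/auth-jwt', 'header.payload');
console.log(input.SigningAlgorithm); // → RSASSA_PKCS1_V1_5_SHA_256
```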
Files used by local integration:
- scripts/local-api-server.mjs
- scripts/local-dynamodb-smoke-handler.mjs
- scripts/local-auth-runtime.mjs
- scripts/run-full-integration.mjs
Stripe Integration — Dev Loop
Windsock issues Checkout / Billing-Portal sessions and listens to
customer.subscription.*, checkout.session.*, and invoice.* webhooks to
keep auth_subscriptions in sync. Everything runs against a Stripe test
account in dev.
One-time Stripe setup
- Create a Stripe account (or open your existing one) and switch to Test mode.
- Dashboard → Developers → API keys: copy the Secret key (sk_test_...).
- Dashboard → Product catalog → Add product — create one product per plan code (starter, professional, enterprise). On each product set metadata.plan_code = <code> so /admin/stripe/sync-prices can match. Add two recurring prices per product — one monthly, one yearly.
- Dashboard → Settings → Billing → Customer portal: save the configuration once (features: update payment method, cancel, switch plan). Portal sessions return 400 until this is saved.
Env vars (windsock-api)
```sh
STRIPE_SECRET_KEY=sk_test_...
STRIPE_WEBHOOK_SECRET=whsec_...        # from `stripe listen`, see below
AUTH_APP_URL=http://localhost:3200     # success/cancel/return URL base
```

In prod these live in AWS Secrets Manager and are injected into the Lambda environment.
Run the local webhook loop
```sh
# 1. Install the Stripe CLI once
brew install stripe/stripe-cli/stripe
stripe login

# 2. Forward test-mode events to windsock-api on port 3002
stripe listen --forward-to localhost:3002/stripe/webhook
# → prints a whsec_... — paste it into STRIPE_WEBHOOK_SECRET, restart windsock-api
```

With the listener running, every checkout completion / subscription change you trigger in test mode hits your local handler within a second.
Sync prices from Stripe into the DB
After creating Products + Prices in the Dashboard:
```sh
curl -X POST http://localhost:3002/admin/stripe/sync-prices \
  -H "Authorization: Bearer $(datatech-cli auth token)"
```

The response reports updated[] (plans whose stripe_price_ids were set) and skipped_products[] (products missing metadata.plan_code or a matching auth_plans.code). Re-running is safe — it overwrites.
End-to-end test
```sh
# Create a Checkout session for the current org
curl -X POST http://localhost:3002/subscription/checkout \
  -H "Authorization: Bearer $TOKEN" -H 'Content-Type: application/json' \
  -d '{"planCode":"professional","billingInterval":"monthly"}'
# → { "url": "https://checkout.stripe.com/c/pay/..." }

# Pay with test card 4242 4242 4242 4242 — any future exp, any CVC.
# Stripe fires checkout.session.completed → webhook upserts
# auth_subscriptions with stripe_customer_id + stripe_subscription_id.

# Confirm:
curl http://localhost:3002/subscription -H "Authorization: Bearer $TOKEN"
# → { "subscription": { "status": "active", "stripeSubscriptionId": "sub_...", ... } }
```

Events handled by POST /stripe/webhook
| Event | Action |
|---|---|
| checkout.session.completed | Upsert org's subscription with stripe customer/subscription ids + plan |
| checkout.session.async_payment_succeeded | Same path for delayed payment methods (ACH/SEPA) |
| customer.subscription.created / updated | Refresh status + current_period_{start,end} |
| customer.subscription.deleted | Mark canceled |
| invoice.paid / invoice.payment_succeeded | Log (subscription.updated carries state) |
| invoice.payment_failed | Mark past_due |
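The table above amounts to a straightforward dispatch on the event type. A sketch (the action names and store calls are hypothetical placeholders, not this package's API):

```typescript
// Illustrative event routing for POST /stripe/webhook. Action names are
// placeholders; the real handler updates the auth_subscriptions table.
type SubscriptionAction =
  | { kind: 'upsert' } | { kind: 'refresh' } | { kind: 'cancel' }
  | { kind: 'past_due' } | { kind: 'log' } | { kind: 'ignore' };

function routeStripeEvent(type: string): SubscriptionAction {
  switch (type) {
    case 'checkout.session.completed':
    case 'checkout.session.async_payment_succeeded':
      return { kind: 'upsert' };    // store customer/subscription ids + plan
    case 'customer.subscription.created':
    case 'customer.subscription.updated':
      return { kind: 'refresh' };   // status + current_period_{start,end}
    case 'customer.subscription.deleted':
      return { kind: 'cancel' };
    case 'invoice.payment_failed':
      return { kind: 'past_due' };
    case 'invoice.paid':
    case 'invoice.payment_succeeded':
      return { kind: 'log' };       // subscription.updated carries the state
    default:
      return { kind: 'ignore' };
  }
}

console.log(routeStripeEvent('customer.subscription.deleted')); // → { kind: 'cancel' }
```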
RFC Compliance Matrix
Status legend:

- Implemented — shipped and tested
- Guarded — implemented with strict production guardrails
- Partial — core behavior exists, advanced clauses still pending
- Planned — not implemented yet
| Spec | Feature | Status | Endpoints / Notes |
|------|---------|--------|-------------------|
| OpenID Connect Discovery 1.0 | Discovery document | Guarded | GET /.well-known/openid-configuration |
| RFC 7517 (JWK) | JWKS publication | Implemented | GET /jwks (kid/alg/use included; discovery advertises this URI) |
| OIDC Core | Authorization code flow + PKCE S256 | Implemented | GET /authorize, POST /token (authorization_code), code_challenge_method=S256 |
| OIDC Core | UserInfo | Guarded | GET /userinfo, scope-based claim filtering (openid/profile/email) |
| OIDC Core | Nonce in ID token | Implemented | nonce accepted and propagated to ID token |
| RFC 7662 | Token Introspection | Guarded | POST /introspect and POST /token/introspect |
| RFC 7009 | Token Revocation | Guarded | POST /revoke and POST /token/revoke |
| RFC 8628 | Device Authorization Grant | Guarded | POST /device/authorize, POST /device/verify, GET /device/verify, POST /token (device_code) |
| RFC 8693 | Token Exchange | Implemented | POST /token with strict client auth + per-client grant enablement, supports subject_token_type (access_token, urn:logto:token-type:personal_access_token), optional actor token validation + act claim propagation, requested_token_type (access_token, refresh_token, id_token), and single target resource/audience |
| RFC 9101 | JWT-Secured Authorization Request (JAR) | Guarded | Delegated oidc-provider enables request and request_uri processing; strict mode can require signed request objects (AUTH_OIDC_REQUIRE_SIGNED_REQUEST_OBJECT=true or AUTH_OAUTH_STRICT_RFC_MODE=true) |
| RFC 9126 | Pushed Authorization Requests (PAR) | Guarded | POST /par, request_uri consumption in GET /authorize |
| RFC 9449 | DPoP | Guarded | oidc-provider DPoP enabled and token-exchange path enforces DPoP JWT proof validation (htm/htu/iat/jti/signature), replay detection, and access-token cnf.jkt binding |
| OAuth 2.0 Core | Client Credentials Grant | Implemented | POST /token (client_credentials) |
| OAuth 2.0 Core | Refresh Token Grant | Implemented | POST /token/refresh, POST /token (refresh_token) |
| Provider Core | oidc-provider runtime delegation | Implemented | Feature-flagged via AUTH_OIDC_PROVIDER_ENABLED; persistent models in DynamoDB |
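An RFC 8693 token-exchange call against POST /token is ordinary form encoding over the parameters described in the table; a sketch with placeholder values (the subject_token_type URNs supported here are listed in the RFC 8693 row above):

```typescript
// Illustrative RFC 8693 token-exchange request body for POST /token.
// Parameter names come from the RFC; values are placeholders.
const body = new URLSearchParams({
  grant_type: 'urn:ietf:params:oauth:grant-type:token-exchange',
  subject_token: '<subject-access-token>',
  subject_token_type: 'urn:ietf:params:oauth:token-type:access_token',
  requested_token_type: 'urn:ietf:params:oauth:token-type:access_token',
  resource: 'https://api.example.com',   // single target, per the table above
  audience: 'fuel-price-ai',
});

console.log(body.get('grant_type')); // → urn:ietf:params:oauth:grant-type:token-exchange
```

The request must also carry strict client authentication, and token exchange must be enabled per client, as noted in the table.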
Strict Production RFC Guard
Enable strict guard mode in Lambda environment:
```sh
AUTH_OAUTH_STRICT_RFC_MODE=true
```

When enabled, the service enforces confidential-client authentication for sensitive endpoints:
- Introspection
- Revocation (body token mode)
- PAR
- Device authorization start
Optional fine-grained toggles (override strict default behavior):
- oauth.requireConfidentialClientForIntrospection
- oauth.requireConfidentialClientForRevocation
- oauth.requireConfidentialClientForPar
- oauth.requireConfidentialClientForDeviceAuthorize
- oauth.requireDpopForTokenExchange
- oauth.requireSignedRequestObjectForAuthorize
Telemetry and forensics baseline:
- Standardized grant telemetry is emitted to the audit store (telemetry.token.*, telemetry.authorize.*) with: eventType, grantType, clientId, subjectSub, actorSub, organizationId, resource, audience, scope, result, errorCode, latencyMs, requestId, correlationId.
- DPoP verification emits forensic events (telemetry.token.dpop) for success/failure with replay outcome and verification metadata (jkt, jti, iat, htm, htu, reject reason).
Known Gaps for Full Certification
- Token exchange supports access-token and PAT subject-token exchange, actor-token validation with act claim propagation, and requested_token_type for access/refresh/ID tokens with a single target resource/audience.
- JAR is enabled through delegated provider configuration; production deployments should enforce signed request objects in strict mode.
