eggi-ai-db-schema-2
Type-safe database schema and client for Eggi.AI with AWS RDS integration using native PostgreSQL queries. Perfect for Lambda functions, backend services, and any Node.js application that needs secure, performant database access.
📦 Installation
npm install eggi-ai-db-schema-2
🎯 What's New
v12.13.0 - Organizations in Mapping Jobs
- ✅ Organizations Array: Mapping jobs now include which workspaces they were exported to
- ✅ Multiple Organizations: Support for jobs exported to multiple organizations
v12.12.0 - Unlimited Quota Support
- ✅ Null for Unlimited: A null quotaLimit indicates unlimited quotas (PRO_VERSION with active subscription); see the sketch after this list
- ✅ UTC Period Calculations: Fixed timezone issues; all quota periods use UTC for consistent boundaries
- ✅ Always Returns Both: Daily and monthly quotas are always returned for all organizations
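As an illustration of the null convention, calling code can treat a missing limit as unlimited. A minimal sketch (the record shape is inferred from the monitoring.mapping_job_quotas columns in the schema overview below; hasQuotaRemaining is a hypothetical helper, not a package export):
// Hypothetical helper illustrating the null-means-unlimited convention.
interface QuotaRecord {
  quotaLimit: number | null; // null => unlimited (PRO_VERSION with active subscription)
  currentUsage: number;
}

function hasQuotaRemaining(quota: QuotaRecord): boolean {
  return quota.quotaLimit === null || quota.currentUsage < quota.quotaLimit;
}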
v12.0.0 - Native PostgreSQL Migration
🚀 Native PostgreSQL Migration: Complete removal of Drizzle ORM in favor of native PostgreSQL queries for better performance, flexibility, and control.
- ✅ Direct SQL queries using the pg library
- ✅ Full type safety with TypeScript interfaces
- ✅ Better performance without ORM overhead
- ✅ More control over query optimization
- ✅ Cleaner code with parameterized queries
📚 Important Documentation
- CHANGELOG.md - Complete version history and changes
- Migration System - Custom migration management system
🗄️ Database Management
Schema Architecture
This package uses a multi-schema PostgreSQL architecture:
- public schema: Core application tables (users, skills, organizations, etc.)
- linkedin schema: LinkedIn-specific data (accounts, work_experience, education, etc.)
- monitoring schema: System tracking (mapping_jobs, llm_inference_jobs, etc.)
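Queries therefore reference tables with explicit schema qualifiers. A minimal sketch using the getDb and query helpers described later (join columns are inferred from the schema overview table below):
import { getDb, query } from "eggi-ai-db-schema-2";

// Cross-schema join spanning the public, linkedin, and monitoring schemas.
const db = await getDb();
const rows = await query(
  db,
  `
  SELECT u.id, la.public_identifier, mj.completed_at
  FROM public.users u
  LEFT JOIN linkedin.accounts la ON la.user_id = u.id
  LEFT JOIN monitoring.mapping_jobs mj ON mj.linkedin_account_id = la.id
  WHERE u.id = $1
  `,
  [123]
);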
Custom Migration System
This package uses a sophisticated custom migration system that provides:
- Hash Validation: Detects if local files have been modified after being applied
- Environment Support: Separate dev and prod database connections via AWS Secrets Manager
- Comprehensive Error Handling: Clear error messages and validation
- Unified Interface: Single script for all migration operations
Migration Commands
# Unified migration script
./scripts/migrate.sh dev status # Check dev migration status
./scripts/migrate.sh dev apply # Apply dev migrations
./scripts/migrate.sh dev init # Initialize dev migration system
./scripts/migrate.sh prod status # Check prod migration status
./scripts/migrate.sh prod apply # Apply prod migrations
./scripts/migrate.sh prod init # Initialize prod migration system
# Using npm scripts
npm run migrate:dev:status # Check dev migration status
npm run migrate:dev:apply # Apply dev migrations
npm run migrate:dev:init # Initialize dev migration system
npm run migrate:prod:status # Check prod migration status
npm run migrate:prod:apply # Apply prod migrations
npm run migrate:prod:init    # Initialize prod migration system
Migration Files
- Location: migrations/ directory
- Format: SQL files with naming pattern 0000_description.sql
- Validation: SHA256 hash validation ensures file integrity
- Tracking: Custom migrations.applied_migrations table tracks applied migrations
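For illustration, hash validation amounts to comparing the SHA256 of the local file with the value recorded at apply time. A minimal sketch (the name and hash column names are assumptions for illustration; the real migration script may differ):
import { createHash } from "node:crypto";
import { readFile } from "node:fs/promises";
import { getDb, queryOne } from "eggi-ai-db-schema-2";

// Returns true when the migration is unapplied or its contents match the recorded hash.
async function verifyMigration(fileName: string): Promise<boolean> {
  const contents = await readFile(`migrations/${fileName}`, "utf8");
  const localHash = createHash("sha256").update(contents).digest("hex");

  const db = await getDb();
  const applied = await queryOne<{ hash: string }>(
    db,
    `SELECT hash FROM migrations.applied_migrations WHERE name = $1`,
    [fileName]
  );

  return !applied || applied.hash === localHash;
}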
🚀 Quick Start
For Lambda Functions (Recommended Approach)
import { dbService } from "eggi-ai-db-schema-2";
// Initialize database connection at module level for reuse across Lambda invocations
// This ensures optimal connection pooling and prevents race conditions
const dbInitialization = dbService.initialize({
connection: {
databaseUrl: process.env.DATABASE_URL,
},
poolConfig: {
max: 1, // Single connection per Lambda container (optimal for Lambda)
idleTimeoutMillis: 120000,
connectionTimeoutMillis: 30000,
},
});
export const handler = async event => {
// Ensure database is ready (reuses connection across invocations)
await dbInitialization;
// Use high-level database service methods
const mappingJob = await dbService.createMappingJobForLinkedInAccount({
linkedinAccountId: 123,
jobMetadata: { source: "lambda" },
});
const userProfile = await dbService.getUserById(123);
return {
statusCode: 200,
body: JSON.stringify({ mappingJob, userProfile }),
};
};
Direct Database Access
import { initializeDb, getDb } from "eggi-ai-db-schema-2";
export const handler = async event => {
try {
// Initialize with smart defaults based on environment
await initializeDb({
poolConfig: {
max: 1, // Lambda: 1, Local: 3, ECS: 5
},
});
const db = await getDb();
// Execute native SQL queries with full type safety
const result = await db.query(
`
SELECT u.id, u.given_name, u.family_name, la.forager_id
FROM public.users u
LEFT JOIN linkedin.accounts la ON u.id = la.user_id
WHERE u.id = $1
`,
[123]
);
const user = result.rows[0];
return {
statusCode: 200,
body: JSON.stringify(user),
};
} catch (error) {
console.error("Database error:", error);
throw error;
}
};
Normalized Skills Architecture
import { findOrCreateSkills, addProfileSkills, getPopularSkills } from "eggi-ai-db-schema-2";
// Process LinkedIn profile skills (auto-normalized to lowercase)
const skillIds = await findOrCreateSkills(db, ["JavaScript", "Machine Learning", "Leadership"]);
// Add skills to a profile
await addProfileSkills(db, linkedinAccountId, [
{ skillId: skillIds[0], displayName: "JavaScript" },
{ skillId: skillIds[1], displayName: "Machine Learning" },
]);
// Get analytics
const popularSkills = await getPopularSkills(db, 50);
// Returns most popular skills across all profiles
LinkedIn Account Operations
This package provides provider-specific operations for LinkedIn accounts:
Forager Provider Operations:
import {
findLinkedInAccountByForagerId,
findLinkedInAccountByPublicIdentifier,
findOrCreateLinkedInAccountByForagerId,
storeCompleteLinkedInProfileByForagerId,
} from "eggi-ai-db-schema-2";
Ghost Genius Provider Operations:
import {
findLinkedInAccountByAcoa,
findOrCreateLinkedInAccountByAcoa,
storeCompleteLinkedInProfileByAcoa,
} from "eggi-ai-db-schema-2";
Shared Operations:
import {
findLinkedInAccountByAnyIdentifier, // Works with forager_id, ACoA, or public_identifier
} from "eggi-ai-db-schema-2";
LinkedIn Account-Based Relationship Operations
import {
upsertLinkedInAccountRelationshipScore,
getLinkedInAccountRelationships,
getTopLinkedInAccountConnections,
} from "eggi-ai-db-schema-2";
// Create relationship score between LinkedIn accounts
await upsertLinkedInAccountRelationshipScore(db, {
linkedinAccountIdA: 123,
linkedinAccountIdB: 456,
score: 85, // 0-100 relationship strength
modelVersion: "v1.2.0",
analysisType: "MESSAGE_ANALYSER",
mappingJobId: 789,
metadata: {
sources: ["linkedin"],
analysisDate: new Date().toISOString(),
conversationCount: 15,
messageCount: 89,
},
});
// Get all relationships for a LinkedIn account
const relationships = await getLinkedInAccountRelationships(db, 123);
// Get top connections for LinkedIn analysis
const topConnections = await getTopLinkedInAccountConnections(db, 123, 10);
For Development
# Clone the repository for development
git clone https://github.com/Eggi-AI-Inc/db-schema.git
cd db-schema
# Set up environment (Direct RDS via AWS Secrets Manager)
cp env.example .env
# Edit .env with your AWS credentials and RDS secret ARN
# Check and apply migrations with the unified script
./scripts/migrate.sh dev status
./scripts/migrate.sh dev apply
📦 Usage Examples
Type-Safe Database Operations
import { getDb, query, queryOne } from "eggi-ai-db-schema-2";
// Get fully-typed database client
const db = await getDb();
// Execute parameterized queries with type safety
const users = await query<User>(
db,
`
SELECT id, given_name, family_name, created_at
FROM public.users
WHERE given_name ILIKE $1
LIMIT $2
`,
[`%John%`, 10]
);
// Query single record
const user = await queryOne<User>(
db,
`
SELECT id, given_name, family_name
FROM public.users
WHERE id = $1
`,
[123]
);
// Insert with RETURNING
const newUser = await queryOne<User>(
db,
`
INSERT INTO public.users (given_name, family_name, created_at)
VALUES ($1, $2, NOW())
RETURNING id, given_name, family_name, created_at
`,
["Jane", "Doe"]
);
// Complex joins
const userProfiles = await query(
db,
`
SELECT
u.id as user_id,
u.given_name,
u.family_name,
la.forager_id,
la.public_identifier,
la.headline
FROM public.users u
LEFT JOIN linkedin.accounts la ON u.id = la.user_id
WHERE u.id = ANY($1::int[])
`,
[[123, 456, 789]]
);
🗄️ Schema Overview
| Table | Purpose | Key Fields |
| ------------------------------------------- | ----------------------------------- | ----------------------------------------------------------------------------------------------------------- |
| public.users | Core user accounts | id, given_name, family_name, created_at |
| public.authenticated_users | Authentication records | user_id, cognito_user_id, signup_email |
| linkedin.accounts | LinkedIn platform identifiers | forager_id (nullable), public_identifier, linkedin_identifier_acoa |
| linkedin.work_experience | Professional experience | linkedin_account_id, company_name, title |
| linkedin.education | Educational background | linkedin_account_id, school_name, degree |
| public.contact_infos | User contact information | user_id, type, value, source |
| linkedin.relationships | LinkedIn account relationships | linkedin_account_id_a, linkedin_account_id_b, score |
| public.organizations | Workspaces and teams | id, name, workspace_type, subscription_tier, created_by_user_id |
| monitoring.mapping_jobs | Relationship analysis jobs | id, linkedin_account_id, completed_at |
| monitoring.mapping_job_quotas | Quota tracking (v12.12.0+) | id, authenticated_user_id, organization_id, quota_period, quota_limit (nullable), current_usage |
| monitoring.organization_assignment_jobs | Org assignment tracking (v12.13.0+) | id, mapping_job_id, organization_id, completed_at |
| public.skills | Normalized skills library | id, name, normalized_name |
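The package exports TypeScript interfaces for these entities. As a rough sketch of the shapes to expect (field names taken from the table above; the actual exported interfaces may include additional columns):
// Illustrative shapes only; the real exported interfaces may differ.
interface User {
  id: number;
  given_name: string;
  family_name: string;
  created_at: Date;
}

interface MappingJob {
  id: number;
  linkedin_account_id: number;
  completed_at: Date | null;
}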
Supported Platforms
- LinkedIn - Professional networking
- Email/Contact - Communication channels
📚 Documentation
- 🛠️ Local Development - IP whitelisting and local development workflow
- 🗄️ Schema Guide - Database schema documentation and examples
- 🔄 Migration Guide - Database migration workflows
- 📊 API Reference - TypeScript API documentation
🚨 Common Issues & Quick Fixes
AWS RDS SSL Connection Error
Error:
PostgresError: no pg_hba.conf entry for host <IP>, user <USER>, database <DB>, no encryption
Quick Fix:
# SSL issues are now handled automatically by the migration system
./scripts/migrate.sh dev status
📖 Full Solution: SSL Fix Documentation | Stack Overflow Reference
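If the same error appears when your own application connects directly, the usual remedy with the pg library is to enable SSL on the pool. A minimal sketch (connection string and certificate handling are placeholders; in production, prefer supplying the RDS CA bundle over disabling verification):
import { Pool } from "pg";

// Require TLS on the connection; rejectUnauthorized: false is a placeholder only.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: { rejectUnauthorized: false },
});

const { rows } = await pool.query("SELECT 1 AS ok");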
🔧 Development Workflow
Prerequisites
- IP Whitelisting: Your IP address must be whitelisted for database access
- AWS Credentials: Configure in the .env file
- Node.js 20+: Required for TypeScript and the native pg library
Quick Setup
# 1. Setup environment
cp env.example .env
# Edit .env with AWS credentials and DATABASE_SECRET_ARN
# 2. Install dependencies
npm install
# 3. Setup database connection
./scripts/migrate.sh dev init
# 4. Check migration status
./scripts/migrate.sh dev status
# 5. Apply migrations if needed
./scripts/migrate.sh dev apply
🚨 Breaking Changes in v12.0.0
Native PostgreSQL Migration
MAJOR CHANGE: v12.0.0 removes Drizzle ORM entirely in favor of native PostgreSQL queries:
What Changed:
- ❌ Removed: All Drizzle ORM dependencies (drizzle-orm, postgres)
- ❌ Removed: Drizzle query builder syntax (.select(), .insert(), etc.)
- ✅ Added: Native pg library with connection pooling
- ✅ Added: Type-safe query helpers (query, queryOne, execute)
- ✅ Added: TypeScript interfaces for all database entities
Migration Impact:
- All queries must be rewritten using native SQL
- Type definitions remain unchanged (still strongly typed)
- Connection management improved with native pooling
- Performance improved by removing ORM overhead
Required Code Updates:
// OLD (v11.x and earlier with Drizzle)
import { getDb, eq, users } from "eggi-ai-db-schema-2";
const db = await getDb();
const user = await db.query.users.findFirst({
where: eq(users.id, 123),
});
// NEW (v12.0.0+ with native SQL)
import { getDb, queryOne } from "eggi-ai-db-schema-2";
import { User } from "eggi-ai-db-schema-2";
const db = await getDb();
const user = await queryOne<User>(
db,
`SELECT id, given_name, family_name, created_at FROM public.users WHERE id = $1`,
[123]
);
📖 Migration Guide: See CHANGELOG.md for complete migration instructions.
🎯 Key Features
- 🔒 AWS Secrets Manager Integration: Automatic credential fetching
- 🌐 IP-based Security: Secure database access via IP whitelisting
- 📦 NPM Package: Reusable across multiple services
- 🔄 TypeScript Support: Full type safety with native PostgreSQL
- 🗄️ Schema Validation: Custom migration system with hash validation
- 🚀 LinkedIn Account-Based Relationships: LinkedIn-scoped relationship analysis
- ⚡ Object-Oriented Database Service: Auto-initializing singleton service
- 🧠 Smart Environment Detection: Automatic optimal configuration (Lambda: 1 connection, Local: 3, ECS: 5); see the sketch after this list
- ⚡️ Native Performance: Direct SQL queries without ORM overhead
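For context, detection along these lines can be approximated with standard AWS runtime variables. A minimal sketch of the idea (the package's actual detection logic may differ; AWS_LAMBDA_FUNCTION_NAME and ECS_CONTAINER_METADATA_URI_V4 are standard AWS environment variables):
// Rough illustration of smart environment detection; not the package's exact logic.
function detectPoolSize(): number {
  if (process.env.AWS_LAMBDA_FUNCTION_NAME) return 1; // Lambda: one connection per container
  if (process.env.ECS_CONTAINER_METADATA_URI_V4) return 5; // ECS service
  return 3; // local development default
}

const poolConfig = { max: detectPoolSize() };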
📊 Available Scripts
| Script | Purpose |
| ---------------------------------- | --------------------------------------- |
| npm run build | Build TypeScript to dist/ |
| npm run dev | Run example code with tsx |
| npm run format | Format code with Prettier |
| npm run type-check | Run TypeScript type checking |
| npm run clean | Remove dist/ directory |
| npm run pack:test | Test npm package contents |
| Migration Scripts | |
| ./scripts/migrate.sh dev status | Check dev migration status |
| ./scripts/migrate.sh dev apply | Apply dev migrations |
| ./scripts/migrate.sh dev init | Initialize dev migration system |
| ./scripts/migrate.sh prod status | Check prod migration status |
| ./scripts/migrate.sh prod apply | Apply prod migrations |
| ./scripts/migrate.sh prod init | Initialize prod migration system |
| npm run migrate:dev:* | Dev environment migration commands |
| npm run migrate:prod:* | Prod environment migration commands |
| Version Management | |
| npm run update-version | Interactive version updater script |
| npm run version:patch | Increment patch version (1.0.0 → 1.0.1) |
| npm run version:minor | Increment minor version (1.0.0 → 1.1.0) |
| npm run version:major | Increment major version (1.0.0 → 2.0.0) |
🚀 CI/CD Pipeline & NPM Publishing
This package uses GitHub Actions for automated testing and publishing to npm registry with an npm version-based workflow.
📦 Publishing Workflow
The publishing workflow is triggered by:
- Push to main branch with package.json version changes
- Manual workflow dispatch via GitHub Actions UI
🏷️ Publishing a New Version
Method 1: Interactive Script (Recommended)
# Use the interactive version updater
./scripts/update-version.sh
# Follow prompts to enter new version (e.g., 12.0.0)
# Script updates package.json automatically
# Commit and push to main branch
git add package.json
git commit -m "bump version to v12.0.0"
git push origin main
Method 2: Manual npm Commands
# Increment version using npm
npm version patch # 1.0.0 → 1.0.1 (bug fixes)
npm version minor # 1.0.0 → 1.1.0 (new features)
npm version major # 1.0.0 → 2.0.0 (breaking changes)
# Commit and push to main branch
git add package.json package-lock.json
git commit -m "bump version to v$(node -p 'require("./package.json").version')"
git push origin main
🔄 CI/CD Workflow Steps
The automated pipeline performs the following:
Test Job
- ✅ Checkout code
- ✅ Setup Node.js 20 with npm cache
- ✅ Install dependencies (npm ci)
- ✅ Type checking and build (npm run build)
- ✅ Format checking (npm run format --check)
Publish Job (runs after tests pass, only when version changes)
- ✅ Version change detection (compares current vs previous package.json)
- ✅ Build package (npm run build)
- ✅ Check if version exists on npm (prevents duplicates)
- ✅ Publish to npm registry (npm publish --access public)
- ✅ Create GitHub release with git tag (e.g., v12.0.0)
- ✅ Generate release notes with package links
📋 Prerequisites
Before publishing, ensure you have:
- ✅ NPM_ACCESS_TOKEN secret configured in GitHub repository settings
- ✅ Package version updated in package.json (if not using npm version commands)
- ✅ Updated CHANGELOG.md with new features/fixes
- ✅ All tests passing locally
📊 Monitoring
Monitor publishing progress at:
- GitHub Actions: https://github.com/Eggi-AI-Inc/eggi.ai-db-schema/actions
- NPM Package: https://www.npmjs.com/package/eggi-ai-db-schema-2
🔧 Package Configuration
The package is configured for:
- Public access on npm registry
- ESM/CommonJS compatibility via package.json exports
- TypeScript definitions included
- Tree-shaking friendly exports
- Minimal bundle size via optimized .npmignore
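In practice this means the package can be consumed from either module system. A quick illustration using the documented getDb export (build and tsconfig details are up to the consumer):
// ESM / TypeScript consumers
import { getDb } from "eggi-ai-db-schema-2";

// CommonJS consumers (in a .cjs module):
// const { getDb } = require("eggi-ai-db-schema-2");

const db = await getDb();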
📦 Package Optimization
To keep the package lightweight and focused:
- ✅ Migration files excluded - migrations/ directory not included in the npm package
- ✅ Development files excluded - Source TypeScript, config files, and docs excluded
- ✅ Only essential files - Compiled JavaScript, TypeScript definitions, and README
- ✅ No internal tooling - Build scripts and development dependencies excluded
Result: Optimized package size with full TypeScript support and native PostgreSQL functionality
📦 Installation in Projects
Once published, install in your projects:
# Install the package
npm install eggi-ai-db-schema-2
# Use in Lambda functions
import { getDb, queryOne } from "eggi-ai-db-schema-2";
import { User } from "eggi-ai-db-schema-2";
🎯 Version Strategy
- Patch (1.0.x): Bug fixes, security updates
- Minor (1.x.0): New features, backward compatible
- Major (x.0.0): Breaking changes, API changes
🔐 Direct Database Access
This project uses public RDS access with security controls:
- IP Whitelisting: Your IP must be added to infrastructure allowlist
- SSL Encryption: All connections require SSL/TLS
- Direct Connection: Use any SQL client (pgAdmin, DBeaver, etc.)
- CI/CD Ready: GitHub Actions work automatically
📈 Architecture
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Application │ │ @eggi-ai/ │ │ PostgreSQL │
│ Services │───▶│ db-schema │───▶│ Database │
│ │ │ (Native pg) │ │ (IP Whitelisted) │
└─────────────────┘ └─────────────────┘ └─────────────────┘
│
▼
┌─────────────────┐
│ AWS Secrets │
│ Manager │
└─────────────────┘
🆘 Need Help?
- Setup Issues: Check Local Development Guide
- Schema Questions: Review Schema Documentation
- Migration Issues: See Migration Guide
- API Usage: Check API Reference
🎯 Result: Production-ready database schema with secure IP-whitelisted access, AWS integration, native PostgreSQL queries, and comprehensive TypeScript support.
