
db-migrate-cdk · v0.0.8 · 246 downloads

L3 CDK construct to run golang migrate database migrations

DB Migrate CDK

Apply schema evolutions/database migrations using the golang migrate CLI tool in your CDK stack. The key pattern this enables is initialising (and updating) the schema after the database is provisioned, but before your application is deployed and tries to connect.

NOTE: migrate supports a number of database types, but this construct currently supports only Amazon RDS instances (MySQL or PostgreSQL) and Amazon RDS Aurora clusters (MySQL or PostgreSQL).

Construct Design

The construct copies the SQL scripts in your local project to an S3 bucket, then uses an AWS custom resource to trigger a Lambda function that fetches the scripts from S3 and runs them against the target database. The Lambda function is packaged as a Docker image with the 'migrate' CLI bundled in.

[Diagram: db-migrate-cdk design]
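The custom-resource wiring described above can be sketched roughly as follows. This is an illustrative sketch of the general pattern, not the construct's actual internals; `migrateFn` (the Docker-image Lambda) and the payload shape are assumptions:

```typescript
import { custom_resources as cr } from 'aws-cdk-lib'

// Illustrative sketch only: invoke the migration Lambda on every deploy.
// 'migrateFn' and 'scriptsBucket' are assumed names, not the real internals.
const invokeMigrate = new cr.AwsCustomResource(this, 'RunMigrations', {
  onUpdate: {
    // A fresh physicalResourceId forces a re-invoke on each 'cdk deploy',
    // so newly added scripts get applied.
    service: 'Lambda',
    action: 'invoke',
    parameters: {
      FunctionName: migrateFn.functionName,
      Payload: JSON.stringify({ bucket: scriptsBucket.bucketName }),
    },
    physicalResourceId: cr.PhysicalResourceId.of(Date.now().toString()),
  },
  policy: cr.AwsCustomResourcePolicy.fromSdkCalls({
    resources: [migrateFn.functionArn],
  }),
})
```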

Cool things you can do with this construct

As part of the CDK stack that provisions your database:

  • Create an initial database schema
  • Continually update the database schema on each 'cdk deploy'
  • Create new database users, roles and manage permissions
  • Populate tables with static/ref data
  • Populate tables with test data
  • Clean and repopulate table data to a known state on each 'cdk deploy'
  • Drop the entire schema and recreate on each 'cdk deploy'

Important Context

To have success with this construct you need to understand how database schema evolution/migration tools like migrate, Flyway and Liquibase work. In short, they apply an ordered sequence of SQL scripts (migrations/evolutions) to a target database. The tool maintains a table in the target database to keep track of the highest script applied so far. Each time it runs, it checks whether there is a 'higher' ordered script it can apply, and updates its watermark accordingly.
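The watermark logic above can be sketched in a few lines (a minimal illustration of the idea, not migrate's actual implementation):

```typescript
// Each migration script has an ordering version and a name,
// e.g. '1_initialize_schema.up.sql' => { version: 1, name: 'initialize_schema' }.
type Migration = { version: number; name: string }

// Given all known scripts and the recorded watermark (highest version
// already applied), return the scripts still to apply, in order.
function pendingMigrations(all: Migration[], watermark: number): Migration[] {
  return all
    .filter((m) => m.version > watermark)
    .sort((a, b) => a.version - b.version)
}

// Example: versions 1 and 2 already applied, so only version 3 is pending.
const scripts: Migration[] = [
  { version: 1, name: 'initialize_schema' },
  { version: 3, name: 'add_orders_table' },
  { version: 2, name: 'add_customers_index' },
]
const toApply = pendingMigrations(scripts, 2)
// toApply: [{ version: 3, name: 'add_orders_table' }]
```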

For more details, check out:

Usage

Step 1

Create your ordered sequence of migration scripts in a folder in your project - see the migrations guide. e.g. in '<project_base>/migrations' you might create the file '1_initialize_schema.up.sql' containing:

CREATE TABLE CUSTOMERS(
   ID       INT              NOT NULL,
   NAME     VARCHAR (20)     NOT NULL,
   AGE      INT              NOT NULL,
   ADDRESS  CHAR (25),
   PRIMARY KEY (ID)
);
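migrate also supports an optional matching down migration that reverses each change. For the script above, that would be a file named '1_initialize_schema.down.sql' (an illustrative example, not from the original README):

```sql
-- Reverses 1_initialize_schema.up.sql
DROP TABLE CUSTOMERS;
```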

Step 2

In your CDK stack, create a 'DbMigrate' construct to run the migration scripts created in step 1 against an RDS/Aurora database provisioned earlier in the stack.

  // Run our 'migrate' based schema setup on our mysql instance
  const mysqlRdsMigration = new DbMigrate(this, 'MysqlRdsMigrateTask', {
    vpc,
    targetDatabaseInstance: mysqlRdsInstance,
    targetDatabaseName: dbName,
    migrationsFolder: './migrations',
  })

A complete example stack:

import {
  Stack,
  StackProps,
  CfnOutput,
  Token,
  aws_rds as rds,
  aws_ec2 as ec2,
} from 'aws-cdk-lib'
import { Construct } from 'constructs'
import { DbMigrate, DbMigrateCommand } from 'db-migrate-cdk'

/**
 * Provisions a test database and executes some test migrations using
 * the DbMigrate construct.
 */
export class DbMigrateTestStack extends Stack {
  constructor(scope: Construct, id: string, props: StackProps) {
    super(scope, id, props)

    const vpc = new ec2.Vpc(this, 'MigrateVPC')

    // Initial database to create in our database instance
    const dbName = 'init'

    // Mysql RDS instance
    const mysqlRdsInstance = new rds.DatabaseInstance(this, 'MysqlRdsMigrateInstance', {
      vpc,
      allocatedStorage: 20,
      engine: rds.DatabaseInstanceEngine.mysql({
        version: rds.MysqlEngineVersion.VER_8_0,
      }),
      databaseName: dbName,
      instanceType: ec2.InstanceType.of(ec2.InstanceClass.T4G, ec2.InstanceSize.MEDIUM),
    })

    // Run our 'migrate' based schema setup on our mysql instance
    const mysqlRdsMigration = new DbMigrate(this, 'MysqlRdsMigrateTask', {
      vpc,
      targetDatabaseInstance: mysqlRdsInstance,
      targetDatabaseName: dbName,
      migrationsFolder: `./migrations/${dbName}`,
      migrateCommand: DbMigrateCommand.UP,
    })

    new CfnOutput(this, 'MysqlRdsMigrateTaskResponse', {
      value: Token.asString(mysqlRdsMigration.response),
    })
  }
}
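CloudFormation only waits for dependencies you declare, so if the same stack also deploys your application, you may want an explicit dependency to ensure the schema exists before the app boots. A hedged sketch (`appService` is a hypothetical construct representing your application, not part of the original example):

```typescript
// Ensure the application only deploys after the migrations have run.
// 'appService' is a hypothetical construct standing in for your app.
appService.node.addDependency(mysqlRdsMigration)
```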

Limitations / Other Options

This construct runs 'migrate' via an AWS Lambda function as part of an AWS custom resource in AWS CloudFormation. Specifically, the Lambda function is packaged as a Docker container with the 'migrate' CLI bundled into the image. This means your specified migrate command needs to run within the constraints of the Lambda runtime; in particular (as of 11/2022), it needs to complete within Lambda's 15 minute runtime limit and operate within the compute/memory caps of the Lambda runtime.

These constraints should be fine for a large number of use cases. If they are not, you might want to consider:

  • Running 'migrate' as a stage in a CI/CD pipeline on a container/VM backed build agent
  • Running 'migrate' as part of the boot sequence of your application, on the compute that runs it
  • Running 'migrate' manually against the target database for exceptional cases (e.g. complex DML changes on large volumes of data that might take over 15 minutes to run)
  • Some hybrid of the options above

Credits

The following blog posts were helpful in pulling this construct together:

License

This project is licensed under the Apache-2.0 License.