
typizator-handler

v2.1.1

Database facade and handler converting JSON events to strict types for AWS lambdas and similar applications

Runtime types and metadata schemas for Typescript

Purpose

Well-typed database facade and clean conversion of JSON parameters for AWS lambdas and similar applications

Installing

npm i typizator-handler

Documentation

This library provides AWS lambda handlers to implement API methods defined by typizator schemas. It is essentially a set of utilities for implementing connected AWS lambda functions created with the CDK utilities managed in the cdk-typescript-lib.

It also defines a Postgres database facade to make requests using the same runtime type schemas.

AWS Lambda handlers

Imagine you want to implement an API on the AWS backend that can later be called from the client (or from other backends).

You use typizator and define the API to serve. For example, like this:

const api = apiS({
    helloWorld: {
        args: [stringS.notNull], retVal: stringS.notNull
    }
})

typizator will translate it to:

{
    helloWorld: (arg0:string)=>string
}

Following the microservices logic, it is good to implement each function of the interface (here we have only one, helloWorld) as a separate lambda function. But we don't want the headache of converting arguments and return types; it should work out of the box. This is where this library helps. It lets you define a handler like this:

export const helloWorld = 
    // This is the function from this library
    lambdaConnector(
        // We take the endpoint schema from the API we defined earlier. It ensures type checks and conversions
        api.metadata.implementation.helloWorld,
        // This is the name of the implementation function. Typescript will only allow arguments and returned types defined by the endpoint schema
        helloWorldImpl
    )

The implementation can be whatever you want, but it has to match the signature defined by the schema (the first argument is not used if you don't have any connected resources):

const helloWorldImpl = async (_:HandlerProps, arg:string) : Promise<string> => {
    // Your implementation here
}

It becomes even more interesting if you want to connect a Postgres database (sitting on AWS RDS, for example) and use it from your lambda. You just have to replace your handler with:

export const helloWorld = 
    // This is the other function from this library
    lambdaConnector(
        // We take the endpoint schema from the API we defined earlier. It ensures type checks and conversions
        api.metadata.implementation.helloWorld,
        // This is the name of the implementation function. Typescript will only allow arguments and returned types defined by the endpoint schema
        helloWorldImpl,
        // This tells the connector that it needs to inject the active database connection to the handler
        { databaseConnected: true }
    )

That's it, your helloWorldImpl is connected to the database resource. You just have to slightly change its definition:

const helloWorldImpl = async (props:HandlerProps, arg:string) : Promise<string> => {
    // Your implementation here
}

When the function is called, you receive a facade over the pg library to talk to your database. Some pleasant features of that facade are detailed below.

But wait a second: a connection to what database? We haven't configured any access so far. This is simply done through environment variables in process.env that you can define when you configure your AWS lambda function:

  • DB_ENDPOINT_ADDRESS has to contain the full URI of your database
  • DB_NAME is the name of the database available at the endpoint defined by the previous variable
  • DB_SECRET_ARN is the ARN of the AWS secret where the database password is stored. We don't store passwords in clear text anywhere
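For example, a lambda's environment could look like this (the values are hypothetical, shown only to illustrate the expected shape of each variable):

```shell
# Hypothetical values — set these in your lambda's environment configuration
DB_ENDPOINT_ADDRESS=mydb.cluster-abc123.eu-west-1.rds.amazonaws.com
DB_NAME=mydatabase
DB_SECRET_ARN=arn:aws:secretsmanager:eu-west-1:123456789012:secret:db-password-AbCdEf
```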

All this is configured automatically if you use the cdk-typescript-lib library to integrate the whole story with the CDK. Why is it separate from this library? Simply because you don't want your lambdas to know anything about the details of their own deployment via CDK; it's not their concern. All they need are the type conversions, the resource connections and the corresponding handlers, and that is exactly what this library provides.

Custom error treatment

Note that the options object passed as the third parameter of any handler can include a function that you can use as an error logger. It will be called on every uncaught exception that occurs in your implementation code:

const errorHandler = async (error: any, props: HandlerProps, metadata: NamedMetadata) => {
    // Your implementation here
}

lambdaConnector(
    api.metadata.implementation.helloWorld,
    helloWorldImpl,
    {
        databaseConnected: false,
        // This function can be shared across your implementations
        errorHandler
    }
)

Note that if your handler is a connected one, this function receives the database connection information in props, so that you can, for example, record the error information in a table if you need to. The metadata parameter receives the name and the API path of the function that threw the error.

Firebase admin connector

If your application needs to send push notifications to your mobile apps with Firebase, you can request that the Firebase connection be injected into your handler. You just have to do this:

lambdaConnector(
    api.metadata.implementation.helloWorld,
    helloWorldImpl,
    {
        firebaseAdminConnected: true
    }
)

In that case, your handler receives the connector object in HandlerProps and you can call it in your lambda when you need to send push messages:

await props.firebaseAdmin?.sendMulticastNotification?.(
    "Message title", 
    "Message body", 
    // List of push tokens you receive from your client applications
    [TOKEN1, TOKEN2])

The function returns a standard BatchResponse that you can use as specified in the Firebase documentation.

Database connection helpers

As we allow our lambdas to connect to databases (Postgres only for now, but nothing prevents us from adding support for others in the near future), it would be good to communicate with that database without the headache caused by the fact that SQL and Typescript don't share the same type system, which can make getting a list of objects from an SQL query... how to say... unpredictable.

The database client connection is exposed through the DatabaseConnection interface that is passed to your lambda through the connected handler described above. Alternatively, you can create it directly from a connected pg client by calling the connectDatabase factory function from this library.

You can still access the original pg client through the interface's client property. There is also the query shortcut that executes a simple query on the database, returning the data in row mode. Refer to the pg library documentation if you forgot what that means.
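A sketch of both (assuming a connection obtained from a connected handler as described above; the table and query text are hypothetical, and the exact return shape of query should be checked against the library's typings):

```typescript
// The underlying pg client remains available directly:
const pgClient = connection.client

// The query shortcut runs raw SQL and returns the data in pg's row mode.
const result = await connection.query("SELECT test_id, test_name FROM test_table")
```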

Now, let's look at the interesting things. Imagine you have in the database a table named test_table containing two fields: test_id, a BIGINT, and test_name, a VARCHAR(255). This structure can be defined using typizator as:

const testTableS = objectS({
    testId: bigintS.notNull,
    testName: stringS.optional
}).notNull
type TestTable = InferTargetFromSchema<typeof testTableS>

TestTable will be automatically inferred as

{
    testId:bigint,
    testName:string | null | undefined
}

Notice that we follow the camelCase convention for field names; the library takes care of the conversion to and from the database's snake_case.
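The mapping at work here is the usual camelCase-to-snake_case conversion. As an illustration only (these helpers are hypothetical, not part of the library's public API), it amounts to something like:

```typescript
// Illustration only: minimal camelCase <-> snake_case helpers of the kind
// the library applies internally between schema fields and database columns.
const camelToSnake = (name: string): string =>
    name.replace(/[A-Z]/g, letter => `_${letter.toLowerCase()}`)

const snakeToCamel = (name: string): string =>
    name.replace(/_([a-z])/g, (_match, letter: string) => letter.toUpperCase())

console.log(camelToSnake("testId"))    // prints "test_id"
console.log(snakeToCamel("test_name")) // prints "testName"
```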

Now, if from our interface we call await connection.select(testTableS, "test_table"), the library looks into the testTableS schema and creates a query like this:

SELECT test_id, test_name FROM test_table

The call will return an array of TestTable, all types safely converted.

You can exclude some of the schema's fields from the query using the optional overrides parameter, which (for now) allows you to ignore one or more of the schema's fields. For example, you can modify the call above:

connection.select(testTableS, "test_table", [], { testName: { action: "OMIT" }})

...and like that, the test_name field is not included in the request.

A variation of this method is typedQuery. The only difference between them is that typedQuery doesn't create the SELECT statement on the fly; it requires the full SQL query as its second argument. The first argument is still the typizator schema definition, which we need to correctly type the rows returned by the query.

For typedQuery it is also possible to pass a primitive schema (like stringS) as the first argument. In that case we assume the query result has one column (any other columns are ignored) and the call returns an array of primitives of the schema's corresponding target type.
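A sketch of both variants, based on the description above (the queries are hypothetical and the exact signature should be checked against the library's typings):

```typescript
// Rows are typed from the object schema passed as the first argument,
// so longNames is inferred as an array of TestTable.
const longNames = await connection.typedQuery(
    testTableS,
    "SELECT test_id, test_name FROM test_table WHERE LENGTH(test_name) > 3"
)

// With a primitive schema, the single-column result is returned as an
// array of the corresponding primitive type (here, strings).
const names = await connection.typedQuery(
    stringS.notNull,
    "SELECT test_name FROM test_table"
)
```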

The multiInsert function lets you insert up to 1000 rows into the table in a single query. For example:

const idsAndNames = [
    { testId: 1n, testName: "One" },
    { testId: 2n, testName: "Two" }
]
connection.multiInsert(testTableS, "test_table", idsAndNames)

There is also a multiUpsert function that acts exactly like multiInsert but lets you define what happens if you try to insert a row that generates a key conflict. For example:

connection.multiInsert(
    testTableS, 
    "test_table", 
    [{ testId: 1n, testName: "One" }]
)
connection.multiUpsert(
    testTableS, 
    "test_table", 
    [{ testId: 1n, testName: "One modified" }],
    {
        upsertFields: ["testId"],
        onConflict: ActionOnConflict.REPLACE
    }
)

In this case, if testId is a unique key field, the second call will update the row by changing the value of testName to a modified value.

Instead of REPLACE, you can also use IGNORE, in which case the conflicting updates are simply ignored, or REPLACE_IF_NULL, which only updates the fields that are null before the upsert call.

Both multiInsert and multiUpsert accept action definitions similar to the "OMIT" action of the select function. In addition, you can set the action to "NOW" for date fields (the corresponding field is set to the current server timestamp) or to "COUNTER" for number fields; in the latter case you have to add, next to action, a sequenceName field naming the database sequence object that will be used to fill the corresponding field. If you want to replace the field value with the result of any other SQL function, use the "FUNCTION" action and put the function into the sql field.
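A sketch of what such overrides might look like (the auditedTableS schema, the audited_table name and the id_sequence sequence are hypothetical, and the exact overrides shape should be checked against the library's typings):

```typescript
// Hypothetical schema with a counter field and a date field
const auditedTableS = objectS({
    rowId: bigintS.notNull,
    testName: stringS.optional,
    createdAt: dateS.notNull
}).notNull

connection.multiInsert(
    auditedTableS,
    "audited_table",
    [{ rowId: 0n, testName: "One", createdAt: new Date() }],
    {
        // Filled from the "id_sequence" database sequence
        rowId: { action: "COUNTER", sequenceName: "id_sequence" },
        // Replaced by the current server timestamp
        createdAt: { action: "NOW" }
    }
)
```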

Security context

Handlers can be run in a security context driven by the environment parameters.

Setting the IP_LIST environment variable for your lambda to the JSON string representing a list of authorized IP addresses (for example, ["10.0.0.1"]) limits the access to the handler's implementation to those IP addresses only.

Setting the ACCESS_MASK lets you implement the access-checking function that you pass in the properties to your lambdaConnector. This function takes as arguments the handler's properties (first of all, for database access), the security token sent by the client, and the access rights context containing the number set in the ACCESS_MASK environment variable for the lambda. To give a simple example:

const authenticator = async (props:HandlerProps, securityToken: string, access: AccessRights) => {
    // The following call should be implemented by you to check the security token against the database
    // and return the numeric mask of access rights that match that token
    const maskToCheck = await getServerMask(props, securityToken)
    return (maskToCheck & access.mask) !== 0
}

lambdaConnector(
    api.metadata.implementation.helloWorld,
    helloWorldImpl,
    {
        databaseConnected: false,
        authenticator
    }
)

Tests

I recommend using the @testcontainers/postgresql library to set up database-connected tests in a real environment. To speed up test suite execution, use Jest's --runInBand option and set up your test suites similar to this:

export const setupTestConnection = (runFirst = async (_: DatabaseConnection) => { }) => {
    jest.setTimeout(60000);
    const setup = {
        connection: null as (DatabaseConnection | null)
    }

    beforeAll(async () => {
        const container = await new PostgreSqlContainer().withReuse().start()
        const client = new Client({ connectionString: container.getConnectionUri() })
        await client.connect()
        setup.connection = connectDatabase(client)
        await runFirst(setup.connection)
    })

    afterAll(async () => await setup.connection!.client.end())

    return setup
}