
@cdklabs/cdk-construct-connect-datalake

v0.0.0

Construct library for Amazon Connect Data Lake

Amazon Connect Data Lake CDK Construct

An AWS Cloud Development Kit (CDK) construct that enables access to the Amazon Connect analytics data lake. The construct automates the complete Connect Data Lake setup process, eliminating the need for manual configuration or custom CloudFormation templates.

The construct uses a Lambda-backed custom resource to manage the deployment process. It handles associating Connect datasets, accepting RAM resource shares, granting Lake Formation permissions, and creating resource link tables in a centralized Glue database—with support for same-account and cross-account configurations.

Usage

Prerequisites

Installation

Install the construct library in your CDK project directory.

TypeScript / JavaScript (npm):

npm install @cdklabs/cdk-construct-connect-datalake

Python (pip):

pip install cdklabs.cdk-construct-connect-datalake

Java (Maven): add the following dependency to your pom.xml:

<dependency>
  <groupId>io.github.cdklabs</groupId>
  <artifactId>cdk-construct-connect-datalake</artifactId>
  <version>VERSION</version>
</dependency>

.NET:

dotnet add package Cdklabs.CdkConstructConnectDatalake

Go:

go get github.com/cdklabs/cdk-construct-connect-datalake-go/cdkconstructconnectdatalake

Basic Usage

Add the DataLakeAccess construct to a CDK stack deployed in the same AWS account and region as your Amazon Connect instance.

import { DataLakeAccess, DataType } from '@cdklabs/cdk-construct-connect-datalake';

new DataLakeAccess(this, 'DataLakeAccess', {
  instanceId: 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx', // Your Connect instance ID
  datasetIds: [
    DataType.CONTACT_RECORD,
    'contact_statistic_record',
  ],
});

Important: When deploying alongside a Connect instance in the same stack, add a dependency to the construct:

import { DataLakeAccess, DataType } from '@cdklabs/cdk-construct-connect-datalake';
import { CfnInstance } from 'aws-cdk-lib/aws-connect';

const connectInstance = new CfnInstance(this, 'ConnectInstance', {
  identityManagementType: 'CONNECT_MANAGED',
  instanceAlias: 'my-instance',
  attributes: {
    inboundCalls: true,
    outboundCalls: true,
  },
});

const dataLake = new DataLakeAccess(this, 'DataLakeAccess', {
  instanceId: connectInstance.attrId,
  datasetIds: [DataType.CONTACT_RECORD],
});

// Ensure data lake resources are deleted before the Connect instance
dataLake.node.addDependency(connectInstance);

Cross-Account Configuration

Configure the construct to create data lake resources in a different AWS account by specifying targetAccountId and targetAccountRoleArn. The construct assumes the target role to accept the RAM resource share(s) and create Glue resources in that account.

import { DataLakeAccess, DataType } from '@cdklabs/cdk-construct-connect-datalake';

new DataLakeAccess(this, 'DataLakeAccess', {
  instanceId: 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx',
  datasetIds: [
    DataType.CONTACT_RECORD,
    'contact_statistic_record',
  ],

  // Target account where the resources are created
  targetAccountId: '123456789012',

  // IAM role in the target account for cross-account role assumption
  targetAccountRoleArn: 'arn:aws:iam::123456789012:role/RoleName',
});
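The targetAccountRoleArn follows the standard IAM role ARN format. As a minimal illustration (the helper and its names are hypothetical, not part of the construct's API):

```typescript
// Hypothetical helper that assembles an IAM role ARN from its parts.
// The account ID and role name below are placeholders.
function targetRoleArn(accountId: string, roleName: string): string {
  return `arn:aws:iam::${accountId}:role/${roleName}`;
}

const arn = targetRoleArn('123456789012', 'RoleName');
// arn === 'arn:aws:iam::123456789012:role/RoleName'
```

The role itself must exist in the target account and trust the account that deploys the construct, so the Lambda handler can assume it.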

Multiple Instances

Enable data lake access for multiple Connect instances by creating a separate construct for each. A dependency should be added between them to ensure sequential deployment, preventing conflicts from concurrent operations.

import { DataLakeAccess, DataType } from '@cdklabs/cdk-construct-connect-datalake';

// First Connect instance data lake setup
const dataLake1 = new DataLakeAccess(this, 'DataLakeAccess1', {
  instanceId: 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx',
  datasetIds: [
    DataType.CONTACT_RECORD,
    DataType.AGENT_STATISTIC_RECORD,
  ],
});

// Second Connect instance data lake setup 
const dataLake2 = new DataLakeAccess(this, 'DataLakeAccess2', {
  instanceId: 'yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy',
  datasetIds: [
    DataType.CONTACT_RECORD,
    DataType.CONTACT_FLOW_EVENTS,
  ],
});

// Create dependency to ensure sequential deployment
dataLake2.node.addDependency(dataLake1);

API Reference

DataLakeAccess

The main construct class for setting up Amazon Connect Data Lake integration.

Properties:

  • instanceId (string): Amazon Connect instance ID
  • datasetIds (Array<string | DataType>): Array of dataset IDs to associate. Use DataType enum values or string literals for datasets not yet in the enum.
  • targetAccountId? (string): Target AWS account ID receiving resources (optional)
  • targetAccountRoleArn? (string): IAM role ARN in the target account for cross-account role assumption (optional)

DataType Enum

For a list of supported dataset types, see the API Documentation.

Resources Created

This construct creates the following AWS resources:

Infrastructure Components

  • CloudFormation Custom Resource Provider: Framework for managing custom resource lifecycle

  • Lambda Function: Custom resource handler that orchestrates the data lake setup

  • IAM Role: Execution role with permissions for Connect, RAM, Glue, and Lake Formation operations

    • connect:BatchAssociateAnalyticsDataSet
    • connect:AssociateAnalyticsDataSet
    • connect:BatchDisassociateAnalyticsDataSet
    • connect:DisassociateAnalyticsDataSet
    • connect:ListAnalyticsDataAssociations
    • connect:ListAnalyticsDataLakeDataSets
    • connect:ListInstances
    • ds:DescribeDirectories
    • ram:AcceptResourceShareInvitation
    • ram:GetResourceShareInvitations
    • ram:GetResourceShares
    • glue:CreateDatabase
    • glue:CreateTable
    • glue:DeleteDatabase
    • glue:DeleteTable
    • glue:GetDatabase
    • glue:GetTables
    • lakeformation:GetDataLakeSettings
    • lakeformation:PutDataLakeSettings
    • cloudformation:DescribeStacks
    • sts:AssumeRole (for cross-account setups only)

Deployment Workflow

The construct performs the following steps during deployment:


  1. Dataset Association: Associates the specified datasets for an Amazon Connect instance with the target account
  2. Database Creation: Creates the connect_datalake_database Glue database
  3. Lake Formation Setup: Configures the Lambda execution role (or assumed role for cross-account) as a data lake administrator
  4. Resource Share Acceptance: Accepts the RAM resource share invitation(s). Multiple dataset associations often consolidate into a single RAM resource share
  5. Table Creation: Creates resource link tables for each dataset, enabling queries via Amazon Athena

When deploying to the same account as the Connect instance, all steps execute within that account. For cross-account configurations, steps 2-5 execute in the target account.
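The account split described above can be sketched as a small illustrative plan (the types and function here are hypothetical, written only to mirror the documented behavior, and are not part of the construct's API):

```typescript
// Illustrative sketch of where each workflow step executes.
// Step 1 always runs in the account hosting the Connect instance;
// steps 2-5 run in the target account for cross-account setups.
interface WorkflowStep {
  name: string;
  account: 'connect' | 'target';
}

function planDeployment(crossAccount: boolean): WorkflowStep[] {
  const rest: 'connect' | 'target' = crossAccount ? 'target' : 'connect';
  return [
    { name: 'Dataset Association', account: 'connect' },
    { name: 'Database Creation', account: rest },
    { name: 'Lake Formation Setup', account: rest },
    { name: 'Resource Share Acceptance', account: rest },
    { name: 'Table Creation', account: rest },
  ];
}
```

For a same-account deployment, planDeployment(false) places every step in the Connect account; with crossAccount set to true, only the initial dataset association stays there.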

Limitations

  • Table Naming: Resource link tables created by this construct are named using the format {datasetId}_{dataCatalogId}
  • Region Support: The construct must be deployed in the same AWS region and account as the Amazon Connect instance. For cross-account configurations, resources are created in the target account within the same region
  • Shared Database: The connect_datalake_database Glue database is shared across all deployments of this construct in an account
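The table-naming convention above can be expressed as a one-line helper (hypothetical, not exported by the construct; the catalog ID below is a placeholder):

```typescript
// Mirrors the documented resource link table naming format:
// {datasetId}_{dataCatalogId}
function resourceLinkTableName(datasetId: string, dataCatalogId: string): string {
  return `${datasetId}_${dataCatalogId}`;
}

const name = resourceLinkTableName('contact_statistic_record', '123456789012');
// name === 'contact_statistic_record_123456789012'
```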

Troubleshooting

Partial failures during deployment

  • If some workflow steps fail during create or update operations, the stack deployment will still show as successful. Error details for these partial failures are available in the CloudFormation stack outputs.

RAM resource share has expired

  • Resource shares for new dataset associations can consolidate into existing AWS RAM shares, even if expired. Delete each construct that references the target account, confirm the associated resources are removed, then redeploy using the original construct definitions.

Failure to update Lake Formation permissions due to invalid principal

  • IAM roles that have been deleted but not removed from Lake Formation principals will be considered invalid. Remove the principal causing this error from Lake Formation and redeploy the construct.

Resources cannot be removed after a Connect instance has been deleted

  • Delete constructs of this type before deleting the Connect instance; cleanup after instance deletion is not currently supported. Raise a GitHub issue if you need assistance removing orphaned resources.

Support

For issues and questions:

Contributing

We welcome contributions! Please see our Contributing Guide for details.

License

This project is licensed under the Apache-2.0 License.