
@runnerty/executor-snowflake

v1.0.1

Published

Runnerty module: Snowflake executor

Downloads

17

Readme


Snowflake executor for Runnerty.

Installation:

Through NPM

npm i @runnerty/executor-snowflake

You can also add modules to your project with runnerty-cli:

npx runnerty-cli add @runnerty/executor-snowflake

This command installs the module in your project, adds an example configuration to your config.json, and creates a sample usage plan.

If you have installed runnerty-cli globally, you can add the module with this command:

rty add @runnerty/executor-snowflake

Configuration:

Add in config.json:

OAuth Authentication (Recommended):

This executor uses OAuth token authentication by default. You need to configure your OAuth token provider and set the following environment variables:

export SNOWFLAKE_ACCOUNT="myaccount.us-east-1"
export SNOWFLAKE_USERNAME="[email protected]"
export SNOWFLAKE_PASSWORD="mypassword"

Then reference the variables in the executor configuration:

{
  "id": "snowflake_default",
  "type": "@runnerty-executor-snowflake",
  "account": "@ENV(SNOWFLAKE_ACCOUNT)",
  "username": "@ENV(SNOWFLAKE_USERNAME)",
  "database": "@ENV(SNOWFLAKE_DATABASE)",
  "schema": "@ENV(SNOWFLAKE_SCHEMA)",
  "warehouse": "@ENV(SNOWFLAKE_WAREHOUSE)",
  "role": "@ENV(SNOWFLAKE_ROLE)"
}

Direct Configuration:

{
  "id": "snowflake_default",
  "type": "@runnerty-executor-snowflake",
  "account": "myaccount.us-east-1",
  "username": "[email protected]",
  "database": "MYDATABASE",
  "schema": "PUBLIC",
  "warehouse": "COMPUTE_WH",
  "role": "MYROLE"
}

Configuration params:

| Parameter   | Description                                                     |
| :---------- | :-------------------------------------------------------------- |
| account     | Snowflake account identifier (e.g., "myaccount.us-east-1")      |
| username    | The Snowflake user to authenticate as.                          |
| database    | Name of the database to use for this connection. (Optional)     |
| schema      | Name of the schema to use for this connection. (Optional)       |
| warehouse   | Name of the warehouse to use for this connection. (Optional)    |
| role        | Name of the role to use for this connection. (Optional)         |
| timeout     | Connection timeout in milliseconds. (Default: 60000)            |
| application | Application name for connection tracking. (Default: "runnerty") |
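The optional timeout and application parameters from the table above can sit alongside the connection fields. A minimal sketch (the values shown are illustrative, not defaults you must change):

```json
{
  "id": "snowflake_default",
  "type": "@runnerty-executor-snowflake",
  "account": "@ENV(SNOWFLAKE_ACCOUNT)",
  "username": "@ENV(SNOWFLAKE_USERNAME)",
  "warehouse": "COMPUTE_WH",
  "timeout": 120000,
  "application": "runnerty"
}
```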

Plan samples:

Add in plan.json:

Basic query:

{
  "id": "snowflake_default",
  "command_file": "./sql/my_query.sql"
}

Or with an inline command:

{
  "id": "snowflake_default",
  "command": "SELECT CURRENT_TIMESTAMP() as now, CURRENT_USER() as user"
}

Query with parameters:

{
  "id": "snowflake_default",
  "command": "SELECT * FROM users WHERE created_date >= :start_date AND status = :status",
  "args": {
    "start_date": "2023-01-01",
    "status": "active"
  }
}

Export to files:

{
  "id": "snowflake_default",
  "command": "SELECT * FROM sales_data WHERE year = 2023",
  "xlsxFileExport": "./reports/sales_2023.xlsx",
  "xlsxSheetName": "Sales Report",
  "xlsxAuthorName": "Data Team"
}

Generation of files:

The results can be exported to CSV, XLSX, and JSON files. These files are generated using streams for efficient handling of large datasets. You only have to set the corresponding property in the parameters:

XLSX

XLSX Format with streaming support for large datasets.

| Parameter      | Description                   |
| :------------- | :---------------------------- |
| xlsxFileExport | Path of xlsx file export.     |
| xlsxAuthorName | Author file name. (Optional)  |
| xlsxSheetName  | Name of the sheet. (Optional) |

Sample:

{
  "id": "snowflake_default",
  "command": "SELECT * FROM users",
  "xlsxFileExport": "./exports/users.xlsx",
  "xlsxAuthorName": "Runnerty",
  "xlsxSheetName": "Users Data"
}

CSV

CSV Format with streaming support and customizable options.

| Parameter               | Description                                                           |
| :---------------------- | :-------------------------------------------------------------------- |
| csvFileExport           | Path of csv file export.                                              |
| csvOptions/headers      | Type: boolean. Whether to include headers in the CSV. (Default: true) |
| csvOptions/delimiter    | Alternate delimiter. (Default: ',')                                   |
| csvOptions/quote        | Alternate quote. (Default: '"')                                       |
| csvOptions/rowDelimiter | Alternate row delimiter (e.g. \r\n). (Default: '\n')                  |
| csvOptions/escape       | Alternate escaping value. (Default: '"')                              |

Sample:

{
  "id": "snowflake_default",
  "command": "SELECT * FROM users",
  "csvFileExport": "@GV(WORK_DIR)/users.csv",
  "csvOptions": {
    "delimiter": ";",
    "quote": "\"",
    "headers": true
  }
}

JSON

JSON Format with streaming support.

| Parameter      | Description               |
| :------------- | :------------------------ |
| jsonFileExport | Path of json file export. |

Sample:

{
  "id": "snowflake_default",
  "command": "SELECT * FROM users",
  "jsonFileExport": "@GV(WORK_DIR)/users.json"
}

Output (Process values):

Standard

  • PROCESS_EXEC_MSG_OUTPUT: Snowflake output message.
  • PROCESS_EXEC_ERR_OUTPUT: Error output message.
  • PROCESS_EXEC_COMMAND_EXECUTED: Executed SQL command.

Query output

  • PROCESS_EXEC_DATA_OUTPUT: Snowflake query output data (array of objects).
  • PROCESS_EXEC_DB_COUNTROWS: Snowflake query count rows.
  • PROCESS_EXEC_DB_FIRSTROW: Snowflake query first row data (object).
  • PROCESS_EXEC_DB_FIRSTROW_[FIELD_NAME]: Snowflake first row field data.

Example of first row field access:

{
  "id": "snowflake_default",
  "command": "SELECT user_id, email, created_date FROM users LIMIT 1"
}

Available variables after execution:

  • PROCESS_EXEC_DB_FIRSTROW_USER_ID: First row user_id value
  • PROCESS_EXEC_DB_FIRSTROW_EMAIL: First row email value
  • PROCESS_EXEC_DB_FIRSTROW_CREATED_DATE: First row created_date value
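A later process in the chain could then consume one of these values. This sketch assumes that Runnerty's @GV interpolation (used elsewhere in this README for WORK_DIR) also resolves process output values, and the audit_log table is hypothetical:

```json
{
  "id": "snowflake_default",
  "command": "INSERT INTO audit_log (user_id) VALUES ('@GV(PROCESS_EXEC_DB_FIRSTROW_USER_ID)')"
}
```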

Authentication Setup:

This executor requires OAuth authentication setup. Make sure you have:

  1. Environment variables configured:

    export SNOWFLAKE_ACCOUNT="your-account"
    export SNOWFLAKE_USERNAME="your-username"
    export SNOWFLAKE_PASSWORD="your-password"
  2. OAuth token provider configured (internal API endpoint)

  3. Required dependencies installed:

    npm install snowflake-sdk axios exceljs fast-csv jsonstream

Features:

  • OAuth Authentication - Secure token-based authentication
  • Streaming Support - Efficient processing of large datasets
  • Multiple Export Formats - XLSX, CSV, JSON
  • Parameterized Queries - Support for :parameter placeholders
  • Environment Variables - Secure configuration management
  • Error Handling - Comprehensive error reporting
  • Connection Pooling - Optimized connection management