
service-dal

v1.45.0-alpha.0

This service aims to decouple the PlutoTV infrastructure from the pluto-main DB and provide flexibility by utilising the inf-lib-dal library.

About The Project

The Data Abstraction Layer abstracts the database drivers, connection, interface, queries, and responses from the pluto-main MongoDB for clients.

Getting Started

Prerequisites

These are the things needed for working within this repo:

  • docker - installing docker
  • nvm - installing nvm - optional, but convenient for switching Node versions
  • node and npm - installing
  • AWS credentials to read from the DevOps account (972525491246). This should be made available by being part of a team with tier-based permissions. Please contact your manager if you have questions.
  • ecr-role profile in your AWS config (~/.aws/config) specifying the role associated with your assigned team tier (above). Information can be found here.
    • Replace *Tier* with your assigned team tier (e.g., ArchitectureTier1).
    [profile ecr-role]
    role_arn = arn:aws:iam::157385605725:role/*Tier*
    region = us-east-1
    source_profile = default
  • Access to the Pluto TV Private GitHub npm packages.
    1. Create a GitHub personal access token with at least the scope read:packages.
    2. Create a ~/.npmrc file with two lines. Replace {TOKEN} with your generated personal access token.
    @pluto-tv:registry=https://npm.pkg.github.com/
    //npm.pkg.github.com/:_authToken={TOKEN}

Installation

nvm use

Uses the project-defined NodeJS version.

npm i

Install all dependencies for the repo.

cd ./nodejs-client && npm i

Installs dependencies for the @pluto-tv/service-dal-grpc-client. Then get back to the root: cd ..

gRPC Help

Status Response Codes

  • https://developers.google.com/maps-booking/reference/grpc-api/status_codes

RST_STREAM Error Codes

Periodically, we get RST_STREAM errors. Below is a list of the definitions of these errors.

  • https://httpwg.org/specs/rfc7540.html#ErrorCodes

Understanding Protobufs

  • Proto3 Language Guide: This guide describes how to use the protocol buffer language to structure your protocol buffer data, including .proto file syntax and how to generate data access classes from your .proto files.
  • Protobuf Style Guide: This document provides a style guide for .proto files.
    • Proto File Naming: All .proto files must be named in lower_snake_case.proto. This is the widely accepted standard.

      :warning: Note this is different from our nodejs convention.

    • Message Naming: Use CamelCase (with an initial capital) for message names – for example, SongServerRequest.
    • Field Naming: Use underscore_separated_names for field names (including oneof field and extension names) – for example, song_name.

      :warning: Note this is different from our nodejs convention.

    • Versioning and Message Definitions: The following rules must be followed when naming and defining the content of messages.
      • Do not change existing numbered tags for the fields in the messages. This will break the design considerations meant for backward and forward compatibility.
      • Do not remove a field right away if it is not being used anymore. Mark it deprecated ([deprecated = true]) and set a timeline to completely remove it, giving the integrated applications time to remove their dependency on that field.
      • Adding fields is always a safe option as long as you manage them and don’t end up with too many of them.
      • Be aware of the default values for the data types so that new code can work with messages generated by old code.
  • Data Types and Default Values: When a message is parsed, if the encoded message does not contain a particular singular element, the corresponding field in the parsed object is set to the default value for that field. These defaults are type-specific:
    • For strings, the default value is the empty string.
    • For bytes, the default value is empty bytes.
    • For bools, the default value is false.
    • For numeric types, the default value is zero.
    • For enums, the default value is the first defined enum value, which must be 0.
    • For message fields, the field is not set. Its exact value is language-dependent. See the generated code guide for details.
  • GO Generated Code
    • Case-conversion for Go generated code works as follows:
      1. The first letter is capitalized for export. If the first character is an underscore, it is removed and a capital X is prepended.
      2. If an interior underscore is followed by a lower-case letter, the underscore is removed, and the following letter is capitalized.

      :warning: Note that the generated Go field names always use camel-case naming, even if the field name in the .proto file uses lower-case with underscores (as it should). The proto field foo_bar_baz becomes FooBarBaz in Go, and _my_field_name_2 becomes XMyFieldName_2.
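
      The case-conversion rules above can be sketched in TypeScript. This is an illustrative re-implementation of the described rules, not the actual protoc-gen-go code; the function name is made up.

      ```typescript
      // Sketch of the described Go case-conversion rules (illustrative only).
      function goFieldName(protoName: string): string {
        let out = "";
        let i = 0;
        if (protoName[0] === "_") {
          out += "X"; // leading underscore: removed, capital X prepended
          i = 1;
        }
        let capitalizeNext = true; // the first letter is capitalized for export
        for (; i < protoName.length; i++) {
          const ch = protoName[i];
          if (ch === "_" && /[a-z]/.test(protoName[i + 1] ?? "")) {
            // interior underscore followed by a lower-case letter:
            // drop the underscore and capitalize that letter
            capitalizeNext = true;
            continue;
          }
          out += capitalizeNext ? ch.toUpperCase() : ch;
          capitalizeNext = false;
        }
        return out;
      }
      ```

      For example, goFieldName("foo_bar_baz") yields "FooBarBaz" and goFieldName("_my_field_name_2") yields "XMyFieldName_2", matching the examples above.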

Understanding Create API Data Cleanup

Because protobuf files are strictly typed and there is no null defined in protobufs, objects communicated over gRPC may have many attributes whose values are empty strings ("") or empty arrays ([]). This data is processed through a function (cleanUpForWrite) that strips out all attributes that are empty strings or empty lists.
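
A minimal sketch of such a cleanup might look like the following. The real cleanUpForWrite lives in this repo and may differ; this illustrative version only handles top-level attributes.

```typescript
// Illustrative sketch: strip attributes whose values are empty strings or empty arrays.
function cleanUpForWrite<T extends Record<string, unknown>>(obj: T): Partial<T> {
  const out: Partial<T> = {};
  for (const [key, value] of Object.entries(obj)) {
    if (value === "") continue;                               // drop empty strings
    if (Array.isArray(value) && value.length === 0) continue; // drop empty arrays
    out[key as keyof T] = value as T[keyof T];
  }
  return out;
}
```

For example, cleanUpForWrite({ title: "clip", note: "", tags: [] }) keeps only the title attribute.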

How to Use the Generated Clients

Docker based flow

Base commands

npm run docker:login:devOps

To log into the DevOps AWS Account Elastic Container Registry (ECR) so you can pull the PlutoTV base NodeJS image (e.g.: 972525491246.dkr.ecr.us-east-1.amazonaws.com/pluto-node:14.20.0-buster-slim).

npm run docker:build

Uses ./Dockerfile to generate the docker image service-dal.

npm run docker:test

Runs all tests for the service, launching mongodb in a docker container.

npm run docker:start

This command runs service-dal in a docker container in the background.

npm run docker:dev:mongodb

Starts only mongodb in a docker container.

npm run docker:stop

Stops mongodb and service-dal containers.

Environment Variables

:warning: Environment variables are case-sensitive.

| Environment Variable | Description |
|-----------------------------------|-------------|
| PLUTO_DAL_WRITE_TOKENS | [REQUIRED] (String) Comma-separated list of registered auth tokens allowing create, update, delete operations from the DB. |
| PLUTO_OPENTELEMETRY_ENDPOINT | Reachable URL of the collector. Used by lib-observability-js. |
| PLUTO_OPENTELEMETRY_TRACING_DEBUG | Allows turning on Open Telemetry debug for Tracing. Supported values: true, TRUE, 1, false, FALSE, 0. Debugging is disabled by default. Used by lib-observability-js. |
| USE_MOCK_KUE_RUNNER | Needs to be set when running tests so that inf-lib-schemas does not require a redis instance. Supported values: true or "". Redis is required by inf-lib-schemas by default. |
| outputTarget | If set to STDOUT then the logs will be formatted. PrettyPrint is set to true. Used by lib-observability-js. |
| DISABLE_DYNAMIC_CONFIG | Used to disable dynamic configuration; usually inside k8s. |
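
The boolean-style variables above (e.g. PLUTO_OPENTELEMETRY_TRACING_DEBUG) accept true, TRUE, 1, false, FALSE, 0 and default to disabled. An illustrative parser is sketched below; the helper name is an assumption, not from lib-observability-js.

```typescript
// Illustrative: interpret a boolean-style environment variable.
// true/TRUE/1 enable the flag; anything else (or unset) leaves it disabled.
function parseBoolEnv(value: string | undefined): boolean {
  if (value === undefined) return false; // disabled by default
  return ["true", "1"].includes(value.toLowerCase());
}
```

Usage would look like parseBoolEnv(process.env.PLUTO_OPENTELEMETRY_TRACING_DEBUG).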

Software Development Lifecycle

Branching

  • All feature branches should be named with the following structure: feature/{JIRA_ISSUE}.
  • All hotfix branches should be named with the following structure: hotfix/{JIRA_ISSUE}.
  • All bugfix branches should be named with the following structure: bugfix/{JIRA_ISSUE}.

An example would be feature/ARCH-1234.
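
The branch-naming convention above can be checked with a simple pattern. This sketch assumes JIRA issue keys look like ARCH-1234 (uppercase project key, dash, number); it is illustrative, not part of the repo tooling.

```typescript
// Illustrative branch-name check for feature/hotfix/bugfix branches.
const BRANCH_PATTERN = /^(feature|hotfix|bugfix)\/[A-Z][A-Z0-9]*-\d+$/;

function isValidBranchName(name: string): boolean {
  return BRANCH_PATTERN.test(name);
}
```

isValidBranchName("feature/ARCH-1234") is true, while isValidBranchName("feature/arch-1234") is false.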

Changelog

Each newly added feature, bugfix, and hotfix should be added to the service CHANGELOG.md under the # New Version heading. A separate changelog is maintained for the NodeJS gRPC client located here. Some changes apply to the service only, some to the client only and others to both.

Commit Messages

All commits in each branch should have the following structure:

{JIRA_ISSUE}: <Text describing the commit purpose>

An example would be: ARCH-1234: Provided updates to the clips API to allow creation of a clip.

Pull Requests

Create a PR to merge the branch to develop. PR title associated with a ticket should correspond to the following pattern:

[TICKET] <Description>

Versioning

Multiple locations need to be updated when versioning the service.

To set the version prior to a release, a utility exists to update these locations, where V is the new version:

V=1.15.1 make set_version

Service Deployment

This project uses Harness for deployment. Generic deployments are run by Harness after detecting a new docker image has been pushed to the DevOps Account ECR. Helm charts included in this Repo provide instructions for the deployment. The environment the service will be deployed to depends upon the tag of the docker image. A new docker image is created with each push to this repo. The tag of the docker image is based on the git branch name with the commit hash.

| Branch/Tag | Sample Image Tag | Harness Target EKS Cluster |
|------------|------------------|----------------------------|
| (branch) develop | develop-3cf4cdc5d519beeda8e0926baf4e6b018f1d53bd | Nonprod:k8s-non-prod-main |
| (branch) feature/ARCH-1234 | feature-ARCH-1234-3cf4cdc5d519beeda8e0926baf4e6b018f1d53bd | (none - image built only) |
| (branch) release/1.1.0 | release-1.1.0-3cf4cdc5d519beeda8e0926baf4e6b018f1d53bd | Preprod:blue/green/tan |
| (tag) 1.1.0 | 1.1.0-3cf4cdc5d519beeda8e0926baf4e6b018f1d53bd | Prod:blue/green, Nonprod (all dal namespaces) |
| (tag) @pluto-tv/service-dal@1.1.0 | 1.1.0-3cf4cdc5d519beeda8e0926baf4e6b018f1d53bd | Prod:blue/green, Nonprod (all dal namespaces) |

Production

The DAL Service can be deployed to production by creating a tag that conforms to the following pattern:

  • @plutotv/service-dal@{major}.{minor}.{patch}
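
A tag of this form can be parsed with a sketch like the following (illustrative only; not part of the deployment tooling):

```typescript
// Illustrative parser for @plutotv/service-dal@{major}.{minor}.{patch} tags.
function parseReleaseTag(
  tag: string
): { major: number; minor: number; patch: number } | null {
  const m = tag.match(/^@plutotv\/service-dal@(\d+)\.(\d+)\.(\d+)$/);
  if (m === null) return null; // not a production deployment tag
  return { major: Number(m[1]), minor: Number(m[2]), patch: Number(m[3]) };
}
```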

Sandbox

Follow the instructions at https://plutotv.atlassian.net/wiki/spaces/IS/pages/2587328537/Deploying+to+Sandbox+sandbox-use1-6 for deploying to sandbox-use1-6.

Tags are added by creating a GitHub release for the NodeJS Client. A summary of the changelog in the release must be supplied, even for prerelease versions. Non-prerelease versions must have a summary of all the changes since the last non-prerelease.

NodeJS gRPC Client Publishing

See the NodeJS gRPC Client README.

Running locally

Run locally with real data

In order to run locally with real data, one needs access to the mongodb personal data set. Instructions for this data set are included in the readme for the linked repo. One will also need this, or something similar, in the aws config at ~/.aws/config. The following role is needed:

[profile ecr-role]
source_profile = default
role_arn = arn:aws:iam::157385605725:role/AssumeRoleECROps

One can activate or log in to assume this role using:

npm run docker:login:root

After this (please be sure that you have a service-dal image built locally), you can launch the following command, which starts a mongodb with a slice of real data along with the running service:

npm run docker:start

If you want to run directly with npm start script, you can run the following command (or create an alias for it to make it easier to remember):

LOG_LEVEL=debug PLUTO_DAL_WRITE_TOKENS=testToken PLUTO_OPENTELEMETRY_ENDPOINT="http://otel-collector:9411" USE_MOCK_KUE_RUNNER=true npm start

Once the command finishes, the DAL service will be accessible at 0.0.0.0:50051 and you can reach it using the options below.

Run tests in command line

:warning: An entry for mongodb needs to be added to your hosts file. This is because we use a mongodb container that is configured to be a replica set with the mongodb address being mongodb.

  127.0.0.1     mongodb

Local tests can be run with the following:

npm run test

Please make sure mongodb is running in a docker container and listening on port 27018 (npm run docker:dev:mongodb).

Using BloomRPC

Install BloomRPC following the instructions on its project page.

As per those instructions, select the proto files from the project, e.g.: services/channels/channels.proto. And don't forget to fill in the metadata in the tab at the bottom:

{"app_name": "token"}

You are then free to 'play' with the service methods.

Generate gRPC Clients

Instructions on generating gRPC clients or language-specific client files.

:warning: An attempt to update all clients must be made before each merge into develop to ensure client files and server protobufs match.

Generate GoLang Client

First, you need to use the supported version of GoLang specified in the go.mod file. If it is not installed, gvm will ask you to run an install command.

gvm use go1.17

Install the prerequisites for generating the Go gRPC client.

make setup_protoc

Generate the gRPC Go Client files using one of these two options:

  1. npm script.
    npm run client:golang:generate
  2. Directly.
    make gen_proto

Generate Java Client

First, you need a recent version of Java installed (e.g., 20.0.1). Second, you need a recent version of Maven installed (e.g., 3.9.1). If these are not installed, the codegen will not work.

You can find Java installation files at Java downloads, and Maven installation files at Maven downloads.

Generate the gRPC Java Client files using one of these two options:

  1. npm script.

npm run client:java:generate

  2. Directly.

make gen_proto

Generate Nodejs Client

npm run client:nodejs:generate

Generate All Clients

npm run client:generate

OpenApi Integration

At this time service-dal exposes a gRPC interface and no http gateway. OpenApi integration is done for consistency with other Pluto services and as an informational resource. OpenApi doc creation is run automatically via GitHub Actions.

You can find the DAL SwaggerHub here: SwaggerHub service-dal

OpenApi Development

To modify or add an api, you should also update the corresponding services/{service}.cfg.yaml file with the new endpoint. At this time we use an external config instead of annotations, as we don't actually serve an http gateway.

Due to limitations with the google.api.Service config.yaml, it is necessary to mark an api deprecated in the openapi_config.yaml.

Interacting with SwaggerHub requires an api key. You can find this in SwaggerHub and should expose it in your environment as SWAGGERHUB_API_KEY.

To generate and upload the swagger docs use make.

make

Contacts