
penguin-datalayer-collect

v1.0.0

Published

Data layer validation via a Google Cloud Function using DP6's penguin-datalayer-core validation module; the code processes the data layer JSON sent by GTM and persists the validation results to BigQuery.

Downloads

19

Readme

penguin-datalayer-collect

penguin-datalayer-collect is a module of the raft-suite ecosystem, created by DP6 to ensure data quality in the data engineering projects implemented for DP6's clients through monitoring and automated data pipelines.
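The actual validation is delegated to DP6's penguin-datalayer-core module. Purely as an illustration of the idea, a minimal required-keys check over data layer entries might look like the sketch below; the schema shape and function names here are assumptions for the example, not the module's API.

```javascript
// Illustrative schema: each data layer entry must have these keys with these types.
// (Hypothetical example; penguin-datalayer-core uses its own schema format.)
const schema = {
  event: 'string',        // every data layer push should carry an event name
  pageCategory: 'string'
};

function validateDataLayer(dataLayer, schema) {
  // Returns one result object per data layer entry.
  return dataLayer.map((entry, index) => {
    const errors = Object.entries(schema)
      .filter(([key, type]) => typeof entry[key] !== type)
      .map(([key, type]) => `"${key}" missing or not a ${type}`);
    return { index, valid: errors.length === 0, errors };
  });
}

const results = validateDataLayer(
  [{ event: 'page_view', pageCategory: 'home' }, { event: 42 }],
  schema
);
console.log(JSON.stringify(results, null, 2));
```

In the real pipeline this kind of result is what gets persisted to BigQuery, so failing entries can be monitored over time.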

Raft-suite ecosystem

DP6

Setup penguin-datalayer-collect

1. Usage requirements

1.1 GCP products

  • Cloud Storage
  • Cloud Functions
  • BigQuery
  • Service account

1.2 Local environment dependencies

  1. Google Cloud SDK
  2. The zip, unzip and curl packages
  3. A service account with the permissions Storage Object Admin, Cloud Functions Admin, BigQuery Admin and Service Account User
  4. The GOOGLE_APPLICATION_CREDENTIALS environment variable
  5. Terraform installed

Note: when using Google Cloud Shell, steps 1 and 4 are not necessary.

2. Installing penguin-datalayer-collect

Clone the git project:

git clone https://github.com/DP6/penguin-datalayer-collect.git

Deploy to GCP using Terraform:

sh terraform_deploy.sh

3. Configuring the tag in GTM

TODO

<script>
	/*
	* Tag responsible for sending the data layer to penguin-datalayer-collect
	*/
	analyticsHelper.safeFn('Penguin Datalayer Collect', function(helper) {
		// dataLayer array configured in GTM
		var body = dataLayer;

		// Decide whether this visitor is part of the validation sample
		habilitarAmostragemValidacao();

		if (helper.cookie('penguin_datalayer_collect') === 'true') {
			var request = new XMLHttpRequest();
			// The validation data can be enriched with business data sent as a query string
			request.open('POST', {{endpoint - penguin-datalayer - collect}} + '?schema=' + {{schema}}, true);
			request.setRequestHeader('Content-Type', 'application/json');
			request.onreadystatechange = logHttpResponse;
			request.send(JSON.stringify(body));
		}

		// Tags a small percentage of visitors with a one-day sampling cookie
		function habilitarAmostragemValidacao() {
			function random(min, max) {
				min = Math.ceil(min);
				max = Math.floor(max);
				return Math.floor(Math.random() * (max - min)) + min;
			}

			var sample = 1; // sampling rate, in percent
			var domain = {{Cookie - Domínio}} ? {{Cookie - Domínio}} : 'auto';

			if (!helper.cookie('penguin_datalayer_collect')) {
				if (random(0, 100) <= sample) {
					helper.cookie('penguin_datalayer_collect', 'true', {'exdays': 1, 'domain': domain});
				} else {
					helper.cookie('penguin_datalayer_collect', 'false', {'exdays': 1, 'domain': domain});
				}
			}
		}

		function logHttpResponse() {
			if ({{Debug Mode}}) {
				console.log('Penguin-datalayer-collect - Status: ', this.status);
				console.log('Penguin-datalayer-collect - Object dataLayer:', window.dataLayer);
				console.log(JSON.stringify(window.dataLayer));
			}
		}
	});
</script>
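The sampling logic in the tag can be exercised outside GTM. The standalone sketch below (plain Node, no GTM variables or cookie helper) reproduces the decision rule. One subtlety worth noting: `random(0, 100)` returns an integer in [0, 100), and the test is `<= sample`, so with `sample = 1` the values 0 and 1 both pass, meaning roughly 2% of visitors are sampled rather than 1%.

```javascript
// Standalone sketch of the tag's sampling decision (no GTM, no cookies).
function random(min, max) {
  min = Math.ceil(min);
  max = Math.floor(max);
  return Math.floor(Math.random() * (max - min)) + min; // integer in [min, max)
}

// Same rule as the tag: a visitor is sampled when random(0, 100) <= sample.
function isSampled(sample) {
  return random(0, 100) <= sample;
}

// Estimate the effective rate: with sample = 1, draws 0 and 1 both pass,
// so about 2 of every 100 visitors end up with the sampling cookie set to 'true'.
let hits = 0;
const trials = 100000;
for (let i = 0; i < trials; i++) {
  if (isSampled(1)) hits++;
}
console.log('observed rate:', (100 * hits / trials).toFixed(2) + '%');
```

If exactly 1% is intended, the tag's condition would need to be `random(0, 100) < sample` instead.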

4. Enriching the data with business information

TODO

5. Creating the monitoring dashboard

TODO

6. How to contribute

TODO

6.1 References

  • https://www.conventionalcommits.org/en/v1.0.0/
  • https://github.com/semantic-release/semantic-release

Powered by: DP6 Koopa-troopa Team | Support: [email protected]