# zzz-data

Typed data toolkit for Zenless Zone Zero. This package bundles curated JSON, enums, and interfaces so applications can consume the game's encyclopedia data without scraping sites or cleaning spreadsheets themselves.
## What You Get

- **Structured datasets** – JSON exports for agents, Bangboos, W-Engines, Drive Discs, anomalies, and Deadly Assaults under `./data`.
- **TypeScript surface** – `dist/index.d.ts` exposes discriminated unions, shared enums, and helper constants such as `ATTRIBUTES`, `SPECIALTIES`, and `ATTACK_TYPES` (see the sketch after this list).
- **Stable identifiers** – Consistent keys generated from the workbook (`data.xlsx`) and the Hakush scraping pipeline (`data-hakush`).
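
As a small illustration of that TypeScript surface, the sketch below groups the `agents` export by specialty. The `specialty` field name and the `SPECIALTIES.ATTACK` key are assumptions made for illustration; check `dist/index.d.ts` for the real shapes.

```ts
import type { Agent } from "zzz-data"
import { agents, SPECIALTIES } from "zzz-data"

// Bucket agents by specialty for quick lookups.
// NOTE: `agent.specialty` is an assumed field name.
const bySpecialty = new Map<string, Agent[]>()
for (const agent of agents) {
  const bucket = bySpecialty.get(agent.specialty) ?? []
  bucket.push(agent)
  bySpecialty.set(agent.specialty, bucket)
}

// NOTE: the `ATTACK` key is likewise assumed.
const attackers = bySpecialty.get(SPECIALTIES.ATTACK) ?? []
```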
## Installation

```sh
pnpm add zzz-data
```

The package is workspace-linked here, but once published it can be installed like any other npm dependency. Node.js 20+ is recommended.
## Usage

```ts
import type { Agent } from "zzz-data"
import { agents, ATTRIBUTES } from "zzz-data"

type ElectricAgent = Agent & { attribute: typeof ATTRIBUTES.ELECTRIC }

const electricAgents: ElectricAgent[] = agents.filter(
  (agent): agent is ElectricAgent => agent.attribute === ATTRIBUTES.ELECTRIC,
)
```

Each resource (`agents`, `bangboos`, etc.) is exported as a strongly typed array. Constants live under `zzz-data/src/constants`, and the individual JSON files can be imported directly via the `./data/*` export map if you prefer raw objects.
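
A direct import through that export map might look like the following; the exact filename under `./data/*` is an assumption, and TypeScript consumers may need `resolveJsonModule` enabled in `tsconfig.json`.

```ts
// Hypothetical dataset path: "agents.json" is assumed, not confirmed.
import rawAgents from "zzz-data/data/agents.json"

// The typed `agents` export mirrors this data, so the shapes should match.
console.log(rawAgents.length)
```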
## Scripts

| Command | Purpose |
| --- | --- |
| `pnpm run scrape` | Launch Puppeteer, crawl https://zzz3.hakush.in, and store fresh raw assets in `data-hakush`. |
| `pnpm run generate` | Parse `data.xlsx` plus the scraped assets and regenerate the normalized JSON files under `data`. |
| `pnpm run build` | Compile the TypeScript sources with tsdown into `dist` (JS + declaration files). |
| `pnpm run release` | Publish the package (expects the correct registry auth to be configured). |
## Data Pipeline

- **Source updates** – Refresh raw Hakush payloads with `pnpm run scrape`, or edit `data.xlsx` manually for quick fixes.
- **Normalize** – Run `pnpm run generate` to map localized strings to canonical keys, attach icons, and emit denormalized arrays.
- **Ship** – Execute `pnpm run build` so consumers (including the server workspace package) see the updated exports in `dist` (full sequence below).
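
Put together, a full refresh is just the three documented scripts run in order:

```sh
pnpm run scrape    # refresh raw Hakush payloads in data-hakush
pnpm run generate  # rebuild the normalized JSON under data
pnpm run build     # recompile dist so consumers pick up the changes
```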
Intermediate artifacts generated by the scraper live in `data-hakush`; the distributable JSON lives in `data`. Check both directories into source control to keep provenance clear.
## Directory Layout

```
src/
  constants/   # ATTRIBUTES, SPECIALTIES, ATTACK_TYPES, FACTIONS
  types/       # Agent, Bangboo, W-Engine, Drive Disc, etc. definitions
  index.ts     # Aggregated exports for constants, types, and JSON data
scripts/
  scraper/     # Puppeteer + cheerio crawler for Hakush
  generate.ts  # ExcelJS workflow that regenerates JSON payloads
```

## Updating Content
- Prefer adding new columns or sheets to `data.xlsx` rather than hand-editing JSON.
- If the Hakush DOM changes, adjust the selectors in `scripts/scraper` and re-run `pnpm run scrape` (see the sketch after this list).
- Keep PRs focused: include regenerated JSON alongside the source changes so reviewers can verify the pipeline output.
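
For reference, a selector tweak in the scraper might look roughly like the sketch below. It assumes the crawler hands fetched HTML to cheerio (per the `scripts/scraper` directory comment); the selector strings, function name, and file placement are hypothetical.

```ts
import * as cheerio from "cheerio"

// Hypothetical extraction step: pull agent names from a listing page.
// If the Hakush DOM renames ".agent-card" or ".name", selectors like
// these are what need updating before re-running `pnpm run scrape`.
export function parseAgentNames(html: string): string[] {
  const $ = cheerio.load(html)
  return $(".agent-card .name")
    .map((_, el) => $(el).text().trim())
    .get()
}
```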
