# @lapidist/dtif-parser
Canonical parser and runtime for the Design Token Interchange Format (DTIF). The package provides the reference pipeline for loading, validating, normalising, and resolving DTIF documents while emitting structured diagnostics for tooling and automation workflows.
Documentation: Using the DTIF parser
## Installation

```sh
npm install @lapidist/dtif-parser
```

The package targets modern Node runtimes (v22+) and is published as a native ESM module.
## Usage

```js
import { parseDocument } from '@lapidist/dtif-parser';

const result = await parseDocument('tokens.json');

for (const diagnostic of result.diagnostics) {
  console.error(`${diagnostic.severity}: ${diagnostic.message}`);
}

const resolved = result.resolver?.resolve('#/color/brand/primary');
console.log(resolved?.value);
```

To flatten tokens, collect metadata, and normalise diagnostics in a single step, use the `parseTokens` helper. It loads the document, builds the dependency graph, and returns resolved token snapshots alongside a flattened view of the document.
```js
import { parseTokens } from '@lapidist/dtif-parser';

const { flattened, metadataIndex, resolutionIndex, diagnostics } = await parseTokens('tokens.json');

for (const token of flattened) {
  console.log(token.pointer, token.value);
}
```

Pass `onDiagnostic` to observe parser diagnostics as they are produced and `warn` to intercept non-fatal issues. Both callbacks receive domain `DiagnosticEvent` objects, so you can format or surface them immediately without waiting for the promise to resolve.
```js
await parseTokens('tokens.json', {
  onDiagnostic: (diagnostic) => {
    console.error(diagnostic.message);
  },
  warn: (diagnostic) => {
    console.warn('[warn]', diagnostic.message);
  }
});
```

Provide a `TokenCache` implementation, such as the built-in `InMemoryTokenCache`, to reuse flattening and resolution results across runs, or to enable synchronous parsing with `parseTokensSync` when your inputs are already available in memory.
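To illustrate the caching idea, here is a minimal in-memory cache sketch keyed by a document identity string. The method names (`get`/`set`) and the entry shape are assumptions for illustration, not the package's actual `TokenCache` interface.

```js
// Hypothetical sketch of a TokenCache-style store: entries are keyed
// by a document identity and hold previously computed artefacts so a
// warm parse of the same document can skip recomputation.
class SimpleTokenCache {
  constructor() {
    // identity string -> cached parse artefacts
    this.entries = new Map();
  }

  get(identity) {
    return this.entries.get(identity);
  }

  set(identity, entry) {
    this.entries.set(identity, entry);
  }
}

const cache = new SimpleTokenCache();
// The entry shape here is illustrative only.
cache.set('file:///tokens.json', { flattened: [], diagnostics: [] });
const hit = cache.get('file:///tokens.json');
```

A real cache implementation would also decide on an eviction policy; the package's own `InMemoryTokenCache` handles the details for you.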
Create a session with `createSession` to reuse caches, install custom document loaders, register plugins, or parse multiple collections with shared state.
## Document loader limits

The built-in `DefaultDocumentLoader` enforces a maximum document size to protect tooling from accidentally loading unbounded payloads. The default limit is 5 MiB. Provide the `maxBytes` option when constructing a loader or session to apply a stricter, positive byte threshold:
```js
import { createSession, DefaultDocumentLoader } from '@lapidist/dtif-parser';

const session = createSession({
  loader: new DefaultDocumentLoader({ maxBytes: 256 * 1024 })
});
```

Only finite, positive numbers are accepted. Any non-positive or non-finite value (such as `0`, negative numbers, `NaN`, or `Infinity`) is rejected with a `RangeError` at loader construction. When a payload exceeds the active limit, loading fails with a `DocumentLoaderError` whose `reason` is `MAX_BYTES_EXCEEDED` and whose `limit` property reflects the enforced cap.
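The validation and limit-enforcement behaviour described above can be sketched in a few lines. The class and function names below are simplified stand-ins that mirror the prose (`DocumentLoaderError`, `MAX_BYTES_EXCEEDED`); they are not the package's implementation.

```js
// Illustrative stand-in for the loader's error type described above.
class DocumentLoaderError extends Error {
  constructor(reason, limit) {
    super(`document loader error: ${reason}`);
    this.reason = reason;
    this.limit = limit;
  }
}

// Only finite, positive numbers are accepted at construction time;
// 0, negatives, NaN, and Infinity all fail this check.
function validateMaxBytes(maxBytes) {
  if (!Number.isFinite(maxBytes) || maxBytes <= 0) {
    throw new RangeError(`maxBytes must be a finite, positive number, got ${maxBytes}`);
  }
  return maxBytes;
}

// When a payload exceeds the active limit, fail with a reason code
// and expose the enforced cap on the error.
function checkPayload(bytes, maxBytes) {
  if (bytes.byteLength > maxBytes) {
    throw new DocumentLoaderError('MAX_BYTES_EXCEEDED', maxBytes);
  }
  return bytes;
}
```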
When enabling HTTP(S) loading, configure `httpAllowedHosts` to restrict which hosts may be fetched and `httpTimeoutMs` to bound remote requests. Requests to non-allow-listed hosts fail with a `DocumentLoaderError` whose `reason` is `HTTP_HOST_NOT_ALLOWED`, and hung requests are aborted with a `TimeoutError` driven by the loader's timeout.
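A simplified sketch of the allow-list check, assuming the list is matched against the URL's hostname. The error shape mirrors the prose, but the function itself is illustrative, not the loader's code.

```js
// Hypothetical allow-list guard: reject any URL whose hostname is not
// explicitly listed, using the reason code described above.
function assertHostAllowed(url, allowedHosts) {
  const { hostname } = new URL(url);
  if (!allowedHosts.includes(hostname)) {
    const error = new Error(`host not allowed: ${hostname}`);
    error.reason = 'HTTP_HOST_NOT_ALLOWED';
    throw error;
  }
  return hostname;
}

// A bounded remote request could then pair the guard with a standard
// timeout signal, e.g.:
//   fetch(url, { signal: AbortSignal.timeout(httpTimeoutMs) })
```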
Each pipeline stage emits domain `DiagnosticEvent` objects instead of throwing. Results aggregate every diagnostic (including cache hits) so tooling can stream warnings via the `onDiagnostic`/`warn` hooks, persist them for later inspection, or format them with `formatDiagnostic`.
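As a rough illustration of formatting a diagnostic for terminal output: the `severity` and `message` fields appear in the examples above, while the `pointer` field and the output layout here are assumptions; the package's actual `formatDiagnostic` export may render differently.

```js
// Illustrative formatter for DiagnosticEvent-like objects. The
// `pointer` field is a hypothetical location hint for this sketch.
function formatDiagnosticLine(diagnostic) {
  const location = diagnostic.pointer ? ` at ${diagnostic.pointer}` : '';
  return `[${diagnostic.severity}] ${diagnostic.message}${location}`;
}

const line = formatDiagnosticLine({
  severity: 'warning',
  message: 'alias target not found',
  pointer: '#/color/brand/primary'
});
```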
## Node adapter
For Node-based tooling, import the bundled adapter to read DTIF token files from disk with extension validation, formatted diagnostics, and ready-to-use token documents:
```js
import { parseTokensFromFile, readTokensFile } from '@lapidist/dtif-parser/adapters/node';

try {
  const result = await parseTokensFromFile('tokens/base.tokens.json', {
    onWarn: (message) => console.warn(message)
  });
  console.log(result.flattened.length);
} catch (error) {
  // DtifTokenParseError exposes the normalised diagnostics for reporting
}

const document = await readTokensFile('tokens/base.tokens.json');
```

## Architecture overview
- `createSession` coordinates the loader, schema guard, normaliser, graph builder, and resolver for each request. Sessions keep caches and plugins in sync across parses.
- Domain caches receive `RawDocumentIdentity` keys and ensure decoded bytes, AST snapshots, and flattened token artefacts can be reused safely between runs.
- Diagnostic events surface from every stage and persist in token cache entries, so warm parses provide the same visibility as cold runs.
- Helper APIs (`parseTokens`, `parseTokensSync`, `createMetadataSnapshot`, and `createResolutionSnapshot`) layer on snapshot builders without bypassing the session lifecycle.
## Development

- The parser guide's architecture section documents the current module layout, session lifecycle, and testing conventions that future roadmap work will build upon.
## Command line interface

The workspace publishes a `dtif-parse` binary for quick inspection and CI pipelines:

```sh
dtif-parse tokens/base.tokens.json --resolve "#/color/brand/primary"
```

Use `dtif-parse --help` for the full list of options and output formats.
## License
MIT © Lapidist
