# @adia-ai/a2ui-compose
Framework-agnostic UI generation engine. Constitution doc: `docs/specs/compose-constitution.md`. Takes a natural-language intent and an A2UI component catalog and produces a tree of A2UI protocol messages ready for a renderer.

This package is pipeline runtime only. UI components live in `@adia-ai/web-components`; the A2UI runtime (renderer, registry, streams, wiring) in `@adia-ai/a2ui-runtime`; the pattern corpus in `@adia-ai/a2ui-corpus`; the MCP server in `@adia-ai/a2ui-mcp`. Published to the public `@adia-ai` scope on 2026-04-24 alongside `a2ui-corpus`, `a2ui-mcp`, `a2ui-retrieval`, and `a2ui-validator`.
## Install

```sh
npm install @adia-ai/a2ui-compose
```

Typically paired with `@adia-ai/a2ui-corpus` (the pattern corpus the engine reads from) and `@adia-ai/llm` (the LLM client; runtime peer-dep):

```sh
npm install @adia-ai/a2ui-compose @adia-ai/a2ui-corpus @adia-ai/llm
```

## What it does

```
intent ─▶ classify ─▶ retrieve ─▶ compose / adapt ─▶ validate ─▶ A2UI
          (concepts)  (patterns)  (engine-specific)  (score ≥70)  JSON
```

One entry point, two generation strategies, pluggable LLM back-end.
```js
import { generateUI } from '@adia-ai/a2ui-compose/core';

const result = await generateUI({
  intent: 'login form with email, password, and remember-me',
  engine: 'zettel',  // 'monolithic' | 'zettel'
  mode: 'pro',       // monolithic only: instant | pro | thinking
  model: 'claude-sonnet-4-7',
});

// result.components — A2UI message array
// result.validation — { score, checks, warnings }
// result.debug      — pattern matches, LLM prompt, token usage, …
```

## Layout
```
a2ui-compose/
├── core/                    orchestrator — state, dispatch, pipeline (formerly engine/)
│   ├── generator.js         generateUI() — the one public entry point
│   ├── state.js             ArtifactStore + PipelineEngine singletons
│   └── pipeline/            6-stage pipeline engine
│
├── strategies/              pluggable engines via registerEngine() (formerly engines/)
│   ├── registry.js          engine selector + reserved-name guard
│   ├── monolithic/          pattern-match + LLM-adapt (3 modes)
│   │   ├── generate-instant.js    no LLM — pattern-match only
│   │   ├── generate-pro.js        pattern + LLM adaptation, non-streaming
│   │   └── generate-thinking.js   streaming LLM + repair loop
│   └── zettel/              fragment-graph composition
│       ├── generator-adapter.js   entry point
│       ├── composer.js            assembles fragments → compositions
│       └── session-store.js       multi-turn state (Phase A)
│
├── retrieval/
│   ├── catalog.js           loads component schemas from sibling .a2ui.json
│   ├── pattern-library.js   keyword-ranked pattern search (corpus + embeddings)
│   ├── fragments.js         atomic-shape lookup for zettel
│   ├── anti-patterns.js     catalog of canonical anti-patterns
│   └── feedback-store.js    accumulates user feedback → disk
│
├── llm/
│   ├── llm-bridge.js        unified adapter (Anthropic / OpenAI / Gemini)
│   ├── env.js               Vite + Node env-var routing
│   └── prompts/             system prompts per engine mode
│
├── validation/
│   └── validator.js         15-check A2UI validator, weighted 0–100 score
│
├── intelligence/            intent classification + concept extraction
│   ├── classifier.js
│   ├── concepts.js
│   └── steelman.js
│
└── evals/
    └── harness.mjs          held-out intent benchmark runner
```

## Engines
**Monolithic** — pattern-match against full-canvas templates, optionally adapt via LLM. Three modes:
| Mode | LLM? | Speed | Use for |
|------------|------|--------|-------------------------------------------------|
| instant | no | <50ms | High-confidence intents with exact pattern hit |
| pro | yes | ~2s | Most requests — adapt a template to the intent |
| thinking | yes | ~5s | Complex requests; streams + runs a repair loop |
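How a caller might route intents across these three modes can be sketched as follows. This helper is hypothetical (not part of the package API), and the confidence threshold is an assumption:

```js
// Hypothetical routing helper — not part of the package API. The 0.9
// pattern-confidence threshold is an illustrative assumption.
function pickMode({ patternConfidence, complex }) {
  if (complex) return 'thinking';                  // streams + runs a repair loop
  if (patternConfidence >= 0.9) return 'instant';  // exact pattern hit, no LLM
  return 'pro';                                    // default: LLM-adapt a template
}
```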
**Zettel** — fragment-graph composition. Retrieves atomic fragments (form-field, card-header, action-row, …) by keyword + concept-tag overlap and assembles them into compositions. Verbatim retrieval above a threshold (score ≥ 40); LLM synthesis from fragments below. Preserves session state across multi-turn iterations.
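The retrieval decision can be sketched like this. Field names and weights are illustrative assumptions; only the score ≥ 40 threshold comes from the engine description above:

```js
// Illustrative sketch of zettel fragment scoring — fragment/intent field
// names and the per-hit weights are assumptions, not the real implementation.
function scoreFragment(fragment, intent) {
  const kwHits = fragment.keywords.filter((k) => intent.keywords.includes(k)).length;
  const tagHits = fragment.conceptTags.filter((t) => intent.concepts.includes(t)).length;
  return kwHits * 10 + tagHits * 15; // keyword + concept-tag overlap
}

// Verbatim retrieval above the threshold; LLM synthesis from fragments below.
const strategyFor = (score) => (score >= 40 ? 'verbatim' : 'llm-synthesis');
```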
Custom engines plug in via `registerEngine()`:

```js
import { registerEngine } from '@adia-ai/a2ui-compose/strategies/registry';

registerEngine('my-engine', async (ctx) => {
  // ctx: intent, catalog, patterns, concepts, session, llm, …
  return { components: [...], validation: {...}, debug: {...} };
});
```

Reserved names: `monolithic`, `monolithic-*`, `zettel`, `mcp`.
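The reserved-name guard amounts to something like the sketch below. The real logic lives in `strategies/registry.js` and may differ in detail:

```js
// Minimal sketch of the reserved-name guard described above — not the
// actual registry.js implementation.
const engines = new Map();
const RESERVED = ['zettel', 'mcp'];

function registerEngineSketch(name, factory) {
  if (RESERVED.includes(name) || name === 'monolithic' || name.startsWith('monolithic-')) {
    throw new Error(`engine name "${name}" is reserved`);
  }
  engines.set(name, factory);
}
```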
## Validation

Every generated output runs through `validation/validator.js` — 15 weighted checks covering structural validity, card/grid conventions, intent alignment (F1), and anti-patterns. Result:

```js
{
  score: 92,     // 0-100
  passed: true,  // score ≥ 70
  checks: [
    { id: 'structure',    ok: true,  weight: 10 },
    { id: 'intent-f1',    ok: true,  weight: 8, value: 0.84 },
    { id: 'card-grid',    ok: true,  weight: 6 },
    { id: 'anti-pattern', ok: false, weight: 4, hit: 'chart-legend' },
    …
  ]
}
```

`_fallback` surfaces score 0 by design — ensure the engine returns real output, not a safety net.
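One way such a weighted 0–100 score could be derived from the checks array — a sketch under assumptions, not the validator's actual formula:

```js
// Sketch: fraction of total check weight that passed, scaled to 0–100.
// Assumes each check carries { ok, weight }; not the validator's real formula.
function weightedScore(checks) {
  const total = checks.reduce((sum, c) => sum + c.weight, 0);
  const earned = checks.reduce((sum, c) => sum + (c.ok ? c.weight : 0), 0);
  const score = Math.round((earned / total) * 100);
  return { score, passed: score >= 70 };
}
```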
## LLM bridge

Multi-provider adapter with a common interface:

```js
import { getAdapter } from '@adia-ai/llm';

const adapter = getAdapter('anthropic'); // or 'openai', 'gemini'
const stream = await adapter.streamChat({ model, messages, tools });
```

Env-var routing goes through `llm/env.js` — it works under both Node (`process.env`) and Vite (`import.meta.env`). Browser calls proxy through `server.js` at the repo root (which holds the API keys); Node calls go direct.
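The dual-environment lookup can be sketched as below. This is in the spirit of `llm/env.js`, not its actual code; the `metaEnv` parameter stands in for `import.meta.env`, which plain Node cannot parse inline:

```js
// Sketch of Vite + Node env-var routing. metaEnv is a stand-in for
// import.meta.env; names and fallback order are assumptions.
function readEnv(name, metaEnv) {
  if (metaEnv && name in metaEnv) return metaEnv[name];                        // Vite build
  if (typeof process !== 'undefined' && process.env) return process.env[name]; // Node
  return undefined; // plain browser: route through the server.js proxy instead
}
```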
## Evals

```sh
npm run evals                          # held-out intent benchmark
npm run eval:diff -- --engine zettel   # diff against baseline
```

The held-out fixture lives in `gen-ui-training/evals/held-out.jsonl`. Regression thresholds the pipeline must hold:

- Zettel: coverage 100%, avgScore ≥ 88, MRR ≥ 0.94
- Monolithic: coverage 100%, avgScore ≥ 95
- Fragment reuse ratio ≥ 29.9% (167 refs / 559 nodes)

Full gate sweep: see `AGENTS.md` at repo root, or run `/verification-sweep`.
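A gate check over these thresholds might look like the sketch below. The metric field names are assumptions; the real runner is `evals/harness.mjs`:

```js
// Illustrative gate check mirroring the thresholds above — field names are
// assumptions, not the harness's actual report shape.
function failedGates(r) {
  const gates = [
    ['zettel coverage',     r.zettel.coverage >= 1.0],
    ['zettel avgScore',     r.zettel.avgScore >= 88],
    ['zettel MRR',          r.zettel.mrr >= 0.94],
    ['monolithic coverage', r.monolithic.coverage >= 1.0],
    ['monolithic avgScore', r.monolithic.avgScore >= 95],
    ['fragment reuse',      r.fragmentReuse >= 0.299],
  ];
  return gates.filter(([, ok]) => !ok).map(([name]) => name);
}
```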
## Gotchas

- Component catalog is read-only. `.a2ui.json` sidecars in `web-components/components/*/` are build outputs; edit the sibling YAML instead.
- Zettel loading is lazy. The corpus is only parsed on the first zettel call — avoids Node `fs`/`path` imports reaching the browser bundle.
- Validator score ≥ 70 is required for downstream consumers to trust the output. Below that, callers should treat the tree as advisory.
- Engine name reservations are enforced at registration time — `registerEngine('zettel-v2', …)` passes; `registerEngine('zettel', …)` throws.
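The lazy-loading gotcha amounts to a cached one-shot load, roughly as sketched here (names hypothetical):

```js
// Lazy, one-shot corpus loading as described in the gotchas — a sketch with
// hypothetical names. Deferring the load keeps Node-only fs/path imports
// out of the browser bundle.
let corpusPromise = null;

function loadCorpusOnce(loader) {
  if (!corpusPromise) corpusPromise = loader(); // parse on first zettel call only
  return corpusPromise;
}
```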
## License

MIT
