# @lovision/plugin-host

v1.1.0
Main-thread runtime that owns one plugin worker.
Published from the main branch through npm Trusted Publishing.
Provides:

- `PluginRuntimeHost` — start/call/terminate, version handshake, heartbeat + watchdog, bidirectional RPC dispatch, abort propagation, 8-code error mapping
- `MessageChannelTransport` — structuredClone-based postMessage wrapper (ADR-001)
- `HeartbeatMonitor` — host-side ticker covering both heartbeat freshness (`heartbeat-timeout`) and ping/pong watchdog (`watchdog-timeout`)
## Minimal usage
```ts
import { PluginRuntimeHost } from "@lovision/plugin-host";

const host = new PluginRuntimeHost({
  workerUrl: new URL("./my-plugin.worker.ts", import.meta.url),
  pluginId: "demo-plugin",
  apiVersion: "1.0",
  dispatchTable: {
    // host capabilities the worker can call (Step 3 will codegen these)
    "ping": () => ({ pong: true }),
  },
});

await host.start(); // resolves once init-ack is received
const result = await host.call("echo", { value: 42 });
host.terminate();
```

## Tunables
All timing constants are injectable through options so tests can reproduce
heartbeat / watchdog crashes in <100 ms (V2 spec §11.7.1):
| option | default | role |
| --- | --- | --- |
| `heartbeatIntervalMs` | 1000 | how often the worker is asked to send a heartbeat |
| `heartbeatTimeoutMs` | 5000 | crash if no heartbeat is received for this long |
| `watchdogIntervalMs` | 5000 | watchdog tick cadence |
| `watchdogTimeoutMs` | 200 | per-ping pong reply window |
| `maxWatchdogFailures` | 3 | cumulative pong failures that crash the worker |
| `rpcTimeoutMs` | 30000 | default per-RPC timeout |
| `initTimeoutMs` | 5000 | how long `start()` waits for init-ack before rejecting |
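As a rough illustration of how these tunables interact, here is a simplified liveness check. The `HostTimers` shape, `checkLiveness` name, and crash decision are hypothetical sketches of the behavior described above, not the package's actual internals:

```ts
// Hypothetical, simplified liveness decision combining the two timeout paths.
interface HostTimers {
  heartbeatTimeoutMs: number;
  maxWatchdogFailures: number;
}

type CrashReason = "heartbeat-timeout" | "watchdog-timeout" | null;

function checkLiveness(
  timers: HostTimers,
  now: number,
  lastHeartbeatAt: number,
  pongFailures: number,
): CrashReason {
  // Heartbeat freshness: the worker must have sent a heartbeat recently.
  if (now - lastHeartbeatAt > timers.heartbeatTimeoutMs) {
    return "heartbeat-timeout";
  }
  // Watchdog: cumulative missed pongs past the threshold crash the worker.
  if (pongFailures >= timers.maxWatchdogFailures) {
    return "watchdog-timeout";
  }
  return null; // worker considered healthy
}
```

Shrinking these values in tests is what makes a watchdog crash reproducible in well under 100 ms.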
## PluginManager (Step 2)
`PluginManager` orchestrates multi-plugin install/list/invoke on top of
`PluginRuntimeHost`. Each `invoke()` spawns a fresh worker (Figma-aligned
"no state across invocations") and the SDK's `__runCommand` handler routes
to the plugin's command callback.
```ts
import { PluginManager } from "@lovision/plugin-host";

const manager = new PluginManager({ locale: "zh-CN" });

const id = await manager.install({
  packageFormat: "instinct-plugin@1",
  manifest: { /* V2 §9 manifest */ },
  mainBundle: "/* worker source */",
  __testWorkerUrl: workerUrl, // Step 8 will wire production bundle->Blob URL
});

manager.list();     // [{ id, manifest, installWarnings }]
manager.commands(); // [{ pluginId, commandId, displayName }]

const result = await manager.invoke(id, "run", { x: 1 });
await manager.uninstall(id); // also terminates any active worker
```

Same-id reinstall replaces the previous record after terminating any
in-flight workers. Validation errors throw `ManifestValidationError` with
structured `errors[]` / `warnings[]` arrays for the install confirm UI.
See docs/lifecycle.md for the host state machine
and the `on("statusChange", cb)` observer contract.
## Manifest + Bundle loaders (Step 2)
```ts
import { validateManifest, loadBundle } from "@lovision/plugin-host";

validateManifest(rawJson);
// -> { ok: true, manifest, warnings } | { ok: false, errors, warnings }

loadBundle(jsonString);
// -> { ok: true, bundle, manifest, warnings } | { ok: false, errors, warnings }
```

Each diagnostic carries `path`, `code`, `message`, `severity`. The 8-step
pipeline (V2 spec §9.5) short-circuits on hard fails so dev panel UIs see
the first blocker rather than a cascade.
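The short-circuit idea can be sketched as follows. The `Diagnostic` fields mirror the ones named above; the `Stage`/`runPipeline` names and the stage mechanics are illustrative, not the loader's real implementation:

```ts
// Illustrative short-circuit pipeline: each stage emits diagnostics,
// and the first stage that produces a hard error stops all later stages.
interface Diagnostic {
  path: string;
  code: string;
  message: string;
  severity: "error" | "warning";
}

type Stage = (input: unknown) => Diagnostic[];

function runPipeline(stages: Stage[], input: unknown) {
  const errors: Diagnostic[] = [];
  const warnings: Diagnostic[] = [];
  for (const stage of stages) {
    for (const d of stage(input)) {
      (d.severity === "error" ? errors : warnings).push(d);
    }
    if (errors.length > 0) break; // report the first blocker, not a cascade
  }
  return errors.length > 0
    ? { ok: false as const, errors, warnings }
    : { ok: true as const, warnings };
}
```

A UI consuming this shape can render `errors[0]` as the single actionable blocker while still listing accumulated warnings.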
## Facade v0 (Step 3)
`./facade/` is the worker→host RPC surface. Step 3 wires the first
set of real capabilities on top of an `EditorBackend` abstraction:
```ts
import {
  PluginManager,
  FakeEditorBackend,
  type NotifyEvent,
} from "@lovision/plugin-host";

const backend = new FakeEditorBackend({ initialSelection: ["a", "b"] });
const manager = new PluginManager({
  editorBackend: backend,
  notifySink: (e: NotifyEvent) => console.log(e.pluginId, e.message),
});

await manager.install(bundle);
await manager.invoke(pluginId, "run", { /* ... */ });

backend.getSelection();  // worker-driven selection state
backend.receivedBatches; // every nodes.update batch the worker sent
```

`EditorBackend` is the seam that Step 4 swaps for the real
`EngineEditorBackend` (`engine.applyTransaction`); `FakeEditorBackend`
exists for tests and for any consumer that wants an in-memory stand-in.
`PluginManager.invoke()` builds a per-invoke facade dispatcher (with
`pluginId`, `manifest`, `editorBackend`, `notifySink`, and a
`requestTerminate` callback bound into a `FacadeContext`) and attaches it
to the host before `start()`. Permission gating (V2 spec §11.5) runs
before every backend call; the manifest's `permissions` array is already
sugar-expanded by the Step 2 loader, so the gate is just a set-check.
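A minimal sketch of that set-check, assuming a hypothetical method→permission map and permission names (neither is taken from the package; the real facade presumably covers every capability):

```ts
// Illustrative permission gate: the loader already expanded permission
// sugar, so the per-call check is plain set membership.
const REQUIRED: Record<string, string> = {
  // hypothetical method -> permission mapping for illustration
  "nodes.update": "document:write",
  "selection.set": "selection:write",
  "document.snapshot": "document:read",
};

function gate(permissions: ReadonlySet<string>, method: string): void {
  const needed = REQUIRED[method];
  // Methods missing from this sketch's map pass through unguarded;
  // a real gate would map every capability.
  if (needed !== undefined && !permissions.has(needed)) {
    throw new Error(`PERMISSION_DENIED: ${method} requires ${needed}`);
  }
}
```

Because the expansion happens once at install time, the hot invoke path pays only a `Set.has` lookup per call.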
`PluginRuntimeHost.terminate(reason?)` accepts the unified
`TerminationReason` (`user-abort` / `plugin-closed` / `heartbeat-timeout`
/ `watchdog-timeout` / `init-mismatch` / `internal`) and surfaces it via
`getTerminationReason()`. The reason is propagated into in-flight RPC
`ABORTED` rejects via `error.data.reason` so callers can distinguish a
worker-initiated `closePlugin` from a genuine crash.
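As a rough illustration of how a caller might consume that reason (the `isCrash` helper is hypothetical, not part of the package; the union members are the six reasons listed above):

```ts
// The unified reason union from the docs above.
type TerminationReason =
  | "user-abort"
  | "plugin-closed"
  | "heartbeat-timeout"
  | "watchdog-timeout"
  | "init-mismatch"
  | "internal";

// Hypothetical helper: a deliberate close or user abort is not a failure;
// everything else (timeouts, handshake mismatch, internal error) is.
function isCrash(reason: TerminationReason): boolean {
  return reason !== "plugin-closed" && reason !== "user-abort";
}
```

A caller catching an `ABORTED` reject could route `error.data.reason` through a check like this to decide between a quiet teardown and an error toast.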
Worker→host RPCs whose method names start with `__` are rejected with
`INVALID_PARAMS` (reverse-abuse guard, Step 2 pitfall #4).
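The guard reduces to a prefix check before dispatch; this sketch uses an illustrative error shape (`code` property), not the package's real error class:

```ts
// Illustrative reverse-abuse guard: internal `__`-prefixed host methods
// must never be reachable from worker-initiated RPCs.
function guardWorkerRpc(method: string): void {
  if (method.startsWith("__")) {
    throw Object.assign(
      new Error(`INVALID_PARAMS: reserved method name ${method}`),
      { code: "INVALID_PARAMS" }, // hypothetical error shape
    );
  }
}
```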
## Engine integration (Step 4, subpath `/engine`)
Real-engine wiring lives behind an optional subpath so the main package
stays `@lovision/engine`-free (ADR-018). Import the adapter from
`@lovision/plugin-host/engine`:
```ts
import { LovisionEngine } from "@lovision/engine/core";
import { PluginManager } from "@lovision/plugin-host";
import { EngineEditorBackend } from "@lovision/plugin-host/engine";

const engine = new LovisionEngine({ mock: true }); // or a real instance
const backend = new EngineEditorBackend(engine);
const manager = new PluginManager({ editorBackend: backend });
// ... install + invoke as usual
backend.dispose(); // unsubscribe on shutdown
```

- `applyNodeUpdates(updates, { expectedVersion? })` bundles every update into one `engine.applyTransaction(patch)` so a batch yields exactly one history entry (V2 spec §10.4 "merge into one history"). `expectedVersion` now uses Step 6 rebase-or-reject semantics:
  - `expectedVersion === currentVersion` applies immediately
  - stale versions can still apply if the intervening journal shows no touched-node overlap
  - conflicts reject with `PluginError(WRITE_CONFLICT, ..., { currentVersion, expectedVersion, conflictingNodes?, missing? })`
  - versions older than the retained journal window reject conservatively
- `getDocumentSnapshot()` returns the engine's flat `nodes[]` payload; the facade `documentSnapshotImpl` projects it into the nested frozen `SceneSnapshot` shape plugins see.
- `setSelection` / `clearSelection` pass `{ skipHistory: true }` so selection changes don't dirty the undo stack separately from node updates — the 1-history-entry-per-batch contract holds.
`peerDependencies.@lovision/engine` is optional — omit it if you only
use `FakeEditorBackend` (CLI host, test harness).
## Storage backend (Step 5)
`PluginManager` now accepts an optional `storageBackend`; if omitted it
uses `MemoryStorageBackend`, which persists plugin data across invocations
for the lifetime of the manager instance:
```ts
import {
  MemoryStorageBackend,
  PluginManager,
} from "@lovision/plugin-host";

const manager = new PluginManager({
  storageBackend: new MemoryStorageBackend(),
});

await manager.install(bundle);
await manager.invoke(pluginId, "run", {
  scope: "document",
  op: "set",
  key: "lint",
  value: { count: 3 },
});
await manager.uninstall(pluginId, {
  dataPolicy: "remove-plugin-data-only",
});
```

- `storage.local.*` is plugin-private and physically deleted on `remove-plugin-data-only`
- `storage.document.*` and `storage.node.*` are namespace-isolated by `pluginId`; `remove-everything` tombstones them so re-install no longer sees the values
- quotas are enforced in the backend and surface as `PluginError(QUOTA_EXCEEDED, data: { scope, limitKind, ... })`
Note: the engine package's `dist/core.d.ts` is currently unavailable (pre-existing strict type errors in the engine source). Plugin-host ships `src/backends/engine-shim.d.ts` as an ambient declaration of the minimum surface `EngineEditorBackend` consumes. When the engine's dts build is fixed, the shim can be deleted (tracked in ADR-018 §Verification).
## Dev sideload session (Step 8)
`DevPluginSession` owns browser-only dev workflow state on top of
`PluginManager`: loopback URL validation, manifest/main fetch, manual
reload, HEAD/ETag polling, and last-error reporting.
```ts
import { DevPluginSession, PluginManager } from "@lovision/plugin-host";

const manager = new PluginManager({
  locale: "en",
  sandboxRunnerUrl: "/__dev__/plugin-runner.html",
});
const session = new DevPluginSession({ manager, pollIntervalMs: 1000 });

await session.start("https://127.0.0.1:3941/manifest.json");
session.setAutoReload(true);

const snapshot = session.getSnapshot();
if (!snapshot.pluginId) {
  throw new Error("dev session did not install a plugin");
}

const result = await manager.invoke(snapshot.pluginId, "run");
await session.reload(); // manual reload
await session.stop();   // uninstall current dev plugin + stop polling
```

- only loopback HTTPS manifest URLs are accepted: `https://localhost:*`, `https://127.0.0.1:*`, `https://[::1]:*`
- plain HTTP loopback rejects with an explicit "switch to HTTPS loopback" error
- reload failures preserve the last known good plugin version in `PluginManager`; inspect `session.getSnapshot().lastError` for the latest failure text
## Step 1 scope (still relevant)
Sandbox iframe isolation, real engine integration, admin permission
overrides, and write-conflict resolution remain deferred to later steps.
The Step 1 transport / runtime layer is unchanged; Step 2 added the
`statusChange` observer + `transfer` arg, Step 3 added
`attachDispatchTable(dispatch)` and `terminate(reason)`, and Step 5 added
`StorageBackend` / `MemoryStorageBackend`.
