⚡ Svelte On-Demand
Compile, render and hydrate Svelte components at runtime. No build step required.
Svelte On-Demand is a Node.js runtime engine that compiles and renders Svelte components on demand, directly in production.
It enables dynamic SSR without a pre-build pipeline, making it ideal for environments where Svelte code changes frequently after deployment.
🚀 Why Svelte On-Demand?
Modern Svelte setups assume that:
All code is known before deploy.
Svelte On-Demand assumes the opposite:
Code changes. The runtime adapts.
Instead of building everything upfront, components are compiled only when requested, cached by content hash, and reused until they change.
✨ Key Features
- ⚡ On-demand Svelte compilation (SSR + DOM)
- 🧠 Content-hash–based cache
- ♻️ Automatic cleanup of outdated builds
- 🔥 Real hot reload in production
- 🌐 SSR with automatic browser hydration
- 🧩 Simple Express integration
- 📦 No external bundler or build pipeline
💡 Ideal Use Cases
- CMS platforms
- Admin panels
- Multi-tenant apps
- White-label products
- Internal tools
- Rapid prototyping
- Custom SSR / template engines
If your Svelte code can change after deploy, this library is designed for you.
📦 Installation
npm install svelte-on-demand
Or install the dependencies manually (the Svelte compiler ships with the svelte package):
npm install svelte rollup rollup-plugin-svelte \
  @rollup/plugin-node-resolve @rollup/plugin-commonjs express
🗂 Expected Project Structure
project/
├─ components/
│ ├─ Home.svelte
│ ├─ About.svelte
├─ svelteEngine.js
├─ server.js
🚀 Basic Usage
server.js
const express = require('express');
const SvelteEngine = require('svelte-on-demand');
const app = express();
const engine = new SvelteEngine();
engine.mountSvelteCache(app);
app.get(
  '/view/:component',
  engine.renderView('components', {
    appName: 'Svelte On-Demand'
  })
);
app.listen(3000, () => {
  console.log('Server running at http://localhost:3000');
});
Now access:
http://localhost:3000/view/Home
🧠 How It Works
1. The .svelte file is read from disk
2. A content hash (MD5) is generated
3. If no cached build exists:
   - Compile SSR output
   - Compile DOM output (via Rollup)
4. SSR renders HTML and CSS
5. The DOM bundle is loaded in the browser and hydrated
All of this happens at runtime.
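To make the flow concrete, here is a minimal sketch of the hash-and-cache decision. The helpers compileSSR and compileDOM are hypothetical placeholders for illustration, not the library's internal API:
const fs = require('fs');
const path = require('path');
const crypto = require('crypto');
// Illustrative sketch: reuse a cached build if the content hash matches, otherwise compile.
async function resolveBuild(componentPath, cacheDir) {
  const source = fs.readFileSync(componentPath, 'utf8');
  const hash = crypto.createHash('md5').update(source).digest('hex');
  const name = path.basename(componentPath, '.svelte');
  const ssrPath = path.join(cacheDir, `${name}.SSR.${hash}.mjs`);
  const domPath = path.join(cacheDir, `${name}.DOM.${hash}.mjs`);
  if (!fs.existsSync(ssrPath) || !fs.existsSync(domPath)) {
    await compileSSR(source, ssrPath); // hypothetical helper
    await compileDOM(source, domPath); // hypothetical helper (Rollup under the hood)
  }
  return { hash, ssrPath, domPath };
}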
🗃 Cache Strategy
Compiled artifacts are stored in:
.svelte-cache/
Example:
Home.SSR.abc123.mjs
Home.DOM.abc123.mjs
Home.hash
- Cache is immutable per content hash
- Any source change generates a new build
- Old builds are automatically removed (see the sketch below)
- Cache is CDN-friendly
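A sketch of what that removal could look like (illustrative only, not the package's actual code):
const fs = require('fs');
const path = require('path');
// Drop cached builds for `name` that belong to an older content hash.
function pruneStaleBuilds(cacheDir, name, currentHash) {
  for (const file of fs.readdirSync(cacheDir)) {
    const isBuildForComponent =
      file.startsWith(`${name}.SSR.`) || file.startsWith(`${name}.DOM.`);
    if (isBuildForComponent && !file.includes(currentHash)) {
      fs.unlinkSync(path.join(cacheDir, file));
    }
  }
}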
🔥 Hot Reload in Production (Real One)
Svelte On-Demand invalidates the Node.js module cache before loading SSR output:
delete require.cache[modulePath];
This guarantees:
- No stale SSR code in memory
- No process restart
- No global rebuild
- Reload happens only when the component changes
This is not frontend HMR — it’s runtime module reloading.
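A minimal sketch of the idea (the package's actual internals may differ):
// Illustrative: reload a freshly compiled SSR module without restarting the process.
function loadSSRModule(modulePath) {
  // Drop the previously loaded version from Node's module cache.
  delete require.cache[require.resolve(modulePath)];
  // The next require() picks up the new build written by the compiler.
  return require(modulePath);
}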
🧩 API
new SvelteEngine(cacheDir?)
const engine = new SvelteEngine('.svelte-cache');
engine.renderView(componentsDir, globalData?)
Returns an Express middleware.
app.get(
  '/view/:component',
  engine.renderView('components', {
    user: 'John'
  })
);
Final component props:
{ ...globalData, ...req.query }
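For example, with the route above, a hypothetical request to /view/Home?theme=dark yields:
// globalData = { user: 'John' }, req.query = { theme: 'dark' }
// The component receives:
{ user: 'John', theme: 'dark' }
Query parameters override globalData on key collisions, since they are spread last.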
engine.mountSvelteCache(app)
Serves compiled DOM bundles:
engine.mountSvelteCache(app);
engine.listViews(dir)
Lists available components:
engine.listViews('components');
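For instance, a hypothetical index route could link to every available view (assuming listViews returns an array of component names):
// Illustrative: list every component found in ./components as a link.
app.get('/', (req, res) => {
  const views = engine.listViews('components'); // e.g. ['Home', 'About']
  const links = views
    .map((name) => `<li><a href="/view/${name}">${name}</a></li>`)
    .join('');
  res.send(`<ul>${links}</ul>`);
});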
🧪 Performance Notes
- First request triggers compilation
- Subsequent requests use cached SSR
- DOM bundles are static assets
- Works very well behind a CDN (Cloudflare, Fastly)
In real production usage, compilation cost happens once per version, not per request.
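If first-request latency matters, one option (not part of the package) is to warm the cache at startup by requesting each view once. This sketch assumes Node 18+ for the global fetch:
app.listen(3000, async () => {
  for (const name of engine.listViews('components')) {
    // The first hit compiles the component; later requests reuse the cached build.
    await fetch(`http://localhost:3000/view/${name}`).catch(() => {});
  }
});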
⚠️ When NOT to Use
- Static marketing sites
- Fully predictable apps
- Workloads where cold-start latency is critical
For these cases, traditional build-based SSR still wins.
🛣 Roadmap
- Native ESM support
- Concurrent build pipeline
- Advanced cache GC (LRU / TTL)
- Plugin system
- Watch mode
- Cloudflare Workers adapter
🧪 Status
🟡 Experimental, but functional and production-tested in controlled environments.
📜 License
MIT
Build is an optimization. Runtime is a choice. 🚀
