hfup
v1.0.3
🧑‍🚀 hfup
A collection of tools to help you deploy, bundle HuggingFace Spaces and related assets with ease.
Where
`hfup` stands for "HuggingFace up". The word *up* was inspired by `rollup` and `tsup`, so you may think of it as "making your HuggingFace work up and running".
Installation
Pick the package manager of your choice:
ni hfup -D # from @antfu/ni, can be installed via `npm i -g @antfu/ni`
pnpm i hfup -D
yarn add hfup -D
npm i hfup -D
CLI
hfup can generate HuggingFace artifacts directly, without bundler hooks.
Quick start:
pnpm add -D hfup
Create hfup.config.json in your project root:
npx hfup generate --root . --outDir ./dist
This command writes:
- .gitattributes (LFS patterns)
- README.md (Space or Model card front-matter + README content)
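For reference, the generated .gitattributes is made of Git LFS attribute lines in the same format as the extraAttributes example below. This is an illustrative sketch only — the patterns here are examples, not hfup's exact default output:

```
# Illustrative only: track large binary formats with Git LFS
*.bin filter=lfs diff=lfs merge=lfs -text
*.gguf filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
```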
You can also use a config file (hfup.config.json, hfup.config.mjs, hfup.config.js, hfup.config.cjs) in your project root or pass one explicitly via --config.
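Besides JSON, the hfup.config.mjs / .js / .cjs variants can carry the same options. As a sketch — assuming hfup reads the file's default export, with field names mirroring the JSON config — an ESM config might look like:

```javascript
// hfup.config.mjs — hypothetical ESM config; same shape as the JSON example
export default {
  lfs: {
    withDefault: true,
    extraGlobs: ['*.gguf'],
  },
  spaceCard: {
    title: 'My Space',
    license: 'mit',
    emoji: '🚀',
  },
}
```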
{
"$schema": "https://unpkg.com/hfup@latest/dist/json-schema.json",
"lfs": {
"withDefault": true,
"extraGlobs": ["*.gguf"],
"extraAttributes": ["data/**/*.bin filter=lfs diff=lfs merge=lfs -text"]
},
"spaceCard": {
"title": "My Space",
"license": "mit",
"emoji": "🚀",
"models": ["openai-community/gpt2"]
},
"modelCard": {
"language": ["en"],
"library_name": "transformers",
"pipeline_tag": "text-generation",
"base_model": "meta-llama/Llama-3.2-1B",
"license": "mit",
"tags": ["llm", "instruction-tuned"]
}
}
To pin the schema to a specific version:
{
"$schema": "https://unpkg.com/hfup@1.0.3/dist/json-schema.json"
}
For projects that cannot use $schema inline, add a VS Code mapping in .vscode/settings.json:
{
"json.schemas": [
{
"fileMatch": ["hfup.config.json"],
"url": "https://unpkg.com/hfup@latest/dist/json-schema.json"
}
]
}
Optional flags:
- --with-lfs: generate only .gitattributes
- --with-space-card: generate only README.md
- --with-model-card: generate only README.md
--with-space-card and --with-model-card are mutually exclusive because both write README.md.
Generate model card only:
npx hfup generate --root . --outDir . --with-model-card
Generate space card only:
npx hfup generate --root . --outDir . --with-space-card
Generate only LFS:
npx hfup generate --root . --outDir . --with-lfs
JSON Schema in Editors
Use the published JSON schema so non-TypeScript projects (Python/C/etc.) still get IntelliSense.
Inline schema in hfup.config.json:
{
"$schema": "https://unpkg.com/hfup@latest/dist/json-schema.json"
}
Or pin a version:
{
"$schema": "https://unpkg.com/hfup@1.0.3/dist/json-schema.json"
}
If you cannot use $schema inline, configure VS Code:
{
"json.schemas": [
{
"fileMatch": ["hfup.config.json"],
"url": "https://unpkg.com/hfup@latest/dist/json-schema.json"
}
]
}
Model Card Plugin
You can generate a Hugging Face model card from bundler hooks too:
// vite.config.ts
import { defineConfig } from 'vite'
import { LFS, ModelCard } from 'hfup/vite'
export default defineConfig({
plugins: [
LFS(),
ModelCard({
language: ['en'],
library_name: 'transformers',
pipeline_tag: 'text-generation',
base_model: 'meta-llama/Llama-3.2-1B',
license: 'mit',
}),
],
})
Space Card Plugin
// vite.config.ts
import { defineConfig } from 'vite'
import { LFS, SpaceCard } from 'hfup/vite'
export default defineConfig({
plugins: [
LFS(),
SpaceCard({
title: 'Real-time Whisper WebGPU (Vue)',
emoji: '🎤',
colorFrom: 'gray',
colorTo: 'green',
sdk: 'static',
pinned: false,
license: 'mit',
models: ['onnx-community/whisper-base'],
short_description: 'Yet another Real-time Whisper with WebGPU, written in Vue',
thumbnail: 'https://raw.githubusercontent.com/moeru-ai/airi/refs/heads/main/packages/whisper-webgpu/public/banner.png'
})
]
})
// rollup.config.js
import { LFS, SpaceCard } from 'hfup/rollup';
export default {
plugins: [
LFS(),
SpaceCard({
title: 'Real-time Whisper WebGPU (Vue)',
emoji: '🎤',
colorFrom: 'gray',
colorTo: 'green',
sdk: 'static',
pinned: false,
license: 'mit',
models: ['onnx-community/whisper-base'],
short_description: 'Yet another Real-time Whisper with WebGPU, written in Vue',
thumbnail: 'https://raw.githubusercontent.com/moeru-ai/airi/refs/heads/main/packages/whisper-webgpu/public/banner.png'
}),
],
};
// webpack.config.js
const { LFS, SpaceCard } = require("hfup/webpack");
module.exports = {
/* ... */
plugins: [
LFS(),
SpaceCard({
title: 'Real-time Whisper WebGPU (Vue)',
emoji: '🎤',
colorFrom: 'gray',
colorTo: 'green',
sdk: 'static',
pinned: false,
license: 'mit',
models: ['onnx-community/whisper-base'],
short_description: 'Yet another Real-time Whisper with WebGPU, written in Vue',
thumbnail: 'https://raw.githubusercontent.com/moeru-ai/airi/refs/heads/main/packages/whisper-webgpu/public/banner.png'
}),
],
};
// esbuild.config.js
import { build } from "esbuild";
import { LFS, SpaceCard } from "hfup/esbuild";
build({
/* ... */
plugins: [
LFS(),
SpaceCard({
title: 'Real-time Whisper WebGPU (Vue)',
emoji: '🎤',
colorFrom: 'gray',
colorTo: 'green',
sdk: 'static',
pinned: false,
license: 'mit',
models: ['onnx-community/whisper-base'],
short_description: 'Yet another Real-time Whisper with WebGPU, written in Vue',
thumbnail: 'https://raw.githubusercontent.com/moeru-ai/airi/refs/heads/main/packages/whisper-webgpu/public/banner.png'
}),
],
});
// rspack.config.mjs
import { LFS, SpaceCard } from "hfup/rspack";
/** @type {import('@rspack/core').Configuration} */
export default {
plugins: [
LFS(),
SpaceCard({
title: 'Real-time Whisper WebGPU (Vue)',
emoji: '🎤',
colorFrom: 'gray',
colorTo: 'green',
sdk: 'static',
pinned: false,
license: 'mit',
models: ['onnx-community/whisper-base'],
short_description: 'Yet another Real-time Whisper with WebGPU, written in Vue',
thumbnail: 'https://raw.githubusercontent.com/moeru-ai/airi/refs/heads/main/packages/whisper-webgpu/public/banner.png'
})
],
};
// rolldown.config.js
import { defineConfig } from "rolldown";
import { LFS, SpaceCard } from "hfup/rolldown";
export default defineConfig({
plugins: [
LFS(),
SpaceCard({
title: 'Real-time Whisper WebGPU (Vue)',
emoji: '🎤',
colorFrom: 'gray',
colorTo: 'green',
sdk: 'static',
pinned: false,
license: 'mit',
models: ['onnx-community/whisper-base'],
short_description: 'Yet another Real-time Whisper with WebGPU, written in Vue',
thumbnail: 'https://raw.githubusercontent.com/moeru-ai/airi/refs/heads/main/packages/whisper-webgpu/public/banner.png'
}),
],
});
Features
- Still manually writing HuggingFace Spaces configurations?
- Having trouble quickly handling and editing the .gitattributes file for Git LFS?
- Don't want any of the HuggingFace Spaces front-matter to appear in README.md?
- Fighting against annoying errors when deploying your HuggingFace Spaces?

hfup is here to help you!

- 🚀 Automatically...
  - generate the .gitattributes file for Git LFS.
  - generate HuggingFace Spaces front-matter in README.md.
  - search for your README.md file and merge the front-matter header.
  - generate a dedicated README.md file right inside the outDir of the build.
- 🔐 IntelliSense ready, type safe for Spaces configurations.
- 📦 Out-of-the-box support for Vite.
What will happen
After bundling, a dedicated README.md with the merged front-matter header will be generated in the root of your project:
---
title: Real-time Whisper WebGPU (Vue)
emoji: 🎤
colorFrom: gray
colorTo: green
sdk: static
pinned: false
license: mit
models:
- onnx-community/whisper-base
short_description: Yet another Real-time Whisper with WebGPU, written in Vue
thumbnail: https://raw.githubusercontent.com/moeru-ai/airi/refs/heads/main/packages/whisper-webgpu/public/banner.png
---
# Real-time Whisper WebGPU (Vue)