# @proj-airi/drizzle-orm-browser-migrator
🦆 Drizzle ORM migrator that applies migrations in the browser, for PGLite, SQLite, and DuckDB WASM!
> [!NOTE]
> This project is part of (and also associated with) Project AIRI, where we aim to build an LLM-driven VTuber like Neuro-sama (subscribe if you haven't!). If you are interested, please give the live demo a try.
## Installation

Pick the package manager of your choice:

```shell
ni @proj-airi/drizzle-orm-browser-migrator -D # from @antfu/ni, can be installed via `npm i -g @antfu/ni`
pnpm i @proj-airi/drizzle-orm-browser-migrator -D
yarn add @proj-airi/drizzle-orm-browser-migrator -D
npm i @proj-airi/drizzle-orm-browser-migrator -D
```

## Usage
```typescript
import { IdbFs, PGlite } from '@electric-sql/pglite'
import { migrate } from '@proj-airi/drizzle-orm-browser-migrator/pglite'
import { drizzle } from 'drizzle-orm/pglite'

import migrations from 'drizzle-migrations.sql'

// Persist the database in IndexedDB so migrations survive page reloads
const pgLite = new PGlite({ fs: new IdbFs('pglite-database') })
const db = drizzle({ client: pgLite })

await migrate(db, migrations)
```

## Other side projects born from Project AIRI
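The `drizzle-migrations.sql` module imported above is a bundled snapshot of your migration files; with Drizzle these are typically generated ahead of time by `drizzle-kit generate` from a config file. A minimal sketch, assuming a conventional project layout (the schema path and output directory are illustrative assumptions, not part of this package):

```typescript
// drizzle.config.ts — read by `drizzle-kit generate`
import { defineConfig } from 'drizzle-kit'

export default defineConfig({
  dialect: 'postgresql', // PGLite speaks Postgres-flavored SQL
  schema: './src/schema.ts', // assumed location of your Drizzle schema
  out: './drizzle', // generated .sql migration files land here
})
```

Run `npx drizzle-kit generate` after changing the schema, then rebuild your app so the fresh migrations are bundled for the browser.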
- Awesome AI VTuber: A curated list of AI VTubers and related projects
- unspeech: Universal endpoint proxy server for `/audio/transcriptions` and `/audio/speech`, like LiteLLM but for any ASR and TTS
- hfup: Tools to help with deploying and bundling to HuggingFace Spaces
- xsai-transformers: Experimental 🤗 Transformers.js provider for xsAI
- WebAI: Realtime Voice Chat: Full example of implementing ChatGPT's realtime voice from scratch with VAD + STT + LLM + TTS
- @proj-airi/drizzle-duckdb-wasm: Drizzle ORM driver for DuckDB WASM
- @proj-airi/duckdb-wasm: Easy-to-use wrapper for `@duckdb/duckdb-wasm`
- Airi Factorio: Allow Airi to play Factorio
- Factorio RCON API: RESTful API wrapper for the Factorio headless server console
- autorio: Factorio automation library
- tstl-plugin-reload-factorio-mod: Reload Factorio mods while developing
- Velin: Use Vue SFC and Markdown to write easy-to-manage stateful prompts for LLMs
- demodel: Easily boost the speed of pulling your models and datasets from various inference runtimes
- inventory: Centralized model catalog and default provider configuration backend service
- MCP Launcher: Easy-to-use MCP builder & launcher for all possible MCP servers, just like Ollama for models!
- 🥺 SAD: Documentation and notes for self-hosting and running LLMs in the browser
