@despia/powersync v1.6.2
Despia PowerSync
Real native SQLite for Despia web-native apps, with optional cloud sync through PowerSync.
What This Package Does
@despia/powersync gives web code a typed API for a native on-device SQLite database.
It supports two layers:
| Layer | Works offline? | Requires PowerSync cloud setup? | APIs |
| ------------ | ----------------------- | ------------------------------- | --------------------------------------------------------------------------------------- |
| Local SQLite | Yes | No | init, query, get, execute, batch, transaction, watch, migrate, schema |
| Cloud sync | Yes, sync resumes later | Yes | powersync.connect, powersync.sync, powersync.status, powersync.events.status |
Local database operations do not require a cloud token. PowerSync cloud sync starts only when native has:
- a pending schema and target version from db.init({ schema, schemaVersion })
- successful migrations that promote that schema to active
- credentials from db.powersync.connect({ token })
Until native has an active schema and credentials, local SQLite can still work, but sync must stay off.
Installation
npm install @despia/powersync

import { active, db } from "@despia/powersync";

CDN:

import { db } from "https://cdn.jsdelivr.net/npm/@despia/powersync/+esm";

UMD global:

<script src="https://cdn.jsdelivr.net/npm/@despia/powersync/dist/umd/despia-powersync.min.js"></script>
<script>
  const { db } = window.powersync;
</script>

Requirements
This package must run inside a Despia app with the native PowerSync bridge installed.
import { active } from "@despia/powersync";
if (!active()) {
console.warn("Native PowerSync bridge is not available.");
}

active() only checks bridge presence. It does not mean SQLite is initialized or sync is connected.
Quick Start
import { db, type PowerSyncSchema } from "@despia/powersync";
const SCHEMA_VERSION = 1;
const SCHEMA: PowerSyncSchema = {
users: {
columns: {
id: "text",
email: "text",
createdAt: "text",
},
indexes: {
users_by_email: ["email"],
},
},
};
await db.init({ schema: SCHEMA, schemaVersion: SCHEMA_VERSION, databaseName: "mydb" });
await db.migrate(1, [
`CREATE TABLE IF NOT EXISTS users (
id TEXT PRIMARY KEY,
email TEXT NOT NULL,
createdAt TEXT NOT NULL
)`,
"CREATE INDEX IF NOT EXISTS users_by_email ON users(email)",
]);
await db.execute(
"INSERT INTO users(id, email, createdAt) VALUES(?, ?, ?)",
["u1", "[email protected]", new Date().toISOString()]
);
const users = await db.query<{ id: string; email: string }>(
"SELECT id, email FROM users ORDER BY email"
);

Add sync only when you have credentials:
await db.powersync.connect({ token: "jwt_from_your_auth_system" });

PowerSync Auth And Tokens
db.powersync.connect({ token }) gives the native PowerSync engine a token for the current signed-in app user. It does not log the user into your app, and it does not create the token.
There are three separate pieces to set up:
- Your app auth decides who the user is.
- PowerSync Client Auth verifies the token that native sends to PowerSync.
- PowerSync sync rules decide which rows that verified user can sync.
The token identifies the user. It does not contain all user data, and it does not automatically protect rows by itself. If a row is synced into local SQLite, your frontend can query it. Private data must be blocked by PowerSync/backend rules before it reaches the device.
Token Flow
Use this flow when you mint custom PowerSync JWTs:
- User signs in to your app.
- The frontend calls your backend token endpoint.
- Your backend verifies the app session.
- Your backend signs a short-lived PowerSync JWT for that user.
- The frontend calls db.powersync.connect({ token }).
- Native passes the token into the real PowerSync SDK.
- PowerSync verifies the token and applies your sync rules.
Your backend token endpoint does not call a PowerSync REST API just to create the token. It signs the JWT with the key or secret that you configured in PowerSync Client Auth. Later, the native PowerSync SDK connects to the PowerSync service with that token, and PowerSync verifies the signature, kid, audience, expiry, and subject.
PowerSync Dashboard Setup
In PowerSync, configure the auth method before expecting real sync to work:
- Open your PowerSync project or instance.
- Go to Client Auth.
- Configure the same JWT verification method your backend/provider uses.
- Configure sync rules or streams that use the authenticated user identity.
For custom JWT auth, PowerSync expects a signed JWT with fields like:
{
"sub": "user_123",
"aud": "https://your-powersync-instance-url",
"iat": 1710000000,
"exp": 1710000900,
"userId": "user_123"
}

The JWT header should include the configured key id:
{
"alg": "HS256",
"kid": "your-key-id"
}

Important fields:
| Field | Purpose |
| ------------- | -------------------------------------------------------------------------------- |
| sub | The application user ID. PowerSync rules use this as the user identity. |
| aud | Must match the PowerSync instance URL or configured audience. |
| iat / exp | Token issue and expiry times. Keep expiry short. |
| kid | Header key id. Must match the key configured in PowerSync Client Auth. |
| custom claims | Optional app data such as team, role, or project claims if your rules need them. |
For production, asymmetric JWT signing with JWKS is usually preferred because PowerSync can verify tokens with a public key while your backend keeps the private signing key. HS256 can work for development or simpler deployments if it exactly matches your PowerSync Client Auth configuration.
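Whatever signing scheme you use, the client can cheaply check a token's expiry before handing it to the native engine. This sketch assumes Node-style Buffer with base64url support (a browser build would need an atob-based decoder); decodeJwtPayload and isTokenUsable are illustrative helper names, not SDK exports — signature verification still happens on the PowerSync service.

```typescript
// Decode a JWT payload WITHOUT verifying the signature. Only useful for
// local checks such as expiry; PowerSync verifies the signature server-side.
type JwtPayload = { sub?: string; aud?: string; exp?: number; [k: string]: unknown };

function decodeJwtPayload(token: string): JwtPayload {
  const parts = token.split(".");
  if (parts.length !== 3) throw new Error("not a JWT");
  const json = Buffer.from(parts[1], "base64url").toString("utf8");
  return JSON.parse(json) as JwtPayload;
}

// Returns true if the token has an exp claim that is still in the future,
// with a small clock-skew allowance.
function isTokenUsable(token: string, skewSeconds = 30): boolean {
  const { exp } = decodeJwtPayload(token);
  if (typeof exp !== "number") return false;
  return exp - skewSeconds > Math.floor(Date.now() / 1000);
}
```

Fetch a fresh token from your backend when isTokenUsable() returns false, instead of connecting with a token that PowerSync will reject.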
Sync Rules
PowerSync verifies who the user is from the token. Your sync rules decide what that user can sync.
A simple rule is conceptually:
SELECT * FROM todos WHERE user_id = auth.user_id()

If the JWT has sub = "user_123", rules should only sync rows authorized for user_123. For team, organization, or project data, include the required claims in the token or resolve membership in your backend/PowerSync rules. Do not issue broad tokens that allow every user to sync every row unless that is truly intended.
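As a concrete illustration, a PowerSync sync-rules bucket scoped to the authenticated user might look like the fragment below. This is illustrative only — confirm the exact syntax against your PowerSync instance's sync-rules documentation.

```yaml
bucket_definitions:
  user_todos:
    # request.user_id() resolves to the verified user identity from the JWT.
    parameters: SELECT request.user_id() AS user_id
    data:
      - SELECT * FROM todos WHERE user_id = bucket.user_id
```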
Before shipping, test with two users and confirm user A never receives user B's private rows.
Client Code
import { db } from "@despia/powersync";
async function getPowerSyncToken() {
const response = await fetch("/api/powersync-token", {
method: "POST",
credentials: "include",
headers: {
"Content-Type": "application/json",
},
});
if (!response.ok) {
throw new Error("Could not get PowerSync token");
}
const data = (await response.json()) as {
token: string;
expiresAt?: string;
};
return data.token;
}
const token = await getPowerSyncToken();
await db.powersync.connect({ token });

Backend Token Endpoint
This example uses HS256 because it is compact. Match the algorithm, kid, secret/key, audience, and claims to your own PowerSync Client Auth settings.
import express from "express";
import { SignJWT } from "jose";
const app = express();
app.use(express.json());
app.post("/api/powersync-token", async (req, res) => {
const user = await getUserFromSession(req);
if (!user) {
return res.status(401).json({ error: "not_authenticated" });
}
const secret = Buffer.from(process.env.POWERSYNC_JWT_SECRET!, "base64url");
const expiresAt = Math.floor(Date.now() / 1000) + 15 * 60;
const token = await new SignJWT({
userId: user.id,
})
.setProtectedHeader({
alg: "HS256",
kid: process.env.POWERSYNC_JWT_KID!,
})
.setSubject(user.id)
.setIssuer(process.env.POWERSYNC_JWT_ISSUER!)
.setAudience(process.env.POWERSYNC_URL!)
.setIssuedAt()
.setExpirationTime(expiresAt)
.sign(secret);
return res.json({
token,
expiresAt: new Date(expiresAt * 1000).toISOString(),
});
});

Keep signing keys and secrets on the server. The frontend should receive only a short-lived token for the authenticated user.
Supabase And Firebase
If your app already uses Supabase Auth or Firebase Auth, you may not need a custom token endpoint. PowerSync can be configured to verify those provider JWTs directly when Client Auth, audience, and JWKS settings match your provider.
Native integration note: native does not usually implement row-level authorization itself. Native passes the token into the real PowerSync SDK and reports real sync status/errors. Row-level authorization belongs in your PowerSync/backend rules.
Startup Lifecycle
Use this order on every app start:
await db.init({
schema: CURRENT_SCHEMA,
schemaVersion: SCHEMA_VERSION,
databaseName: DATABASE_NAME,
});
const activeSchema = await db.schema().catch(() => null);
const appliedVersion = activeSchema?.appliedMigrationVersion ?? 0;
const pendingMigrations = MIGRATIONS.filter(
(migration) => migration.version > appliedVersion
);
if (pendingMigrations.length > 0) {
await db.migrate(
SCHEMA_VERSION,
pendingMigrations.flatMap((migration) => migration.statements)
);
}
if (token) {
await db.powersync.connect({ token });
}

Why this order matters:
- init() gives native the latest schema and target schema version as pending state.
- schema() returns the last active schema, if native already has one cached.
- migrate() applies all pending SQL in one transaction.
- Native promotes pending schema to active only after the migration reaches schemaVersion.
- db.powersync.connect() starts sync only after active schema and credentials exist.
Schema Model
The schema is a JSON-compatible object. It describes table and column shape for native PowerSync setup.
type PowerSyncColumnType = "text" | "integer" | "real";
type PowerSyncSchema = Record<
string,
{
columns: Record<string, PowerSyncColumnType>;
indexes?: Record<string, string[]>;
}
>;

Full schema example:
{
"users": {
"columns": {
"id": "text",
"email": "text",
"createdAt": "text"
},
"indexes": {
"users_by_email": ["email"]
}
},
"posts": {
"columns": {
"id": "text",
"userId": "text",
"title": "text",
"body": "text",
"createdAt": "text"
},
"indexes": {
"posts_by_user": ["userId"]
}
}
}

Valid schema rules:
- schema must be a non-empty object.
- schemaVersion must be a positive integer and should match the latest migration version for this schema.
- Table names must be non-empty strings.
- Each table must have a non-empty columns object.
- Column names must be non-empty strings.
- Column types must be exactly "text", "integer", or "real".
- indexes is optional. If present, indexes must be an object.
- Each index must map to a non-empty string array.
- Each indexed column must exist in that table's columns.
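These structural rules can be mirrored in a small client-side pre-check, for example to fail fast in tests. This is a simplified sketch of the same rules, not the SDK's actual validator; validateSchemaShape is an illustrative name.

```typescript
type TableDef = { columns: Record<string, string>; indexes?: Record<string, string[]> };

// Returns a list of human-readable violations; an empty array means the
// schema passes these structural rules.
function validateSchemaShape(schema: Record<string, TableDef>): string[] {
  const errors: string[] = [];
  const types = ["text", "integer", "real"];
  if (!schema || typeof schema !== "object" || Array.isArray(schema)) {
    return ["missing_or_invalid_schema"];
  }
  if (Object.keys(schema).length === 0) return ["empty_schema"];
  for (const [table, def] of Object.entries(schema)) {
    if (table.length === 0) errors.push("empty_table_name");
    const columns = def?.columns;
    if (!columns || typeof columns !== "object" || Object.keys(columns).length === 0) {
      errors.push(`${table}: empty_columns`);
      continue;
    }
    for (const [name, type] of Object.entries(columns)) {
      if (name.length === 0) errors.push(`${table}: empty_column_name`);
      if (!types.includes(type)) errors.push(`${table}.${name}: invalid_column_type`);
    }
    for (const [index, cols] of Object.entries(def.indexes ?? {})) {
      if (!Array.isArray(cols) || cols.length === 0) {
        errors.push(`${table}.${index}: invalid_index_columns`);
        continue;
      }
      for (const col of cols) {
        if (!(col in columns)) errors.push(`${table}.${index}: unknown_index_column ${col}`);
      }
    }
  }
  return errors;
}
```

Running this before db.init() in CI can catch typos earlier, but db.init() still performs the authoritative validation.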
Migrations
Schema describes the expected shape. Migration SQL changes the actual SQLite file. You usually need both.
Keep schema and migrations together:
import type { PowerSyncSchema } from "@despia/powersync";
export const DATABASE_NAME = "mydb";
export const SCHEMA_VERSION = 2;
export const CURRENT_SCHEMA: PowerSyncSchema = {
users: {
columns: {
id: "text",
email: "text",
createdAt: "text",
},
indexes: {
users_by_email: ["email"],
},
},
posts: {
columns: {
id: "text",
userId: "text",
title: "text",
body: "text",
createdAt: "text",
},
indexes: {
posts_by_user: ["userId"],
},
},
};
export const MIGRATIONS = [
{
version: 1,
statements: [
"CREATE TABLE IF NOT EXISTS users (id TEXT PRIMARY KEY, email TEXT NOT NULL, createdAt TEXT NOT NULL)",
"CREATE INDEX IF NOT EXISTS users_by_email ON users(email)",
],
},
{
version: 2,
statements: [
"CREATE TABLE IF NOT EXISTS posts (id TEXT PRIMARY KEY, userId TEXT NOT NULL, title TEXT NOT NULL, body TEXT, createdAt TEXT NOT NULL)",
"CREATE INDEX IF NOT EXISTS posts_by_user ON posts(userId)",
],
},
];

When schema changes:
- Update CURRENT_SCHEMA.
- Increase SCHEMA_VERSION.
- Add a migration with the new version.
- Run all pending migration statements with db.migrate(SCHEMA_VERSION, pendingStatements).
- Only then run queries or sync that depend on the new shape.
schemaVersion matters because native should not promote the pending schema too early. If your current schema is version 3 and the device still needs migrations 1, 2, and 3, pass all pending statements to one db.migrate(3, statements) call. That lets native commit or roll back the full upgrade as one transaction.
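Selecting and flattening the pending migrations can be factored into a small pure helper. This is a sketch; pendingStatements is an illustrative name, not an SDK export.

```typescript
type Migration = { version: number; statements: string[] };

// Collect every statement from migrations newer than the applied version,
// in ascending version order, so they can go into one db.migrate() call.
function pendingStatements(migrations: Migration[], appliedVersion: number): string[] {
  return migrations
    .filter((m) => m.version > appliedVersion)
    .sort((a, b) => a.version - b.version)
    .flatMap((m) => m.statements);
}
```

The full upgrade is then one transactional call: await db.migrate(SCHEMA_VERSION, pendingStatements(MIGRATIONS, appliedVersion)).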
Add A Column
await db.migrate(3, [
"ALTER TABLE users ADD COLUMN displayName TEXT",
]);

Add A Table
await db.migrate(4, [
"CREATE TABLE IF NOT EXISTS comments (id TEXT PRIMARY KEY, postId TEXT NOT NULL, body TEXT NOT NULL)",
"CREATE INDEX IF NOT EXISTS comments_by_post ON comments(postId)",
]);

Rename Or Reshape A Table
Use copy-and-swap:
await db.migrate(5, [
`CREATE TABLE users_new (
id TEXT PRIMARY KEY,
email TEXT NOT NULL,
displayName TEXT,
createdAt TEXT NOT NULL
)`,
`INSERT INTO users_new (id, email, displayName, createdAt)
SELECT id, email, NULL, createdAt FROM users`,
"DROP TABLE users",
"ALTER TABLE users_new RENAME TO users",
]);

Schema Validation Errors
db.init({ schema, schemaVersion }) validates schema and options before native is called. If validation fails, the SDK throws PowerSyncError.
type PowerSyncErrorDetail = {
path?: string;
reason?: string;
expected?: string | string[];
received?: string;
[key: string]: unknown;
};
type PowerSyncError = Error & {
code?: string;
details?: PowerSyncErrorDetail[];
nativeError?: string;
};

Intentionally bad schema example:
Do not copy this into your app. This snippet is only here to show what validation errors look like.
await db.init({
schema: {
users: {
columns: {
id: "text",
// Bad: "varchar" is not a supported schema type.
// Use "text", "integer", or "real".
age: "varchar" as never,
},
indexes: {
// Bad: this index points to "row", but there is no "row" column.
// Indexes must reference columns that exist in the same table.
users_by_row: ["row"],
},
},
},
schemaVersion: 1,
});

Thrown error shape:
{
"name": "PowerSyncError",
"code": "invalid_schema",
"message": "invalid_schema: 2 schema validation errors",
"details": [
{
"path": "schema.users.columns.age",
"reason": "invalid_column_type",
"expected": ["text", "integer", "real"],
"received": "varchar"
},
{
"path": "schema.users.indexes.users_by_row",
"reason": "unknown_index_column",
"expected": "existing column name",
"received": "row"
}
]
}

Handle errors by code and details:
try {
await db.migrate(3, MIGRATION_3);
} catch (error) {
const err = error as { code?: string; details?: Array<Record<string, unknown>> };
if (err.code === "invalid_schema") {
for (const detail of err.details ?? []) {
const expected = Array.isArray(detail.expected)
? detail.expected.join(", ")
: detail.expected;
console.error(`${detail.path}: expected ${expected}, received ${detail.received}`);
}
}
throw error;
}

Validation Reason Reference
The Example received column shows bad input values that trigger each error. Do not use those values in a real schema.
| Reason | Path example | Expected | Example received |
| ------------------------------ | ------------------------------- | -------------------------------------------- | ------------------------------ |
| missing_or_invalid_schema | schema | non-empty object | undefined, null, array |
| empty_schema | schema | object with at least one table | empty object |
| empty_table_name | schema | non-empty table name | empty string |
| invalid_table_definition | schema.users | object with columns map | string, array, null |
| invalid_columns | schema.users.columns | non-empty object | string, array, null |
| empty_columns | schema.users.columns | object with at least one column | empty object |
| empty_column_name | schema.users.columns | non-empty column name | empty string |
| invalid_column_type | schema.users.columns.age | ["text", "integer", "real"] | varchar, number, boolean |
| invalid_indexes | schema.users.indexes | object mapping index names to column arrays | string, array, null |
| empty_index_name | schema.users.indexes | non-empty index name | empty string |
| invalid_index_columns | schema.users.indexes.by_email | non-empty string array | empty array, string, null |
| invalid_index_column_name | schema.users.indexes.by_email | non-empty string | empty string, number |
| unknown_index_column | schema.users.indexes.by_row | existing column name | row |
| invalid_schema_version | options.schemaVersion | positive integer | 0, 1.5, string |
| invalid_database_name | options.databaseName | non-empty string | empty string, number |
| invalid_migration_version | version | positive integer | 0, 1.5, string |
| invalid_migration_statements | statements | non-empty string array or BatchStatement[] | empty array, null |
| invalid_migration_sql | statements.0 | non-empty SQL string | empty string, number |
| empty_migration_sql | statements.0.sql | non-empty SQL string | empty string |
| invalid_token | config.token | non-empty string | empty string, undefined |
| invalid_url | config.url | non-empty string | empty string, number |
Fallback Flow
If a schema upgrade fails, do not start sync with the new schema.
async function setupDatabase(token?: string) {
await db.init({
schema: CURRENT_SCHEMA,
schemaVersion: SCHEMA_VERSION,
databaseName: DATABASE_NAME,
});
try {
const activeSchema = await db.schema().catch(() => null);
const appliedVersion = activeSchema?.appliedMigrationVersion ?? 0;
const pendingMigrations = MIGRATIONS.filter(
(migration) => migration.version > appliedVersion
);
if (pendingMigrations.length > 0) {
await db.migrate(
SCHEMA_VERSION,
pendingMigrations.flatMap((migration) => migration.statements)
);
}
if (token) await db.powersync.connect({ token });
return { mode: "ready" as const };
} catch (error) {
console.error("Database setup failed:", error);
try {
const state = await db.schema();
return {
mode: "fallback" as const,
schema: state.schema,
databaseName: state.databaseName,
};
} catch {
return { mode: "blocked" as const };
}
}
}

Fallback mode is only safe if your current app can run against the active schema returned by db.schema().
Loading Synced Data
PowerSync sync writes remote changes into the same local SQLite database that your app queries. Your frontend does not usually receive a separate “sync payload” from PowerSync. Instead, the app reads JSON rows from local SQLite after native sync updates the database.
Use this model:
- Initialize schema and migrations.
- Connect PowerSync with a user-scoped token.
- Native starts or resumes sync.
- Remote changes are applied to local SQLite by native PowerSync.
- Your app reads data with query() / get().
- Your app listens for local result changes with watch().
- Your app listens for sync connection/progress changes with db.powersync.events.status().
Initial load:
type PostRow = {
id: string;
userId: string;
title: string;
body: string | null;
createdAt: string;
};
await db.powersync.connect({ token });
await db.powersync.sync();
const posts = await db.query<PostRow>(
"SELECT id, userId, title, body, createdAt FROM posts WHERE userId = ? ORDER BY createdAt DESC",
[currentUserId]
);
console.log(posts);

The returned rows are normal JSON-compatible objects:
[
{
"id": "post_1",
"userId": "user_1",
"title": "Hello",
"body": "First synced post",
"createdAt": "2026-04-29T08:30:00.000Z"
},
{
"id": "post_2",
"userId": "user_1",
"title": "Offline draft",
"body": null,
"createdAt": "2026-04-29T08:35:00.000Z"
}
]

Live UI updates:
const unwatch = db.watch<PostRow>(
"SELECT id, userId, title, body, createdAt FROM posts WHERE userId = ? ORDER BY createdAt DESC",
[currentUserId],
(rows) => {
// Re-render your UI with the latest local rows.
renderPosts(rows);
}
);
// Later, when leaving the screen:
unwatch();

watch() is the right API when you want to show updated data in the UI. It returns full result rows after local SQLite changes. Those changes can come from local writes, migrations, or remote PowerSync sync applying backend changes.
Sync status updates:
const unsubscribe = db.powersync.events.status((status) => {
if (status.uploading || status.downloading) {
showSyncIndicator();
}
if (status.connected && status.lastSynced) {
showLastSynced(status.lastSynced);
}
});
unsubscribe();

db.powersync.events.status() is for sync state, not row data. Use it to show “syncing”, “offline”, “last synced”, or error states. Use query() and watch() to read actual app data.
Runtime JSON Examples
These examples show SDK-level data shapes. They are not raw native bridge payloads.
Active Schema State
{
"schema": {
"users": {
"columns": {
"id": "text",
"email": "text",
"createdAt": "text"
},
"indexes": {
"users_by_email": ["email"]
}
}
},
"databaseName": "mydb",
"schemaHash": "2d711642b726b04401627ca9fbac32f5",
"schemaVersion": 2,
"appliedMigrationVersion": 2
}

Sync Status
{
"connected": true,
"lastSynced": "2026-04-29T08:30:00.000Z",
"uploading": false,
"downloading": false
}

Native Error Normalized By The SDK
{
"name": "PowerSyncError",
"code": "schema_required",
"message": "schema_required — Call db.init({ schema, schemaVersion }) at app startup before using sync features.",
"nativeError": "schema_required",
"details": [
{
"path": "schema",
"reason": "empty_schema",
"expected": "object with at least one table",
"received": "empty object"
}
]
}

Invalid Options Error
{
"name": "PowerSyncError",
"code": "invalid_options",
"message": "invalid_options: db.init() databaseName must be a non-empty string when provided",
"details": [
{
"path": "options.databaseName",
"reason": "invalid_database_name",
"expected": "non-empty string",
"received": ""
}
]
}

Migration Error
{
"name": "PowerSyncError",
"code": "migration_validation_failed",
"message": "migration_validation_failed — Migration failed. Fix the migration SQL and try db.migrate() again.",
"nativeError": "migration_validation_failed",
"details": [
{
"path": "schema.posts.columns.userId",
"reason": "missing_column_after_migration",
"expected": "column exists after migration",
"received": "missing"
}
]
}

API Examples
db.schema()
const state = await db.schema();
console.log(state.schema);
console.log(state.databaseName);
console.log(state.appliedMigrationVersion);

db.query()
const posts = await db.query<{ id: string; title: string }>(
"SELECT id, title FROM posts WHERE userId = ? ORDER BY createdAt DESC",
["user_1"]
);

db.get()
const post = await db.get<{ id: string; title: string }>(
"SELECT id, title FROM posts WHERE id = ?",
["post_1"]
);
if (!post) {
console.log("Post not found");
}

db.execute()
const result = await db.execute(
"INSERT INTO posts(id, userId, title, body, createdAt) VALUES(?, ?, ?, ?, ?)",
["post_1", "user_1", "Hello", "Body", new Date().toISOString()]
);
console.log(result.rowsAffected);

db.batch()
await db.batch([
{
sql: "INSERT INTO posts(id, userId, title, createdAt) VALUES(?, ?, ?, ?)",
params: ["post_2", "user_1", "Second", new Date().toISOString()],
},
{
sql: "INSERT INTO posts(id, userId, title, createdAt) VALUES(?, ?, ?, ?)",
params: ["post_3", "user_1", "Third", new Date().toISOString()],
},
]);

db.transaction()
await db.transaction(async (tx) => {
await tx.execute("UPDATE posts SET title = ? WHERE id = ?", ["New title", "post_1"]);
await tx.execute("UPDATE posts SET body = ? WHERE id = ?", ["New body", "post_1"]);
});

db.watch()
const unwatch = db.watch<{ id: string; title: string }>(
"SELECT id, title FROM posts WHERE userId = ? ORDER BY createdAt DESC",
["user_1"],
(rows) => {
console.log("Rows changed:", rows);
}
);
unwatch();

db.powersync.connect()
const token = await getPowerSyncTokenFromYourServer();
await db.powersync.connect({ token });

Native may read static PowerSync app ID / URL values from native config. Passing only a user-scoped token is valid when native owns the static config.
db.powersync.status()
const status = await db.powersync.status();
if (status.connected) {
console.log("Connected. Last synced:", status.lastSynced);
}

db.powersync.sync()
await db.powersync.sync();
const status = await db.powersync.status();
console.log(status);

Sync usually completes asynchronously. Treat db.powersync.sync() as a trigger/schedule request, then read status with db.powersync.status() or db.powersync.events.status().
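The trigger-then-poll pattern can be wrapped in a helper. This is a sketch: waitForSync and isSyncSettled are illustrative names, the status shape matches the SyncStatus examples in this document, and the status source is injected so app code can pass () => db.powersync.status().

```typescript
type SyncStatus = {
  connected: boolean;
  lastSynced: string | null;
  uploading: boolean;
  downloading: boolean;
};

// A sync pass is "settled" when we are connected, no transfer is in
// flight, and at least one sync has completed.
function isSyncSettled(status: SyncStatus): boolean {
  return (
    status.connected && !status.uploading && !status.downloading && status.lastSynced !== null
  );
}

// Poll a status source until sync settles or the deadline passes.
async function waitForSync(
  getStatus: () => Promise<SyncStatus>,
  { intervalMs = 500, timeoutMs = 30_000 } = {}
): Promise<SyncStatus> {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const status = await getStatus();
    if (isSyncSettled(status)) return status;
    if (Date.now() >= deadline) throw new Error("Timed out waiting for sync");
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

In app code: await db.powersync.sync(); then await waitForSync(() => db.powersync.status()). Prefer db.powersync.events.status() over polling when you only need UI updates.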
db.powersync.events.status()
const unsubscribe = db.powersync.events.status((status) => {
console.log("Sync status changed:", status);
});
unsubscribe();

API Reference
Exports
import {
active,
db,
Database,
onEvent,
type BatchResult,
type BatchStatement,
type ExecuteResult,
type PowerSyncConfig,
type PowerSyncColumnType,
type PowerSyncError,
type PowerSyncErrorDetail,
type PowerSyncErrorDetails,
type PowerSyncInitOptions,
type PowerSyncSchema,
type PowerSyncSchemaState,
type PowerSyncTableSchema,
type SyncStatus,
} from "@despia/powersync";

Methods
active(): boolean;
db.init(options: {
schema: PowerSyncSchema;
schemaVersion: number;
databaseName?: string;
}): Promise<Record<string, unknown>>;
db.schema(): Promise<PowerSyncSchemaState>;
db.query<T>(sql: string, params?: unknown[]): Promise<T[]>;
db.get<T>(sql: string, params?: unknown[]): Promise<T | null>;
db.execute(sql: string, params?: unknown[]): Promise<ExecuteResult>;
db.batch(statements: BatchStatement[]): Promise<BatchResult>;
db.transaction<T>(fn: (tx: Database) => Promise<T>): Promise<T>;
db.watch<T>(sql: string, callback: (rows: T[]) => void): () => void;
db.watch<T>(
sql: string,
params: unknown[],
callback: (rows: T[]) => void
): () => void;
db.migrate(
version: number, // usually SCHEMA_VERSION for the full pending upgrade
statements: string[] | BatchStatement[]
): Promise<Record<string, unknown>>;
db.powersync.connect(config: PowerSyncConfig): Promise<Record<string, unknown>>;
db.powersync.sync(): Promise<Record<string, unknown>>;
db.powersync.status(): Promise<SyncStatus>;
db.powersync.events.status(callback: (status: SyncStatus) => void): () => void;
onEvent<T = unknown>(event: string, callback: (payload: T) => void): () => void;

Error Codes
| Code | Source | Meaning | Typical fix |
| ----------------------------- | ------------- | ----------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------ |
| schema_required | SDK or native | Missing/empty schema, or no active schema exists for a schema-dependent native operation. | Call db.init({ schema, schemaVersion }), apply migrations, then retry. |
| invalid_schema | SDK or native | Schema shape is malformed. | Check error.details[] and fix the listed paths. |
| invalid_options | SDK or native | Method options are malformed. | Pass the documented options object and valid field types. |
| credentials_required | Native | Sync needs a token. | Call db.powersync.connect({ token }). |
| sync_not_configured | Native | Native PowerSync app ID / URL config is missing. | Fix native PowerSync config. |
| sync_not_initialized | Native | Native sync engine could not start or is not ready. | Ensure active schema, migrations, credentials, and native SDK setup are complete. |
| migration_validation_failed | Native | Migration SQL failed or expected schema shape was not reached. | Fix SQL or schema before retrying db.migrate(). |
| database_not_initialized | Native | Native SQLite database is not open. | Check native bridge/database setup. |
| request_timeout | SDK | Native did not respond to a bridge request within the timeout. | Ensure native calls window.nativeBridgeResponse(...) with the matching request id. |
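A small classifier over these codes can drive recovery logic. Which codes you retry automatically is an app-level judgment call; classifyPowerSyncError is an illustrative name, not an SDK export.

```typescript
// Codes that often succeed on retry without code changes (judgment call).
const RETRYABLE_CODES = new Set(["request_timeout", "sync_not_initialized"]);

// Codes that indicate a missing setup step rather than a transient fault.
const SETUP_CODES = new Set([
  "schema_required",
  "credentials_required",
  "sync_not_configured",
  "database_not_initialized",
]);

type Classified = "retry" | "setup" | "fix-code";

function classifyPowerSyncError(error: unknown): Classified {
  const code = (error as { code?: string } | null)?.code;
  if (code && RETRYABLE_CODES.has(code)) return "retry";
  if (code && SETUP_CODES.has(code)) return "setup";
  // invalid_schema, invalid_options, migration_validation_failed, etc.
  return "fix-code";
}
```

"retry" can back off and try again, "setup" should re-run the startup lifecycle (init, migrate, connect), and "fix-code" should surface the error.details[] to a developer.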
Notes For AI-Generated Code
Common mistakes to avoid:
- Do not call db.powersync.connect() before init() and migrations.
- Do not update schema without adding a migration.
- Do not forget to increase schemaVersion when the schema changes.
- Do not assume active() means the DB is initialized.
- Do not pass arbitrary column types like "string", "bool", "date", or "json". Use "text", "integer", or "real".
- Do not create indexes for columns that do not exist.
- Do not ignore db.migrate() errors.
- Do not start sync after failed schema validation or failed migration.
License
MIT
