datasette-ts
TypeScript-first Datasette-style explorer for SQLite, with local serve and Cloudflare Workers deploy.
Install
npm install -g datasette-ts
# or
npx datasette-ts --help
Quickstart (local)
datasette-ts serve ./my.db --port 8001
Open http://127.0.0.1:8001.
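If you want to script against the server, a quick sketch, assuming the explorer follows Datasette's .json URL conventions (the paths below are illustrative):
# Homepage
curl "http://127.0.0.1:8001/"
# Datasette-style JSON endpoint (illustrative path; assumes .json routes exist)
curl "http://127.0.0.1:8001/my/mytable.json"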
Inspect a database
datasette-ts inspect ./my.db --inspect-file inspect-data.json
Deploy to Cloudflare
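The flag writes plain JSON, so standard tooling works for a quick look (jq here is just one option):
# Pretty-print the exported inspect data
jq . inspect-data.json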
First, set up Alchemy (see the Alchemy docs):
npx alchemy configure
npx alchemy login
Then deploy:
# Deploy with name derived from filename
datasette-ts deploy cloudflare ./my.db
# Deploy with explicit name
datasette-ts deploy cloudflare ./my.db --name my-app
# Deploy with a specific Alchemy profile
datasette-ts deploy cloudflare ./my.db --name my-app --profile prod
# Deploy with metadata (robots settings, cache settings, etc.)
datasette-ts deploy cloudflare ./my.db --metadata datasette.yml
This creates a Cloudflare Worker and D1 database with your data.
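A sketch of a deploy driven by a metadata file, with contents assembled from the settings and plugin options documented below (combining them all in one file is an assumption):
# Write an example metadata file (schema assumed from the sections below)
cat > datasette.yml <<'EOF'
settings:
  default_cache_ttl: 60
  hash_urls: true
plugins:
  datasette-block-robots:
    allow_only_index: true
EOF
datasette-ts deploy cloudflare ./my.db --name my-app --metadata datasette.yml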
CLI help
datasette-ts --help
datasette-ts serve --help
datasette-ts inspect --help
datasette-ts deploy cloudflare --help
HTTP caching
Responses default to Cache-Control: max-age=5. You can override this
default with a setting, or per-request using the _ttl query string
parameter:
# Set the default max-age for all responses (in seconds)
datasette-ts serve ./my.db --setting default_cache_ttl 60
# Override cache behavior for a single request
curl "http://127.0.0.1:8001/?_ttl=0"
curl "http://127.0.0.1:8001/?_ttl=120"You can also set default_cache_ttl via metadata:
settings:
  default_cache_ttl: 60
A default_cache_ttl of 0 disables caching by sending Cache-Control: no-store. _ttl=0 disables caching for that request; _ttl=<seconds> sets max-age.
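To confirm the behavior, dump the response headers (using GET rather than HEAD, since HEAD handling isn't documented):
# Expect Cache-Control: no-store
curl -s -D - -o /dev/null "http://127.0.0.1:8001/?_ttl=0" | grep -i cache-control
# Expect Cache-Control: max-age=120
curl -s -D - -o /dev/null "http://127.0.0.1:8001/?_ttl=120" | grep -i cache-control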
Hashed URLs
Hashed URLs include a content hash in the database path (e.g. /mydb-abc1234/table),
enabling long-lived immutable caching. When enabled, responses include
Cache-Control: public, max-age=31536000, immutable.
Enable via metadata:
settings:
  hash_urls: true
Or per-request with ?_hash=1. Cloudflare Workers enable hashed URLs by default
(set DATASETTE_HASH_URLS=0 to disable).
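A quick local check of the per-request switch (a sketch; the expected header value is taken from the description above):
# Expect Cache-Control: public, max-age=31536000, immutable
curl -s -D - -o /dev/null "http://127.0.0.1:8001/?_hash=1" | grep -i cache-control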
Robots.txt
Serve a robots.txt that blocks crawlers by default. Configure it via metadata:
{
  "plugins": {
    "datasette-block-robots": {
      "allow_only_index": true
    }
  }
}
Options:
- allow_only_index: allow indexing the homepage while blocking each database path.
- disallow: custom list of paths to disallow, e.g. ["/mydb/mytable"].
- literal: full robots.txt contents (overrides all other settings).
The /robots.txt endpoint serves the generated file with a text/plain content type.
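To check the generated file and its content type after configuring:
# Fetch robots.txt along with its response headers
curl -s -D - "http://127.0.0.1:8001/robots.txt"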
Build
npm run build
Build outputs:
- dist/cli.js (Node CLI entrypoint)
- dist/worker.js (Cloudflare Worker entrypoint)
The deploy command prefers dist/worker.js when present, so the published package ships a ready-to-bundle worker.
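A quick smoke test of a fresh build, running the Node entrypoint directly:
npm run build
# The built CLI should respond to --help
node dist/cli.js --help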
Publish
npm version <patch|minor|major>
npm run build
npm pack
npm publish
Status
- Works: local serve, inspect data export, Cloudflare Workers deploy via Alchemy.
- Not yet: config files, plugins, write APIs.
