<a href="https://github.com/haymant/tracohub"> <img alt="@tracoco/hub" src="app/(chat)/opengraph-image.png"> <h1 align="center">@tracoco/hub</h1> </a>
## Quick Start

Run Tracohub directly with npx:

```bash
export AUTH_SECRET="$(openssl rand -base64 32)"
export OPENAI_COMPATIBLE_BASE_URL="https://api.openai.com/v1"
export OPENAI_COMPATIBLE_API_KEY="your-key"
npx @tracoco/hub --port 3001
```

This starts Tracohub from a managed local app home, opens the browser automatically, and keeps local data under your user data directory. If you omit provider env vars, the app still boots and you can add providers later from Settings.
To scaffold an editable project instead:

```bash
npx @tracoco/hub my-chat-app
cd my-chat-app
pnpm dev
```

The scaffold copies `.env.example` to `.env.local` automatically. By default, Tracohub stores chat data in an embedded PGlite database under `./.data/pglite`. If you set `DATABASE_URL` or `POSTGRES_URL`, the app uses remote Postgres instead. Uploads go to Vercel Blob when `BLOB_READ_WRITE_TOKEN` is set; otherwise they are stored locally under `./.data/uploads`.
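The storage fallback described above can be sketched as a small shell check. This is illustrative only, not the app's actual code; the env var names (`DATABASE_URL`, `POSTGRES_URL`, `BLOB_READ_WRITE_TOKEN`) and paths come from the text:

```shell
#!/bin/sh
# Illustrative sketch of the storage fallback: remote Postgres when a
# connection URL is set, embedded PGlite otherwise.
DB_URL="${DATABASE_URL:-${POSTGRES_URL:-}}"
if [ -n "$DB_URL" ]; then
  echo "chat data: remote Postgres at $DB_URL"
else
  echo "chat data: embedded PGlite under ./.data/pglite"
fi

# Same pattern for uploads: Vercel Blob when a token is present.
if [ -n "${BLOB_READ_WRITE_TOKEN:-}" ]; then
  echo "uploads: Vercel Blob"
else
  echo "uploads: local ./.data/uploads"
fi
```

Either variable opts you into remote Postgres; leaving both unset keeps everything local.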
## Features
- Project-scoped chats with persistent history and resumable streams.
- Multiple provider backends, including OpenAI-compatible endpoints plus Pi, Claude, Copilot, and OpenClaw routing hooks.
- Provider selection through settings and explicit route mentions such as `@provider/agent`.
- Local-first persistence with embedded PGlite by default and remote Postgres when configured.
- Authenticated user accounts with encrypted provider and channel secrets.
- Early channel foundations for Telegram and WhatsApp, including settings, allowlists, link tokens, and conversation-mapping data models.
## Providers
Tracohub supports built-in and user-managed providers. The app can boot without a provider, but at least one must be configured before chats can actually run.
### OpenAI-compatible provider
Set `OPENAI_COMPATIBLE_BASE_URL` and `OPENAI_COMPATIBLE_API_KEY` in your `.env.local` file. The app fetches `${OPENAI_COMPATIBLE_BASE_URL}/models` to populate the model selector and sends chat completions to `${OPENAI_COMPATIBLE_BASE_URL}/chat/completions`.
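A quick sanity check of the composed endpoints, assuming the base URL already carries the `/v1` segment as in the Quick Start example:

```shell
#!/bin/sh
# Echo the endpoints Tracohub will call for a given base URL.
OPENAI_COMPATIBLE_BASE_URL="https://api.openai.com/v1"
echo "${OPENAI_COMPATIBLE_BASE_URL}/models"
echo "${OPENAI_COMPATIBLE_BASE_URL}/chat/completions"

# Manual check that the key works (requires a real OPENAI_COMPATIBLE_API_KEY):
# curl -s -H "Authorization: Bearer $OPENAI_COMPATIBLE_API_KEY" \
#   "$OPENAI_COMPATIBLE_BASE_URL/models"
```

If the model selector stays empty, hitting the `/models` endpoint directly is the fastest way to tell a bad key from a bad base URL.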
### Other providers
From Settings, you can add additional provider definitions for Claude, Copilot, OpenClaw, and other OpenAI-compatible backends. Secrets are encrypted at rest with AUTH_SECRET.
## Channels
Telegram and WhatsApp settings are available under Settings -> Channels. After saving channel credentials, start the runtime bridge with pnpm channels:worker so polling, QR pairing, and status updates can begin. The web app intentionally does not own those long-lived listeners: Next handles the UI, database, and AI execution lifecycle, while the worker owns transport polling loops, WhatsApp session state, and reconnect behavior.
### Telegram setup

- Create a bot with `@BotFather` and copy the bot token.
- Open Settings -> Channels and add a Telegram channel.
- Choose `polling` or `webhook` transport.
- Paste the bot token.
- Optionally add a webhook secret when using webhook transport.
- Set a default project if you want new linked chats to land in a specific project.
- Add allowed Telegram user IDs if you want DM access restricted from the start.
Current status:
- Configuration is persisted and encrypted.
- Link tokens, external identities, and mapped conversations are modeled in the database.
- `pnpm channels:worker` starts Telegram polling for channels configured with `polling` transport.
- In local embedded mode, the worker talks to the running web app over internal runtime APIs instead of opening the embedded PGlite store directly.
- Telegram webhook transport is not implemented yet in the worker.
- The worker supports `/help` and `/link <CODE>` and can send real assistant replies for linked conversations.
### WhatsApp setup
- Open Settings -> Channels and add a WhatsApp channel.
- Set a default project if desired.
- Add allowed phone numbers if you want DM access restricted from the start.
- Save the channel configuration.
Current status:
- Configuration, allowlists, and conversation-linking metadata are persisted.
- `pnpm channels:worker` starts a long-lived Baileys worker and publishes QR state back into Settings -> Channels.
- The worker supports `/help` and `/link <CODE>` and can send real assistant replies for linked conversations.
### Current channel limitations
- The channel worker must be running for Telegram activity or WhatsApp QR/session state to appear.
- `pnpm dev` starts the web app only. Keeping the worker separate avoids coupling long-lived polling and QR/session state to Next's hot-reload lifecycle.
- `pnpm dev:all` starts both the web app and the worker for local development.
- `pnpm channels:stop` sends `SIGINT` to local Tracohub channel worker processes. If the worker was started under `pnpm dev:all`, that stack will stop as well.
- `npx @tracoco/hub --worker` starts only the worker process and expects a reachable Tracohub web app.
- Telegram polling allows only one active `getUpdates` consumer per bot token. If you see a conflict about another `getUpdates` request, stop the duplicate worker before restarting Tracohub.
- Telegram webhook transport is still unsupported; use `polling` for now.
- Linked channel conversations use the Tracohub web runtime for assistant replies, but the worker still focuses on transport, linking, and session state rather than full browser-style streaming UX.
### Getting a /link code

- Open Settings -> Channels.
- Make sure the channel has a default project selected.
- Open the saved channel card and click `Generate /link code`.
- Send `/link CODE` from the Telegram DM before the code expires.
The current UI issues link codes against the channel's default project and creates a new linked chat there when redeemed.
## Running locally
You can run Tracohub locally without provisioning external services. The app creates an embedded PGlite database automatically unless you opt into remote Postgres with DATABASE_URL or POSTGRES_URL.
Note: Do not commit your `.env` file; it exposes secrets that grant access to your AI and authentication provider accounts.

- Copy `.env.example` to `.env.local` if it does not already exist.
- Set `AUTH_SECRET`.
- Optionally set `OPENAI_COMPATIBLE_BASE_URL` and `OPENAI_COMPATIBLE_API_KEY` for a built-in provider.
- Optionally configure additional providers from Settings after the app starts.
- Optionally set `DATABASE_URL` to use remote Postgres instead of embedded PGlite.
- Optionally set `BLOB_READ_WRITE_TOKEN` for Vercel Blob uploads or `REDIS_URL` for Redis-backed rate limiting.
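Generating `AUTH_SECRET` works the same way as in the Quick Start. A small sketch, assuming `openssl` is installed (32 random bytes base64-encode to a 44-character string):

```shell
#!/bin/sh
# Print an AUTH_SECRET line you can paste into .env.local.
SECRET="$(openssl rand -base64 32)"
printf 'AUTH_SECRET=%s\n' "$SECRET"

# Sanity check: 32 bytes -> 44 base64 characters.
echo "${#SECRET}"   # prints 44
```

Any sufficiently random 32-byte value works; the app uses it to encrypt provider and channel secrets at rest, so changing it later invalidates stored secrets.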
```bash
pnpm install
pnpm db:migrate
pnpm dev
```

Open http://localhost:3000, sign in, then configure providers and channels from Settings. In development, 127.0.0.1 is redirected to localhost so browser session state stays on one origin.
For a single local development command that starts both the web app and the channel worker:

```bash
pnpm dev:all
```

If you want to run only the worker against an already running Tracohub app:

```bash
TRACOHUB_APP_URL=http://localhost:3000 pnpm channels:worker
```

The published CLI exposes the same worker-only mode:

```bash
npx @tracoco/hub --worker --app-url http://localhost:3000
```

When you run the published worker from an existing project directory, it loads the current directory's `.env.local` before starting the managed worker copy, so the runtime worker secret can match the app you already configured.
## Publish via npm

This repo can be published directly as the `@tracoco/hub` package. End users run `npx @tracoco/hub`, and the CLI starts a managed local app directly from the published package.

Use these maintainer commands:

```bash
pnpm install
pnpm package:dry-run
npm publish --access public
```

To validate the packed artifact before publishing:

```bash
npm pack
npx --yes ./tracohub-<version>.tgz --port 3001
npx --yes ./tracohub-<version>.tgz demo-app
```