# @nano-ui-kit/gateway
LLM transport — one package, two concerns:

- Browser-safe streaming adapters (`./adapters`) — unified `chat()` / `streamChat()` / `createClient()` over `fetch` for Anthropic, OpenAI, and Gemini. Same chunk shape across providers (`{ type: 'text' | 'thinking' | 'done' | 'error', ... }`).
- Node proxy server (`./server`) — terminates browser `/api/chat` requests, attaches server-side API keys, and pipes the upstream SSE stream back verbatim. Also fronts the compose pipeline's `/api/generate` + `/api/generate/reset` endpoints. The proxy reuses the same adapter `buildRequest()` the browser does — one source of truth for URL, headers, and body.
Private package (not published). Consumed inside the monorepo by `@nano-ui-kit/web-components` (the nano-chat pattern uses the browser adapters) and `@nano-ui-kit/a2ui-compose` (the generator's llm-bridge wraps `createClient`).
## Adapter usage
```js
import { streamChat } from '@nano-ui-kit/gateway/adapters';

for await (const chunk of streamChat({
  provider: 'anthropic', // or auto-detect from model
  apiKey: 'sk-ant-...',
  model: 'claude-sonnet-4-20250514',
  messages: [{ role: 'user', content: 'Hello' }],
})) {
  if (chunk.type === 'text') process.stdout.write(chunk.text);
}
```

Chunk shape:
- `{ type: 'text', text: string, snapshot: string }` — partial text delta
- `{ type: 'thinking', text: string }` — reasoning delta (Anthropic)
- `{ type: 'done', text: string, usage: {...}, stopReason: string }`
- `{ type: 'error', error: Error }`

Provider auto-detects from `opts.model` when `opts.provider` is omitted — `claude*` → anthropic, `gpt*` / `o1*` → openai, `gemini*` → gemini.
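
To also surface reasoning deltas and the final usage stats, a fuller loop might look like this (a sketch built only from the chunk shapes listed above):

```js
import { streamChat } from '@nano-ui-kit/gateway/adapters';

let final = '';
for await (const chunk of streamChat({
  provider: 'anthropic',
  apiKey: 'sk-ant-...',
  model: 'claude-sonnet-4-20250514',
  messages: [{ role: 'user', content: 'Hello' }],
})) {
  switch (chunk.type) {
    case 'thinking':
      console.error(`[thinking] ${chunk.text}`); // reasoning delta (Anthropic)
      break;
    case 'text':
      final = chunk.snapshot; // snapshot = full text so far; chunk.text is just the delta
      break;
    case 'done':
      console.log(final);
      console.log('usage:', chunk.usage, 'stopReason:', chunk.stopReason);
      break;
    case 'error':
      throw chunk.error;
  }
}
```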
## Reusable client
```js
import { createClient } from '@nano-ui-kit/gateway/adapters';

const client = createClient({ provider: 'anthropic', apiKey: '...' });

const reply = await client.chat({ model: 'claude-sonnet-4-20250514', messages: [...] });

for await (const chunk of client.stream({ model: '...', messages: [...] })) { ... }
```
## Proxy server

Start the gateway so the browser can call LLMs without exposing API keys:

```sh
ANTHROPIC_API_KEY=sk-ant-... node packages/gateway/server.js
```

Or via the repo-level npm script:

```sh
npm run proxy
```

Default port 3456 (override with `PORT=`). `.env` at the repo root is auto-loaded (same file the MCP server reads). Startup logs show which provider keys are present.
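
For orientation, here is a minimal conceptual sketch of the key-injection flow. The real `server.js` is authoritative; the `buildRequest()` import path and call shape below are assumptions, since this README only states that the proxy and browser share it:

```js
// Conceptual sketch of the proxy's /api/chat handler, not the real server.js.
import http from 'node:http';
import { buildRequest } from '@nano-ui-kit/gateway/adapters'; // assumed export location

const readBody = (req) =>
  new Promise((resolve) => {
    let data = '';
    req.on('data', (c) => (data += c));
    req.on('end', () => resolve(data));
  });

http
  .createServer(async (req, res) => {
    if (req.method !== 'POST' || req.url !== '/api/chat') {
      res.writeHead(404);
      res.end();
      return;
    }
    const body = JSON.parse(await readBody(req));
    // Server-side key injection: the browser request carries no credentials.
    const apiKey = process.env.ANTHROPIC_API_KEY; // selected per body.provider in reality
    const { url, headers, payload } = buildRequest({ ...body, apiKey }); // assumed shape
    const upstream = await fetch(url, {
      method: 'POST',
      headers,
      body: JSON.stringify(payload),
    });
    res.writeHead(upstream.status, { 'content-type': 'text/event-stream' });
    // Pipe the upstream SSE stream back verbatim.
    for await (const chunk of upstream.body) res.write(chunk);
    res.end();
  })
  .listen(process.env.PORT || 3456);
```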
## Endpoints
| Method | Path | Purpose |
|--------|--------------------------|--------------------------------------------------------------|
| POST | /api/chat | Chat passthrough — { provider, model, messages, system?, … } |
| POST | /api/generate | Compose pipeline — { intent, engine?, mode?, sessionId?, … } |
| POST | /api/generate/reset | Clear a zettel session — { sessionId } |
`/api/chat` streams `text/event-stream` back. `/api/generate` returns a single JSON payload with the generated A2UI messages plus validation metadata.
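
For concreteness, calling these endpoints from the browser might look like the sketch below. The request fields mirror the table; since `/api/chat` pipes the provider's SSE back verbatim, the event framing is provider-specific:

```js
// Sketch: browser-side calls against a local gateway on the default port.
const chat = await fetch('http://localhost:3456/api/chat', {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({
    provider: 'anthropic',
    model: 'claude-sonnet-4-20250514',
    messages: [{ role: 'user', content: 'Hello' }],
  }),
});

// Read the raw SSE stream; framing matches the upstream provider.
const reader = chat.body.pipeThrough(new TextDecoderStream()).getReader();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  console.log(value); // e.g. "event: ...\ndata: {...}"
}

// /api/generate returns a single JSON payload instead of a stream.
const gen = await fetch('http://localhost:3456/api/generate', {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ intent: 'a login form' }),
});
const result = await gen.json(); // A2UI messages + validation metadata
```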
## @llm Vite alias
In Vite dev the alias `@llm` resolves to `packages/gateway/adapters/`. In Node / published packages use the scoped specifier (`@nano-ui-kit/gateway/adapters`) — Vite aliases don't exist at runtime.
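
A plausible wiring for that alias, assuming a standard `vite.config.js` at the repo root (the repo's actual config is authoritative):

```js
// vite.config.js — sketch; check the repo's real config.
import { defineConfig } from 'vite';
import { fileURLToPath } from 'node:url';

export default defineConfig({
  resolve: {
    alias: {
      // '@llm' points at the browser-safe adapters, dev-time only.
      '@llm': fileURLToPath(new URL('./packages/gateway/adapters', import.meta.url)),
    },
  },
});
```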
## License
MIT © Kim Granlund
