# llm-agent

`@frankzye/llm-agent` v0.1.7: a Next.js chat UI with assistant-ui, the AI SDK, agents, and skills.
Personal workspace for LLM-related experiments: a Next.js chat app with assistant-ui, the Vercel AI SDK, per-agent storage, skills, optional Mem0, CLI tools, and A2A (agent-to-agent) messaging. The app lives at the repository root under `src/`.
The npm tarball is built separately (standalone bundle + bin only); the repo root `package.json` is `private: true` so you do not accidentally publish the full dev tree.
## License

This project is licensed under the MIT License.
## App overview

- Agents — Each chat thread maps to `.data/agents/<id>/` with `config.json`, `conversation.json`, optional `skills/`, Mem0 data, and a per-agent `task-board.json` (a todo list updated via the chat tools `read_task_board` / `update_task_board` in `/api/chat`).
- Skills — Global catalog under the data root (`skills/`, `skills.json`), plus per-agent skills; configurable in Settings and per-agent settings.
- Chat API — `POST /api/chat` runs the main agent pipeline (skills tools, Mem0, `cli_run` with approvals, `a2a_send`, compaction, etc.). Logic lives in `src/lib/chat/run-chat-post.ts`.
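The per-agent layout described above can be sketched on disk like this (the agent id here is a made-up placeholder; real ids are UUIDs, and the real files are created by the app, not by hand):

```shell
# Recreate the documented .data layout with empty stub files
# ("1111-2222" stands in for a real agent UUID).
mkdir -p .data/agents/1111-2222/skills
touch .data/global-settings.json .data/a2a-inbox.jsonl
touch .data/agents/1111-2222/config.json \
      .data/agents/1111-2222/conversation.json \
      .data/agents/1111-2222/task-board.json
find .data -type f | sort
```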
## Data directory

- By default, persisted files live under `<cwd>/.data/` (see `src/lib/data-root.ts`).
- Override with the env var `LLM_TASK_DATA_PATH` (absolute, or relative to `cwd`).
### Example
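A minimal override sketch; the directory name below is an invented placeholder, not a project default:

```shell
# Keep agent data outside the default <cwd>/.data
# ("/srv/llm-agent-data" is an example path, not required by the app).
export LLM_TASK_DATA_PATH=/srv/llm-agent-data
echo "$LLM_TASK_DATA_PATH"
```

With a relative value such as `./shared-data`, the path resolves against the working directory of the `pnpm dev` or `llm-agent` process.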

## Requirements
- Node.js 20+ (recommended)
- pnpm or npm
## Run locally

From the repo root:

```shell
pnpm install
pnpm dev
```

Open http://localhost:3000 (or the port shown in the terminal).
## Useful paths

| Path | Purpose |
|------|---------|
| `system_prompt.md` | Base system prompt merged into all chats |
| `.data/global-settings.json` | Model providers, default model, CLI allowlist, skills folder path, etc. |
| `.data/agents/<uuid>/` | Per-agent `config.json`, `conversation.json`, `task-board.json`, `skills/`, Mem0 |
| `.data/a2a-inbox.jsonl` | A2A message log (when used) |
| `src/app/api/chat/route.ts` | Thin wrapper; delegates to `runChatPost` |
| `src/app/settings/page.tsx` | Global settings UI (models, CLI allowlist, skills) |
| `src/lib/chat/run-chat-post.ts` | `read_task_board` / `update_task_board` tools (persist `task-board.json`) |
## HTTP API (selected)

| Method | Path | Purpose |
|--------|------|---------|
| POST | `/api/chat` | Streaming chat (AI SDK UI message stream) |
| GET / POST | `/api/agents` | List / create agents |
| GET / PATCH / DELETE | `/api/agents/[id]` | Read / update / delete an agent |
| GET / PUT | `/api/agents/[id]/messages` | Load / save conversation JSON |
| GET / PUT | `/api/agents/[id]/task-board` | Load / save `task-board.json` |
| GET / PATCH | `/api/settings` | Global settings |
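As a quick smoke test against a running dev server, the chat endpoint can be exercised with curl. The payload shape below is an assumption based on the AI SDK's message format; check `src/lib/chat/run-chat-post.ts` for the fields the server actually expects:

```shell
BASE=http://localhost:3000
# -N disables buffering so the streamed response prints as it arrives;
# if no dev server is listening, fall back to a short notice instead.
curl -NfsS -X POST "$BASE/api/chat" \
  -H 'Content-Type: application/json' \
  -d '{"messages":[{"role":"user","content":"hello"}]}' \
  2>/dev/null || echo "no dev server at $BASE"
```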
## Environment

Create `.env.local` as needed for your providers (OpenAI / Ollama / DeepSeek API keys and base URLs). Exact variables depend on Settings → General and the per-agent provider selection.
Optional Mem0-related variables are used when long-term memory is enabled (see `src/lib/agent/mem0-service.ts`).
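An illustrative `.env.local` sketch only: `OPENAI_API_KEY` is the usual OpenAI SDK convention, and the Ollama line is an assumption for this project; the authoritative variable names come from your selections in Settings → General.

```shell
# .env.local (sketch; variable names are assumptions, values elided)
OPENAI_API_KEY=sk-...
OLLAMA_BASE_URL=http://localhost:11434
```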
## Scripts

| Command | Description |
|---------|-------------|
| `pnpm dev` | Development server (Turbopack) |
| `pnpm build` | Production build |
| `pnpm start` | Start production server |
| `pnpm lint` | Next.js ESLint |
| `pnpm test` | Jest |
| `npm run prepare:npm` | After `npm run build`, writes `npm-publish/` (standalone + bin + minimal `package.json`) |
| `npm run publish:npm` | build → prepare:npm → `npm publish ./npm-publish --access public` |
From a dev clone you can also run `./bin/llm-agent.js` after `pnpm build`: it starts `.next/standalone/server.js` when static assets were copied (by the `postbuild` step), otherwise it falls back to `next start` if `next` is installed.
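That start-up decision, paraphrased as a shell sketch (the real logic lives in `bin/llm-agent.js`; this is an approximation, not the script itself):

```shell
# Approximation of bin/llm-agent.js's choice of server entrypoint:
# prefer the standalone build, otherwise delegate to `next start`.
if [ -f .next/standalone/server.js ]; then
  echo "would run: node .next/standalone/server.js"
else
  echo "would run: next start"
fi
```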
## Publish to npm (maintainers)

- Bump `version` in `package.json` (repo root).
- `npm run publish:npm` — builds the app, runs `scripts/prepare-npm-publish.mjs`, and publishes only the contents of `npm-publish/` (the Next standalone build under `.next/standalone/`, plus `bin/llm-agent.js` and a minimal `package.json` whose `dependencies` are only `next`, `react`, and `react-dom`, not the full app graph).
- Or tag to run CI: `./scripts/publish-chat-tag.sh` (uses the root `package.json` version → `chat-v*` tag → GitHub Actions publishes `./npm-publish`).
The published package is meant as a production server bundle, not as a library of React/source imports. The tarball does not include nested `node_modules` (npm never packs them); `npm-publish/package.json` (at the root of that folder) declares `next`, `react`, and `react-dom`, so `npm install -g` installs those at the package root and `require("next")` in the standalone server resolves correctly. Do not use `npm-publish/.next/standalone/package.json` as the install manifest: that file is copied by `next build` and is not what npm reads when you publish `./npm-publish`.
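For orientation, the generated manifest looks roughly like this; the field values are assumptions based on the description above (version pins elided), and the real file is written by `scripts/prepare-npm-publish.mjs`:

```shell
# Print an illustrative npm-publish/package.json (not the generated file).
cat <<'EOF'
{
  "name": "@frankzye/llm-agent",
  "bin": { "llm-agent": "bin/llm-agent.js" },
  "dependencies": { "next": "...", "react": "...", "react-dom": "..." }
}
EOF
```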
## Global install (llm-agent CLI)

After publishing:

```shell
npm install -g @frankzye/llm-agent
llm-agent
```

Optional: `PORT=8080 llm-agent`, `HOSTNAME=127.0.0.1`. The data directory follows `LLM_TASK_DATA_PATH` / `.data` in the process's current working directory (set cwd or env as needed).
## Contributing
This is a personal repo; fork or copy under the terms of the MIT license if you find it useful.
