
@frankzye/llm-agent

v0.1.7

Next.js chat UI with assistant-ui, AI SDK, agents, and skills

llm-agent

Personal workspace for LLM-related experiments: a Next.js chat app with assistant-ui, the Vercel AI SDK, per-agent storage, skills, optional Mem0, CLI tools, and A2A (agent-to-agent) messaging between agents. The app lives at the repository root under src/.

The npm tarball is built separately (standalone bundle + bin only); the repo root package.json is private: true so you do not accidentally publish the full dev tree.

License

This project is licensed under the MIT License.


App overview

  • Agents — Each chat thread maps to .data/agents/<id>/ with config.json, conversation.json, optional skills/, Mem0 data, and a per-agent task-board.json (todo list updated via chat tools read_task_board / update_task_board in /api/chat).
  • Skills — Global catalog under the data root (skills/, skills.json), plus per-agent skills; configurable in Settings and agent settings.
  • Chat API — POST /api/chat runs the main agent pipeline (skills tools, Mem0, cli_run with approvals, a2a_send, compaction, etc.). Logic lives in src/lib/chat/run-chat-post.ts.

Data directory

  • By default, persisted files use <cwd>/.data/ (see src/lib/data-root.ts).
  • Override with env LLM_TASK_DATA_PATH (absolute or relative to cwd).
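The resolution order above can be sketched as a few lines of shell. This is an illustration of the behavior described, not the actual implementation (which lives in src/lib/data-root.ts); `DATA_ROOT` is just an illustrative variable name:

```shell
# Sketch of data-root resolution: env override wins, relative paths
# resolve against the current working directory, default is <cwd>/.data
DATA_ROOT="${LLM_TASK_DATA_PATH:-$PWD/.data}"
case "$DATA_ROOT" in
  /*) : ;;                            # absolute path: use as-is
  *)  DATA_ROOT="$PWD/$DATA_ROOT" ;;  # relative path: resolve against cwd
esac
echo "$DATA_ROOT"
```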

Example

Chat app screenshot

Requirements

  • Node.js 20+ (recommended)
  • pnpm or npm

Run locally

From the repo root:

pnpm install
pnpm dev

Open http://localhost:3000 (or the port shown in the terminal).

Useful paths

| Path | Purpose |
|------|---------|
| system_prompt.md | Base system prompt merged for all chats |
| .data/global-settings.json | Model providers, default model, CLI allowlist, skills folder path, etc. |
| .data/agents/<uuid>/ | Per-agent config.json, conversation.json, task-board.json, skills/, Mem0 |
| .data/a2a-inbox.jsonl | A2A message log (when used) |
| src/app/api/chat/route.ts | Thin wrapper; delegates to runChatPost |
| src/app/settings/page.tsx | Global settings UI (models, CLI allowlist, skills) |
| src/lib/chat/run-chat-post.ts | read_task_board / update_task_board tools (persist task-board.json) |
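Since .data/a2a-inbox.jsonl is append-only JSONL, it can be inspected with ordinary line tools. The JSON fields below are illustrative only — the actual schema written by a2a_send is not documented here:

```shell
# Hypothetical peek at the A2A message log (one JSON object per line).
# Field names ("from", "to", "text") are made up for illustration.
INBOX=".data/a2a-inbox.jsonl"
mkdir -p "$(dirname "$INBOX")"
printf '%s\n' '{"from":"agent-a","to":"agent-b","text":"ping"}' >> "$INBOX"
tail -n 1 "$INBOX"   # most recent message
```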

HTTP API (selected)

| Method | Path | Purpose |
|--------|------|---------|
| POST | /api/chat | Streaming chat (AI SDK UI message stream) |
| GET / POST | /api/agents | List / create agents |
| GET / PATCH / DELETE | /api/agents/[id] | Read / update / delete agent |
| GET / PUT | /api/agents/[id]/messages | Load / save conversation JSON |
| GET / PUT | /api/agents/[id]/task-board | Load / save task-board.json |
| GET / PATCH | /api/settings | Global settings |

Environment

Create .env.local as needed for your providers (OpenAI / Ollama / DeepSeek API keys and base URLs). Exact variables depend on Settings → General and per-agent provider selection.

Optional Mem0-related variables are used when long-term memory is enabled (see src/lib/agent/mem0-service.ts).
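As a starting point, a .env.local might look like the sketch below. The variable names are assumptions based on common provider conventions (the AI SDK's OpenAI and DeepSeek providers read OPENAI_API_KEY and DEEPSEEK_API_KEY by default); the Ollama base-URL variable is purely illustrative — verify what this app actually reads in Settings → General:

```
# Illustrative only — confirm variable names against Settings → General
OPENAI_API_KEY=sk-...
DEEPSEEK_API_KEY=sk-...
# Hypothetical: base URL for a local Ollama instance
OLLAMA_BASE_URL=http://localhost:11434
```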

Scripts

| Command | Description |
|---------|-------------|
| pnpm dev | Development server (Turbopack) |
| pnpm build | Production build |
| pnpm start | Start production server |
| pnpm lint | Next.js ESLint |
| pnpm test | Jest |
| npm run prepare:npm | After npm run build, writes npm-publish/ (standalone + bin + minimal package.json) |
| npm run publish:npm | build → prepare:npm → npm publish ./npm-publish --access public |

From a dev clone you can also run ./bin/llm-agent.js after pnpm build: it starts .next/standalone/server.js when the static assets have been copied (by the postbuild step), and otherwise falls back to next start if next is installed.

Publish to npm (maintainers)

  1. Bump version in package.json (repo root).
  2. npm run publish:npm — builds the app, runs scripts/prepare-npm-publish.mjs, and publishes only the contents of npm-publish/ (Next standalone under .next/standalone/, plus bin/llm-agent.js, and a minimal package.json whose dependencies are only next, react, and react-dom — not the full app graph).
  3. Or tag to run CI: ./scripts/publish-chat-tag.sh (uses root package.json version → chat-v* tag → GitHub Actions publishes ./npm-publish).

The published package is meant as a production server bundle, not as a library of React/source imports. The tarball does not include nested node_modules (npm never packs them); npm-publish/package.json (at the root of that folder) declares next, react, and react-dom so npm install -g installs those at the package root and require("next") in the standalone server resolves correctly. Do not use npm-publish/.next/standalone/package.json as the install manifest — that file is copied by next build and is not what npm reads when you publish ./npm-publish.
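For reference, the manifest that prepare:npm writes into npm-publish/ would look roughly like the following. This is a sketch of the shape described above, not the generated file itself (that comes from scripts/prepare-npm-publish.mjs), and the version fields are deliberately elided:

```
{
  "name": "@frankzye/llm-agent",
  "version": "...",
  "bin": { "llm-agent": "bin/llm-agent.js" },
  "dependencies": {
    "next": "...",
    "react": "...",
    "react-dom": "..."
  }
}
```

Because these three dependencies sit at the package root, npm install -g places them where the standalone server's require("next") can resolve them.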

Global install (llm-agent CLI)

After publishing:

npm install -g @frankzye/llm-agent
llm-agent

Optional: set PORT (e.g. PORT=8080 llm-agent) to change the port and HOSTNAME (e.g. HOSTNAME=127.0.0.1) to bind a specific interface. The data directory follows LLM_TASK_DATA_PATH if set, otherwise .data in the working directory of the running process (set the cwd or env as needed).


Contributing

This is a personal repo; fork or copy under the terms of the MIT license if you find it useful.