@gzmagyari/kanbanboard
v1.0.7
AI-powered Kanban dashboard with LLM integration, Claude Code agents, and MCP server support
# Jarvis Kanban Dashboard
A lightweight kanban board for managing tasks for Jarvis/Clawdbot — now with Projects + basic task hierarchy (sub-tickets).
- Frontend: Vue 2 + Vuetify 2 (CDN, no build step)
- Backend: Node.js (Express) + SQLite
- Drag & drop: SortableJS via vuedraggable
## Columns
- Ideas
- To do
- In Progress
- Review
- Testing
- Done
Jarvis should only pick up tasks in To do and In Progress.
## Run

```bash
cd /home/clawdbot/clawd/kanban-dashboard
npm install
HOST=0.0.0.0 PORT=3000 npm start
```

Open: http://<server-ip>:3000/
## API (summary)

- `GET /api/health`
- `GET /api/columns`
- `GET /api/ai/health` (shows whether LLM is configured/enabled)
- `POST /api/ai/tasks/merge` (AI create: enhance + optional breakdown into sub-tickets; returns plan + created tasks when applied)
- `POST /api/ai/projects/:id/evolve` (AI: propose new tickets based on current state)
- `POST /api/ai/projects/:id/remove-concept` (AI: remove/unmerge a concept from the hierarchy)
- `POST /api/ai/projects/:id/init` (AI: generate/overwrite project scope from a repo path; optional seed tickets)
- `GET /api/tasks/:id/ancestors` (optional `?include_self=1`)
- `GET /api/tasks/:id/descendants` (optional `?mode=flat|tree&include_self=1`)
- `GET /api/tasks/:id/context`
- `POST /api/tasks/:id/next` (select next leaf sub-ticket; LLM-assisted when enabled, heuristic fallback)
- `GET /api/projects`
- `POST /api/projects`
- `GET /api/projects/:id`
- `PATCH /api/projects/:id`
- `DELETE /api/projects/:id` (reassigns tickets to the default project, then deletes)
- `GET /api/tasks` (optional `?status=todo&project_id=default&parent_id=<id>`)
- `POST /api/tasks` (supports `project_id` + `parent_id`)
- `GET /api/tasks/:id`
- `PATCH /api/tasks/:id` (supports `status`, `project_id`, `parent_id`)
- `POST /api/tasks/:id/comments`
- `POST /api/tasks/:id/updates` (append-only progress log)
- `GET /api/tasks/:id/updates?limit=5`
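To illustrate the task-hierarchy fields, here is a hypothetical client call using Node 18+'s built-in `fetch`. The endpoint and the `project_id`/`parent_id` fields come from the summary above; the base URL, parent id `42`, and task text are made up for illustration.

```javascript
// Hypothetical example: create a sub-ticket under parent task 42 in the
// default project. Field names follow the API summary; the id and text
// are illustrative only.
async function createSubTicket(base = "http://localhost:3000") {
  const res = await fetch(`${base}/api/tasks`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: "Write integration tests",
      status: "todo",
      project_id: "default",
      parent_id: 42,
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```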
## Database

SQLite file: `kanban-dashboard/data/kanban.db`

Tables:
- `projects` (project scope/context lives here)
- `tasks`
- `automation_tasks` (recurring/scheduled; now also linked to projects)
- `comments`
- `progress_updates`
- `llm_plans` (stores validated AI "plans" and whether they were applied)
- `llm_runs` (stores raw LLM request/response/error logs for auditing)
## LLM / AI (Phase 3)

Phase 3 adds:

- `llm_runs` table (logs LLM requests/responses/errors)
- `/api/ai/health`
- `POST /api/tasks/:id/next` (LLM-assisted next-leaf selection)
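The heuristic fallback for next-leaf selection is not documented in detail; one plausible sketch (an assumption, not the actual selection logic) is a depth-first walk that returns the first leaf sub-ticket still in To do:

```javascript
// Assumed heuristic for POST /api/tasks/:id/next when the LLM is disabled:
// walk the sub-ticket tree depth-first and return the first leaf whose
// status is still "todo". The real implementation may rank leaves differently.
function nextLeaf(task) {
  const children = task.children || [];
  if (children.length === 0) {
    return task.status === "todo" ? task : null;
  }
  for (const child of children) {
    const found = nextLeaf(child);
    if (found) return found;
  }
  return null;
}
```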
## Environment variables
The backend uses an OpenAI-compatible Chat Completions endpoint, so it works with a LiteLLM proxy, OpenRouter directly, and similar providers.
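Concretely, a Chat Completions request built from the variables in this section looks roughly like the sketch below. This shows the standard OpenAI-style wire format only; the backend's actual internals are not shown here, and the `/v1/chat/completions` path is an assumption based on that convention.

```javascript
// Sketch of an OpenAI-compatible Chat Completions request assembled from
// the environment variables documented below.
function buildChatRequest(messages, env = process.env) {
  return {
    url: `${env.LLM_BASE_URL}/v1/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      // The Authorization header is only sent when a key is configured.
      ...(env.LLM_API_KEY ? { Authorization: `Bearer ${env.LLM_API_KEY}` } : {}),
    },
    body: JSON.stringify({
      model: env.LLM_MODEL,
      temperature: Number(env.LLM_TEMPERATURE ?? 0),
      messages,
    }),
  };
}
```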
Common setup (LiteLLM proxy):
```bash
LLM_ENABLED=1
LLM_BASE_URL=http://127.0.0.1:4000
LLM_API_KEY=sk-...   # optional depending on your proxy
LLM_MODEL=openrouter/google/gemini-3-pro-preview
```

Optional tuning:

```bash
LLM_TIMEOUT_MS=45000
LLM_TEMPERATURE=0
```

## AI create (Phase 4)
`POST /api/ai/tasks/merge` accepts:

```json
{
  "project_id": "default",
  "parent_id": null,
  "status": "todo",
  "text": "Add OAuth login...\n\nMore details...",
  "breakdown": true,
  "max_depth": 3,
  "apply": true
}
```

The response (when `apply=true`) includes:

- `plan_id`
- `llm_run_id`
- `root_task`
- `created_tasks[]`
## Guardrails / safety (Phase 5)

### Optional API key for AI endpoints

If set, all `/api/ai/*` endpoints require `X-API-Key: <value>`.

```bash
AI_API_KEY=changeme
```

### Rate limiting (in-memory)
Simple per-IP rate limiting for AI/LLM calls.
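A minimal sketch of what such a limiter could look like (an assumed design honoring `AI_RATE_LIMIT_PER_MIN`; the shipped middleware may differ):

```javascript
// Assumed in-memory sliding-window limiter: at most N calls per IP per
// minute, where N comes from AI_RATE_LIMIT_PER_MIN (default 60). State is
// per-process only, so it resets on restart.
const WINDOW_MS = 60_000;
const hits = new Map(); // ip -> timestamps of recent calls

function allowRequest(ip, now = Date.now()) {
  const limit = Number(process.env.AI_RATE_LIMIT_PER_MIN || 60);
  const recent = (hits.get(ip) || []).filter(t => now - t < WINDOW_MS);
  if (recent.length >= limit) {
    hits.set(ip, recent);
    return false; // caller should answer 429
  }
  recent.push(now);
  hits.set(ip, recent);
  return true;
}
```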
```bash
AI_RATE_LIMIT_PER_MIN=60
```

### Project init repo scanning safety
By default, init can only read paths under the server's working directory. You can allowlist additional roots:

```bash
AI_INIT_ALLOWED_ROOTS=/home/clawdbot,/mnt/data
AI_INIT_MAX_FILES=400
AI_INIT_MAX_FILE_BYTES=64000
AI_INIT_MAX_TOTAL_BYTES=600000
```

To allow any path (dangerous on shared machines):
```bash
AI_INIT_ALLOW_ANY=1
```