@hitechclaw/clawspark
v2.1.1
One-command OpenClaw installer and CLI for local, CPU-first, and API-backed AI agent deployments for strictly non-commercial use.
```
curl -fsSL https://clawspark.hitechclaw.com/install.sh | bash
```

That's it. Come back in 5 minutes to a fully working, fully private AI agent that can code, research, browse the web, analyze images, and manage your tasks. Everything runs on your hardware. No cloud APIs, no subscriptions, no telemetry.
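If you prefer not to pipe a remote script straight into bash, the usual alternative is to download the installer to a file and review it before running it. This is general shell practice, not a documented clawspark workflow, and the project publishes no installer checksum; the sketch below only prints the command it would run:

```shell
#!/usr/bin/env bash
set -euo pipefail
# Download the installer to a file instead of piping it into bash, so it
# can be inspected first. (Standard practice; the README itself documents
# only the one-liner.)
url="https://clawspark.hitechclaw.com/install.sh"
out="install.sh"
echo "Would fetch: curl -fsSL $url -o $out"
# After reviewing the downloaded file:
#   less install.sh
#   bash install.sh
```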
Licensing notice: Publishing or downloading this package from npm does not grant any commercial rights. Use is limited to strictly non-commercial purposes unless separately authorized in writing by the copyright holder.
v2 Preview
A new installer track is available in v2/ for broader deployment targets:
- CPU-first installs for machines without GPUs
- API-first installs using third-party providers
- Hybrid installs that combine local and remote inference
Run it with:
```
bash v2/install.sh
```

Supported provider modes in v2: `ollama`, `openai`, `anthropic`, `openrouter`, `google`, `custom`.
v2 now also reuses the stable installer modules for:
- default skills from `v2/configs/skills.yaml`
- local Whisper voice setup
- WhatsApp or Telegram onboarding
- security hardening and token generation
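The schema of `v2/configs/skills.yaml` is not reproduced in this README, but a declarative skills file of this kind typically looks something like the following. All field names here are illustrative assumptions, not the real format:

```yaml
# Hypothetical shape of skills.yaml; the actual schema may differ.
skills:
  - name: web-search
    enabled: true
  - name: deep-research
    enabled: true
packs:
  - research
```

Running `clawspark skills sync` is what applies edits to this file, per the CLI reference below.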
Useful v2 examples:
```
bash v2/install.sh --runtime=local-cpu --provider=ollama --messaging=skip
bash v2/install.sh --runtime=api-only --provider=openai --api-key=<your-key>
bash v2/install.sh --runtime=hybrid --provider=openrouter --messaging=telegram
bash v2/install.sh --runtime=api-only --provider=custom --provider-name="My Gateway" --base-url=https://llm.example.com/v1 --api-key=<your-key> --model=my-model
```

This does not replace the main installer yet. It is a separate v2 track for CPU and external API support.
When only the v2 track is installed, the clawspark CLI reads state from `~/.clawspark-v2` automatically. If both tracks exist, select a profile explicitly:

```
CLAWSPARK_PROFILE=v2 clawspark status
CLAWSPARK_PROFILE=standard clawspark status
```

For remote and custom providers, `clawspark model list` now shows the active provider context and configured API endpoint instead of only the local Ollama inventory. For API-backed profiles, `clawspark status` also performs a lightweight remote endpoint probe.
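A minimal sketch of how the profile-to-state-directory mapping described above could be implemented. This is an assumed implementation for illustration, not clawspark's actual code (which also auto-detects a v2-only install without the env var):

```shell
#!/usr/bin/env bash
set -euo pipefail
# Map CLAWSPARK_PROFILE to a state directory, defaulting to the standard
# profile, mirroring the behavior the README describes.
resolve_state_dir() {
  case "${CLAWSPARK_PROFILE:-standard}" in
    v2)       echo "$HOME/.clawspark-v2" ;;
    standard) echo "$HOME/.clawspark" ;;
    *)        echo "unknown profile: ${CLAWSPARK_PROFILE}" >&2; return 1 ;;
  esac
}

echo "v2 state dir: $(CLAWSPARK_PROFILE=v2 resolve_state_dir)"
```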
You can update remote provider settings after install, for example:

```
clawspark provider set openai --base-url=https://api.openai.com/v1 --api-key=<your-key>
clawspark provider set custom --name="My Gateway" --base-url=https://llm.example.com/v1 --api-key=<your-key>
```

Use `clawspark provider list` to see the built-in provider catalog and default endpoints. Use `clawspark provider use <provider>` to switch quickly using that provider's built-in default endpoint; running it with no provider opens a provider selection flow. Use `clawspark provider doctor` to validate the active provider configuration, expected environment variables, endpoint reachability, and model/provider alignment; add `--json` for automation-friendly diagnostics output.
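To illustrate the style of checks `provider doctor` is described as running, here is a minimal sketch of an environment-variable check. The function and variable names are assumptions for illustration; the real implementation is not published in this README:

```shell
#!/usr/bin/env bash
set -euo pipefail
# Report whether a named environment variable is set, the way a provider
# doctor might verify expected credentials before probing an endpoint.
check_env_var() {
  local name="$1"
  if [ -n "${!name:-}" ]; then
    echo "ok: $name is set"
  else
    echo "fail: $name is missing"
  fi
}

OPENAI_API_KEY="sk-example" check_env_var OPENAI_API_KEY
check_env_var SOME_UNSET_EXAMPLE_VAR
```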
What is this?
OpenClaw is the most popular open-source AI agent (340K+ stars). clawspark gets it running on your supported hardware in one command. Fully local. Fully private. Your data never leaves your machine.
clawspark is built around OpenClaw. It is not "nemoclaw". Nemotron appears in this repository only as a model option on some hardware profiles, not as the agent framework name.
What happens when you run it:
- Detects your hardware (DGX Spark, Jetson, RTX GPUs, Mac)
- Picks the best model using llmfit for hardware-aware selection
- Installs everything (Ollama, OpenClaw, 10 skills, dependencies)
- Configures multi-model (chat + vision + optional image generation)
- Enables voice (local Whisper transcription, zero cloud)
- Sets up browser automation (headless Chromium)
- Sets up your dashboard (chat UI + metrics)
- Creates systemd services (auto-starts on boot)
- Hardens security (firewall, auth tokens, localhost binding, Docker sandbox)
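The 256-bit auth token mentioned in the hardening step can be generated in one line with standard tooling. This is one plausible approach; the installer's actual method is not shown in this README:

```shell
#!/usr/bin/env bash
set -euo pipefail
# Generate a 256-bit token: 32 random bytes, hex-encoded to 64 characters.
token="$(openssl rand -hex 32)"
echo "token length: ${#token}"
```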
Supported Hardware
| Hardware | Memory | Default Model | Tokens/sec |
|---|---|---|---|
| DGX Spark | 128 GB unified | Qwen 3.5 35B-A3B | ~59 (measured) |
| Jetson AGX Thor | 128 GB unified | Auto-selected | Community testing |
| Jetson AGX Orin | 64 GB unified | Auto-selected | Community testing |
| RTX 5090 / 4090 | 24-32 GB VRAM | Auto-selected | Community testing |
| RTX 4080 / 4070 | 8-16 GB VRAM | Auto-selected | Community testing |
| Mac M1/M2/M3/M4 | 16-128 GB unified | Auto-selected | Community testing |
NVIDIA platforms use llmfit to detect your hardware and pick the best model. macOS uses a curated fallback list.
Quick Start
The installer asks 3 questions:

```
[1/3] Which model?         > 5 models ranked by hardware fit
[2/3] Messaging platform?  > WhatsApp / Telegram / Both / Skip
[3/3] Tailscale?           > Yes (remote access) / No
```

Zero-interaction mode:

```
curl -fsSL https://clawspark.hitechclaw.com/install.sh | bash -s -- --defaults
```

What Your Agent Can Do
| Capability | How it Works |
|---|---|
| Answer questions | Local LLM via Ollama |
| Search the web | Built-in web search + DuckDuckGo, no API key |
| Deep research | Sub-agents run parallel research threads |
| Browse websites | Headless Chromium (navigate, click, fill forms, screenshot) |
| Analyze images | Vision model for screenshots, photos, diagrams |
| Write and run code | exec + read/write/edit tools |
| Voice notes | Local Whisper transcription for WhatsApp voice messages |
| File management | Read, write, edit, search files on the host |
| Scheduled tasks | Cron-based automation |
| Sub-agent orchestration | Spawn parallel background agents |
All of this runs locally. No data leaves your machine.
Skills
10 verified skills ship by default. Install curated bundles:
```
clawspark skills pack research       # Deep research + web search (4 skills)
clawspark skills pack coding         # Code generation + review (2 skills)
clawspark skills pack productivity   # Task management + knowledge (3 skills)
clawspark skills pack voice          # Voice interaction (2 skills)
clawspark skills pack full           # Everything (10 skills)
```

Manage individual skills:
```
clawspark skills add <name>      # Install a skill
clawspark skills remove <name>   # Remove a skill
clawspark skills sync            # Apply skills.yaml changes
clawspark skills audit           # Security scan installed skills
```

Multi-Model
Three model slots:
| Slot | Purpose | Example |
|---|---|---|
| Chat | Conversation and coding | ollama/qwen3.5:35b-a3b |
| Vision | Image analysis | ollama/qwen2.5-vl:7b |
| Image gen | Create images (optional) | Local ComfyUI or API |
```
clawspark model list             # Show all models
clawspark model switch <model>   # Change chat model
clawspark model vision <model>   # Set vision model
```

Security
- UFW firewall (deny incoming by default)
- 256-bit auth token for the gateway API
- Gateway binds to localhost only
- Code-level tool restrictions (21 blocked command patterns)
- SOUL.md + TOOLS.md with immutable guardrails
- Plugin approval hooks (user confirmation before acting)
- Optional Docker sandbox (no network, read-only root, all caps dropped)
- Air-gap mode: `clawspark airgap on`
- OpenAI-compatible API gateway for local-first workflows

The skill security audit scans installed skills for 30+ malicious patterns (credential theft, exfiltration, obfuscation) and protects against ClawHub supply-chain attacks:

```
clawspark skills audit
```

Diagnostics
Full system health check across hardware, GPU, Ollama, OpenClaw, skills, ports, security, and logs:
```
clawspark diagnose   # alias: clawspark doctor
```

Generates a shareable debug report at `~/.clawspark/diagnose-report.txt`.
CLI Reference
```
clawspark status                               Show system health
clawspark start                                Start all services
clawspark stop [--all]                         Stop services (--all includes Ollama)
clawspark restart                              Restart everything
clawspark update                               Update OpenClaw, re-apply patches
clawspark benchmark                            Run performance benchmark
clawspark model list|switch|vision             Manage models
clawspark provider [show|list|doctor|set|use]  Manage API provider settings
clawspark skills sync|add|remove|pack|audit    Manage skills
clawspark sandbox on|off|status|test           Docker sandbox
clawspark tools list|enable|disable            Agent tools
clawspark mcp list|setup|add|remove            MCP servers
clawspark tailscale setup|status               Remote access
clawspark airgap on|off                        Network isolation
clawspark diagnose                             System diagnostics
clawspark logs                                 Tail gateway logs
clawspark uninstall                            Remove everything
```

Dashboard
Two web interfaces out of the box:
- Chat UI: http://localhost:18789/__openclaw__/canvas/
- Metrics: http://localhost:8900 (ClawMetry)
Both bind to localhost. Use Tailscale for remote access.
Docker Sandbox
Optional isolated code execution for sub-agents:
```
clawspark sandbox on     # Enable
clawspark sandbox off    # Disable
clawspark sandbox test   # Verify isolation
```

Containers run with no network, a read-only root filesystem, all capabilities dropped, a custom seccomp profile, and memory/CPU limits.
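For orientation, the isolation properties listed above map onto standard `docker run` flags roughly as follows. The image name and seccomp profile path are illustrative; the actual sandbox wrapper is not published in this README. The sketch only assembles and prints the command:

```shell
#!/usr/bin/env bash
set -euo pipefail
# Assemble a docker invocation matching the listed isolation properties:
# no network, read-only root, all capabilities dropped, seccomp profile,
# and memory/CPU limits. (Illustrative, not the real wrapper.)
args=(docker run --rm
      --network none
      --read-only
      --cap-drop ALL
      --security-opt seccomp=./seccomp-profile.json
      --memory 512m --cpus 1
      clawspark-sandbox:latest)
printf '%s\n' "${args[*]}"
```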
Uninstall
```
clawspark uninstall
```

Removes all services, models, and config. Conversations are preserved in `~/.openclaw/backups/` unless you pass `--purge`.
Testing
73 tests using bats:
```
bash tests/run.sh
```

| Suite | Tests | Coverage |
|---|---|---|
| common.bats | 27 | Logging, colors, helpers |
| skills.bats | 16 | YAML parsing, add/remove, packs |
| security.bats | 11 | Token generation, permissions, deny lists |
| cli.bats | 19 | Version, help, routing, error handling |
npm Package
@hitechclaw/clawspark can also be distributed as an npm package for installing the CLI entrypoint.
The installed command remains `clawspark`:

```
npm install -g @hitechclaw/clawspark
```

Local package validation:

```
npm run pack:check
```

Public publishing is automated through .github/workflows/publish-npm.yml and expects an NPM_TOKEN repository secret.
Regular validation for pushes and pull requests runs through .github/workflows/ci.yml.
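The real publish-npm.yml is not reproduced here, but a tag-triggered npm publish workflow of the kind described generally looks like this. Step names, action versions, and the Node version are illustrative, not taken from the repository:

```yaml
# Illustrative sketch of a tag-triggered npm publish workflow;
# the actual .github/workflows/publish-npm.yml may differ.
name: publish-npm
on:
  push:
    tags: ['v*']
jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required for npm provenance
      contents: read
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          registry-url: https://registry.npmjs.org
      - run: npm publish --provenance --access public
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
```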
Example setup:

- Add the repository secret `NPM_TOKEN` in GitHub Actions secrets.
- Make sure the token belongs to the npm user `hitechclaw`.
- Make sure the tag matches the `package.json` version exactly, for example `v2.0.0` for version `2.0.0`.
- Push the tag.
- GitHub Actions validates metadata, checks the packed tarball, verifies the npm account, and publishes `@hitechclaw/clawspark` to npmjs.com with provenance.
- GitHub Actions also creates a GitHub Release and attaches the generated npm tarball.
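The tag-matches-version rule can also be checked locally before pushing. A minimal sketch, with the version hard-coded so the snippet is self-contained; in a real repo you would read it with `node -p "require('./package.json').version"`:

```shell
#!/usr/bin/env bash
set -euo pipefail
# Verify that a git tag matches the package.json version before pushing.
version="2.0.0"   # stand-in for the value read from package.json
tag="v2.0.0"
if [ "$tag" = "v$version" ]; then
  echo "tag matches package.json version"
else
  echo "mismatch: $tag vs $version" >&2
  exit 1
fi
```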
Example release flow:
```
npm version patch
git push origin main --follow-tags
```

Release helper:

```
npm run release:patch
npm run release:minor
npm run release:major
```

Or run the helper directly:

```
bash scripts/release.sh patch --push
```

Or manually:

```
git tag v2.0.0
git push origin v2.0.0
```

Acknowledgements
- OpenClaw -- AI agent framework
- Ollama -- Local LLM inference
- llmfit -- Hardware-aware model selection
- Baileys -- WhatsApp Web client
- Whisper -- Speech-to-text
- ClawMetry -- Observability dashboard
- Qwen -- The model family that runs great on DGX Spark
Maintainers
Contributing
See CONTRIBUTING.md for local validation, PR expectations, and release guidelines.
PRs welcome. Areas where help is needed:
- Testing on Jetson variants and RTX GPUs
- Hardware detection for more GPU models
- Additional messaging platform integrations
- New skills and skill packs
- Sandbox improvements
License
Non-commercial use only. Individuals, companies, and organizations may use this repository only for strictly non-commercial purposes. Commercial use, commercial consulting, and other for-profit use are prohibited unless you have prior written permission from the copyright holder. Exclusive commercialization rights are reserved to the repository owner/copyright holder.
Allowed without separate permission:
- personal use
- private use
- internal company use for strictly non-commercial purposes
- internal organizational use for strictly non-commercial purposes
- educational use
- non-commercial research use
- non-commercial evaluation and testing
Not allowed without prior written permission:
- product commercialization of this repository or derivative offerings
- commercial consulting or implementation services
- third-party service delivery or client work
- paid support, training, integration, or managed services
- resale, sublicensing, hosting, SaaS, bundling, or monetization
Non-commercial use means use that is not related to revenue generation,
commercial advantage, product commercialization, paid delivery, or service to
clients or third parties. A company or organization may use the project only
for strictly non-commercial internal purposes such as research, education,
testing, or evaluation.
Only the repository owner / copyright holder may commercialize this project or grant commercial licenses.
See LICENSE, NOTICE, and COMMERCIAL-LICENSE.md.
