@opensecret/maple-openclaw-plugin

OpenClaw plugin that automatically downloads, configures, and runs maple-proxy as a background service. All AI inference runs inside Maple's TEE (Trusted Execution Environment) secure enclaves.

Quick Start (Recommended)

Install the plugin and let your agent handle the rest:

openclaw plugins install @opensecret/maple-openclaw-plugin

Then tell your agent:

Install and configure maple-proxy with my API key: YOUR_MAPLE_API_KEY

The plugin bundles a skill that teaches the agent how to set up the maple provider, configure models, and enable embeddings. After a gateway restart, the agent will have all the context it needs from the skill to complete the setup. If the plugin isn't configured yet, the maple_proxy_status tool also returns step-by-step instructions.

Manual Setup

If you prefer to configure everything yourself, follow these steps after installing the plugin.

1. Configure the plugin

Set your Maple API key in openclaw.json:

{
  "plugins": {
    "entries": {
      "maple-openclaw-plugin": {
        "enabled": true,
        "config": {
          "apiKey": "YOUR_MAPLE_API_KEY"
        }
      }
    }
  }
}

2. Add the Maple provider

Add a maple provider so OpenClaw can route requests to the local proxy (default port 8787):

{
  "models": {
    "providers": {
      "maple": {
        "baseUrl": "http://127.0.0.1:8787/v1",
        "apiKey": "YOUR_MAPLE_API_KEY",
        "api": "openai-completions",
        "models": [
          { "id": "kimi-k2-5", "name": "Kimi K2.5 (recommended)" },
          { "id": "deepseek-r1-0528", "name": "DeepSeek R1" },
          { "id": "gpt-oss-120b", "name": "GPT-OSS 120B" },
          { "id": "llama-3.3-70b", "name": "Llama 3.3 70B" },
          { "id": "qwen3-vl-30b", "name": "Qwen3 VL 30B" }
        ]
      }
    }
  }
}

Use the same Maple API key in both places. To discover all available models, call GET http://127.0.0.1:8787/v1/models after startup.
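For example, assuming the proxy is already running on the default port, you can list the model IDs from the command line (jq is optional, used here only for readability):

```shell
# List the models maple-proxy currently exposes.
# Assumes the proxy is running locally on the default port 8787.
curl -s http://127.0.0.1:8787/v1/models | jq '.data[].id'
```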

3. Add models to the allowlist (if applicable)

If you have an agents.defaults.models section in your config, add the maple models you want. If you don't have this section, skip this step -- all models are allowed by default.

{
  "agents": {
    "defaults": {
      "models": {
        "maple/kimi-k2-5": {},
        "maple/deepseek-r1-0528": {},
        "maple/gpt-oss-120b": {},
        "maple/llama-3.3-70b": {},
        "maple/qwen3-vl-30b": {}
      }
    }
  }
}

4. Restart the gateway

systemctl restart openclaw.service

Plugin config changes always require a full gateway restart. Model and provider config changes hot-apply without a restart.

Usage

Use Maple models by prefixing the model ID with maple/:

  • maple/kimi-k2-5 (recommended)
  • maple/deepseek-r1-0528
  • maple/gpt-oss-120b
  • maple/llama-3.3-70b
  • maple/qwen3-vl-30b

The plugin also registers a maple_proxy_status tool that shows the proxy's health, port, version, and available endpoints. If the plugin isn't configured yet, the tool returns setup instructions.

Embeddings & Memory Search

maple-proxy serves an OpenAI-compatible embeddings endpoint using the nomic-embed-text model. You can use this for OpenClaw's memory search so embeddings are generated inside the TEE -- no cloud embedding provider needed.
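As a sketch, assuming the proxy is running locally and YOUR_MAPLE_API_KEY is replaced with a real key, an embeddings request looks like any other OpenAI-compatible call:

```shell
# Generate an embedding inside the TEE via the local proxy.
# The endpoint and model name are those documented above;
# YOUR_MAPLE_API_KEY is a placeholder.
curl -s http://127.0.0.1:8787/v1/embeddings \
  -H "Authorization: Bearer YOUR_MAPLE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "nomic-embed-text", "input": "hello from the enclave"}'
```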

Enable the memory-core plugin

The memory_search and memory_get tools come from OpenClaw's memory-core plugin. It ships as a stock plugin but must be explicitly enabled:

{
  "plugins": {
    "allow": ["memory-core"],
    "entries": {
      "memory-core": {
        "enabled": true
      }
    }
  }
}

Configure memorySearch

Important: The model field must be nomic-embed-text (without a maple/ prefix). Using maple/nomic-embed-text will cause 400 errors.

{
  "agents": {
    "defaults": {
      "memorySearch": {
        "enabled": true,
        "provider": "openai",
        "model": "nomic-embed-text",
        "remote": {
          "baseUrl": "http://127.0.0.1:8787/v1/",
          "apiKey": "YOUR_MAPLE_API_KEY"
        }
      }
    }
  }
}

Restart and reindex

systemctl restart openclaw.service
openclaw memory index --verbose
openclaw memory status --deep

The status output should show Embeddings: available and Vector: ready.

Troubleshooting

| Problem | Cause | Fix |
|---|---|---|
| "memory slot plugin not found" | memory-core not enabled | Add to plugins.allow and plugins.entries, restart |
| Embeddings 400 error | Model has provider prefix | Change maple/nomic-embed-text to nomic-embed-text |
| Embeddings 401 error | Wrong API key | Check the key is the actual value, not a placeholder |
| "Batch: disabled" in status | Too many embedding failures | Fix config, restart to reset the failure counter |
| Only some files indexed | Embeddings were failing during indexing | Fix config, restart, run openclaw memory index --verbose |

Plugin Config Options

| Option | Default | Description |
|---|---|---|
| apiKey | (required) | Your Maple API key |
| port | 8787 | Local port for the proxy |
| backendUrl | https://enclave.trymaple.ai | Maple TEE backend URL |
| debug | false | Enable debug logging |
| version | (latest) | Pin to a specific maple-proxy version |
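Putting the options together, a fully specified entry might look like the sketch below. Only apiKey is required; the other values are shown at their documented defaults, and the version pin is purely illustrative:

```json
{
  "plugins": {
    "entries": {
      "maple-openclaw-plugin": {
        "enabled": true,
        "config": {
          "apiKey": "YOUR_MAPLE_API_KEY",
          "port": 8787,
          "backendUrl": "https://enclave.trymaple.ai",
          "debug": false,
          "version": "0.1.1"
        }
      }
    }
  }
}
```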

Updating

openclaw plugins update maple-openclaw-plugin

Note: openclaw plugins update works for stable releases. To move between beta versions, reinstall with the full version: openclaw plugins install @opensecret/[email protected]

Direct API Access

  • GET http://127.0.0.1:8787/v1/models -- List available models
  • POST http://127.0.0.1:8787/v1/chat/completions -- Chat completions (streaming and non-streaming)
  • POST http://127.0.0.1:8787/v1/embeddings -- Generate embeddings (model: nomic-embed-text)
  • GET http://127.0.0.1:8787/health -- Health check
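For instance, a minimal non-streaming chat request against the local proxy might look like this (the model ID comes from the provider list above; the API key is a placeholder):

```shell
# Send a single-turn chat completion request to the local proxy.
# Assumes the proxy is running on the default port 8787.
curl -s http://127.0.0.1:8787/v1/chat/completions \
  -H "Authorization: Bearer YOUR_MAPLE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "kimi-k2-5",
        "messages": [{"role": "user", "content": "Say hello."}]
      }'
```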

License

MIT