# @meego-harness/llm-worker (v0.3.0)
OpenAI-compatible LLM worker bridge for meego-harness.
This package logs in to a worker server with the `llm` role, receives normal worker
text messages, calls an OpenAI-compatible chat completions endpoint, and returns
normal worker text responses.
It does not execute manager tools or implement manager-agent-specific metadata protocols; `manager-agent` owns the worker model prompt, response parsing, and the local tool execution loop.
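The text-in/text-out flow described above can be sketched as follows. This is an illustrative assumption about the bridge's internals, not the package's actual implementation; only the request/response shape of an OpenAI-compatible chat completions endpoint is taken as given, and the helper names are hypothetical.

```typescript
// Hypothetical sketch of the bridge's message mapping. The worker model
// prompt is owned by manager-agent, so the bridge forwards raw text only.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

// Map an incoming worker text message to a chat completions request body.
function buildChatRequest(
  model: string,
  workerText: string,
): { model: string; messages: ChatMessage[] } {
  return {
    model,
    messages: [{ role: 'user', content: workerText }],
  }
}

// Pull the assistant text out of an OpenAI-compatible completion response;
// this text would be sent back as a normal worker text message.
function extractResponseText(completion: {
  choices: { message: { content: string | null } }[]
}): string {
  return completion.choices[0]?.message?.content ?? ''
}
```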
## worker-cli usage
The package declares a `meegoHarnessWorker` manifest and can be installed by
`@meego-harness/worker-cli` as a private package. Its setup action stores local
LLM connection settings and returns only worker-cli metadata.

```sh
meego-harness-worker onboard llm
meego-harness-worker start --worker llm-worker-1 --tmux
```

The package bin is also available directly after installation:

```sh
meego-llm-worker setup
meego-llm-worker start --worker llm-worker-1
```

The API key and custom headers are kept in the worker config file and are not included in setup/list/doctor JSON output.
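The split between stored config and emitted metadata can be sketched like this. The config field names and the redaction helper are assumptions for illustration; the source only states that the API key and custom headers never appear in setup/list/doctor JSON output.

```typescript
// Hypothetical shape of the worker config file; field names are assumed.
interface LlmWorkerConfig {
  serverUrl: string
  workerId: string
  baseURL: string
  model: string
  apiKey: string                    // secret: never echoed by setup/list/doctor
  headers?: Record<string, string>  // secret: may carry auth tokens
}

// Strip secrets before emitting worker-cli JSON metadata.
function toCliMetadata(
  cfg: LlmWorkerConfig,
): Omit<LlmWorkerConfig, 'apiKey' | 'headers'> {
  const { apiKey, headers, ...publicFields } = cfg
  return publicFields
}
```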
## Library usage
```ts
import { OpenAICompatibleLlmWorkerBridge } from '@meego-harness/llm-worker'

const bridge = new OpenAICompatibleLlmWorkerBridge({
  serverUrl: 'ws://127.0.0.1:3000/workers',
  workerId: 'llm-worker',
  email: '[email protected]',
  apiKey: process.env.LLM_API_KEY!,
  baseURL: 'https://llm.example.com/v1',
  model: 'model-id',
})

await bridge.start()
```
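In practice the bridge options are often assembled from environment variables rather than hard-coded, as the `LLM_API_KEY` example above suggests. A minimal sketch, assuming illustrative variable names (`LLM_WORKER_SERVER_URL`, `LLM_BASE_URL`, and so on) that are not documented by the package:

```typescript
// Hypothetical option assembly; only the option field names come from the
// library-usage example above, the env var names are assumptions.
interface BridgeOptions {
  serverUrl: string
  workerId: string
  email: string
  apiKey: string
  baseURL: string
  model: string
}

function optionsFromEnv(env: Record<string, string | undefined>): BridgeOptions {
  const required = (name: string): string => {
    const value = env[name]
    if (!value) throw new Error(`Missing required env var: ${name}`)
    return value
  }
  return {
    // Sensible local defaults for optional values.
    serverUrl: env.LLM_WORKER_SERVER_URL ?? 'ws://127.0.0.1:3000/workers',
    workerId: env.LLM_WORKER_ID ?? 'llm-worker',
    // Secrets and endpoint identity must be provided explicitly.
    email: required('LLM_WORKER_EMAIL'),
    apiKey: required('LLM_API_KEY'),
    baseURL: required('LLM_BASE_URL'),
    model: required('LLM_MODEL'),
  }
}
```

The resulting object can then be passed straight to the `OpenAICompatibleLlmWorkerBridge` constructor shown above.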