rapid-lama v1.0.2: A TypeScript CLI tool
# Rapid Lama

`rlama` is a lightweight CLI for fast, single-turn prompts against an Ollama server.
## What It Does

- Sends one prompt per command execution.
- Uses Ollama's `/api/generate` endpoint with `stream: false`.
- Prints the model response directly to stdout.
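The single-turn flow can be sketched as follows. This is a minimal illustration, not the package's actual source; it assumes a reachable Ollama server and Node 18's built-in `fetch`:

```typescript
// Sketch of a single-turn, non-streaming call to Ollama's /api/generate.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

function buildRequest(model: string, prompt: string): GenerateRequest {
  // stream: false makes Ollama return one JSON object instead of NDJSON chunks.
  return { model, prompt, stream: false };
}

async function generate(host: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${host}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  // With stream: false the body is a single JSON object whose `response`
  // field holds the full completion.
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

Because the request is non-streaming, nothing is printed until the model finishes; the full reply arrives in one response body.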
## Requirements

- Node.js 18+
- An Ollama server running locally or remotely
- `curl` available in your shell
## Installation

### As a global CLI

```sh
npm install -g rapid-lama
rlama "Summarize recursion in one paragraph."
```

### From source (local development)

```sh
pnpm install
pnpm build
```

After building, run:

```sh
node dist/cli.js "What is the capital of Italy?"
```

## Usage
```sh
rlama [options] <message>
```

Examples:

```sh
rlama "Explain HTTP status code 429"
rlama "Write a short commit message for a bug fix"
```

Help:

```sh
rlama --help
```

## Environment Variables
- `OLLAMA_HOST`: Ollama base URL. Default: `http://localhost:11434`
- `OLLAMA_MODEL`: Model name. Default: `llama3.2`
- `OLLAMA_SYSTEM`: Optional system prompt override
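Resolving these variables with their documented defaults might look like the following. This is a hypothetical sketch, not the published source; only the variable names and defaults come from the table above:

```typescript
// Sketch: resolve CLI configuration from environment variables,
// falling back to the documented defaults.
interface Config {
  host: string;
  model: string;
  system?: string;
}

function resolveConfig(env: Record<string, string | undefined>): Config {
  return {
    host: env.OLLAMA_HOST ?? "http://localhost:11434",
    model: env.OLLAMA_MODEL ?? "llama3.2",
    system: env.OLLAMA_SYSTEM, // undefined means no system prompt override
  };
}
```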
Example:

```sh
OLLAMA_MODEL=llama3.1 OLLAMA_HOST=http://127.0.0.1:11434 rlama "Give me 3 Linux tips"
```

## Development Scripts

- `pnpm build`: Compile TypeScript and mark `dist/cli.js` executable
- `pnpm dev`: Run `dist/cli.js` directly
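For orientation, a compiled entry point like `dist/cli.js` could be as small as the sketch below. This is hypothetical, not the published source; `argvToMessage` is an illustrative helper name, and the request body matches Ollama's documented `/api/generate` shape:

```typescript
// Hypothetical entry point: join argv into one prompt, call Ollama once,
// print the response to stdout.
function argvToMessage(argv: string[]): string {
  // argv here excludes `node` and the script path; everything left is the message.
  const message = argv.join(" ").trim();
  if (message.length === 0) {
    throw new Error("usage: rlama [options] <message>");
  }
  return message;
}

async function main(): Promise<void> {
  const message = argvToMessage(process.argv.slice(2));
  const host = process.env.OLLAMA_HOST ?? "http://localhost:11434";
  const model = process.env.OLLAMA_MODEL ?? "llama3.2";
  // Single non-streaming request; the full reply arrives as one JSON object.
  const res = await fetch(`${host}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: message, stream: false }),
  });
  const data = (await res.json()) as { response: string };
  process.stdout.write(data.response + "\n");
}
```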
## License

MIT
