# owo-cli
v1.7.1

Natural language to shell commands using AI
## What is this?
owo is a lightweight, focused CLI tool that converts natural language into shell commands using Large Language Models (LLMs) like GPT-5. Unlike comprehensive agentic development tools like Claude Code or Cursor, owo has a simple, singular purpose: helping you write shell commands faster, without switching context.
owo is not a replacement for comprehensive agentic development tools -- it is a simple tool that excels at one thing. Consider it the terminal equivalent of quickly searching "how do I..." and getting an immediately runnable answer.

After a response is generated, you can edit it before pressing enter to execute the command. This is useful if you want to add flags or make other modifications before running it.
## Installation
### Option A: npm (Recommended)
Requires Node.js 18+.
```shell
npm install -g owo-cli
```

### Option B: Homebrew (macOS / Linux)
```shell
brew install ibealec/owo/owo-cli
```

### Option C: Build from source
Requires Bun.
```shell
git clone https://github.com/ibealec/owo.git
cd owo
bun install
bun run build
chmod +x dist/owo-cli
mv dist/owo-cli /usr/local/bin/owo-cli
```

## Quick Start
The fastest way to get started is with the interactive setup wizard:
```shell
owo setup
```

This walks you through picking a provider, entering your API key, choosing a model, and enabling optional features. It writes the config file for you.
Alternatively, you can skip setup entirely with inline flags:
```shell
owo -p openai -k sk-your-key -m gpt-4.1 list all files larger than 100MB
```

## Configuration
owo is configured through a single config.json file. The first time you run owo, it will automatically create a default configuration file to get you started.
### Configuration File Location
The config.json file is located in a standard, platform-specific directory:
- Linux: `~/.config/owo/config.json`
- macOS: `~/Library/Preferences/owo/config.json`
- Windows: `%APPDATA%\owo\config.json` (e.g., `C:\Users\<user>\AppData\Roaming\owo\config.json`)
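As a quick sanity check, here is a small POSIX-shell sketch (not part of owo; the `owo_config_path` helper name is made up) that prints the expected config path for the current Unix-like OS:

```shell
# print the expected owo config path for the current Unix-like OS
owo_config_path() {
  case "$(uname -s)" in
    Darwin) printf '%s\n' "$HOME/Library/Preferences/owo/config.json" ;;
    *)      printf '%s\n' "$HOME/.config/owo/config.json" ;;
  esac
}
owo_config_path
```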
### Provider Types
You can configure owo to use different AI providers by setting the type field in your config.json. The supported types are "OpenAI", "Custom", "Claude", "Gemini", "GitHub", and "ClaudeCode".
Below are examples for each provider type.
#### 1. OpenAI (type: "OpenAI")
This is the default configuration.
```json
{
  "type": "OpenAI",
  "apiKey": "sk-your_openai_api_key",
  "model": "gpt-4.1"
}
```

- `apiKey`: Your OpenAI API key. If empty, `owo` will fall back to the `OPENAI_API_KEY` environment variable.
#### 2. Claude (type: "Claude")
Uses the native Anthropic API.
```json
{
  "type": "Claude",
  "apiKey": "your-anthropic-api-key",
  "model": "claude-3-opus-20240229"
}
```

- `apiKey`: Your Anthropic API key. If empty, `owo` will fall back to the `ANTHROPIC_API_KEY` environment variable.
#### 3. Gemini (type: "Gemini")
Uses the native Google Gemini API.
```json
{
  "type": "Gemini",
  "apiKey": "your-google-api-key",
  "model": "gemini-pro"
}
```

- `apiKey`: Your Google AI Studio API key. If empty, `owo` will fall back to the `GOOGLE_API_KEY` environment variable.
#### 4. GitHub (type: "GitHub")
Uses the free-to-use models available through GitHub Models.
```json
{
  "type": "GitHub",
  "apiKey": "your-github-token",
  "model": "openai/gpt-4.1-nano"
}
```

- `apiKey`: Your GitHub token. If empty, `owo` will fall back to the `GITHUB_TOKEN` environment variable.
#### 5. Claude Code (type: "ClaudeCode")
Uses the locally installed Claude Code CLI. No API key needed -- Claude Code handles its own authentication.
```json
{
  "type": "ClaudeCode",
  "model": "sonnet"
}
```

- `model`: Optional. The model to use (e.g., `"sonnet"`, `"opus"`). If omitted, Claude Code uses its default.
- Requires: The `claude` CLI to be installed and authenticated.
#### 6. Custom / Local Models (type: "Custom")
This type is for any other OpenAI-compatible API endpoint, such as Ollama, LM Studio, or a third-party proxy service.
```json
{
  "type": "Custom",
  "model": "llama3",
  "baseURL": "http://localhost:11434/v1",
  "apiKey": "ollama"
}
```

- `model`: The name of the model you want to use (e.g., `"llama3"`).
- `baseURL`: The API endpoint for the service.
- `apiKey`: An API key, if required by the service. For local models like Ollama, this can often be a non-empty placeholder like `"ollama"`.
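For example, the Ollama configuration above can be written in one step from a POSIX shell. This sketch uses the Linux config path from the table above (adjust for your platform) and will overwrite any existing config:

```shell
# write a Custom-provider config pointing owo at a local Ollama server
mkdir -p "$HOME/.config/owo"
cat > "$HOME/.config/owo/config.json" <<'EOF'
{
  "type": "Custom",
  "model": "llama3",
  "baseURL": "http://localhost:11434/v1",
  "apiKey": "ollama"
}
EOF
```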
### Context Configuration (Optional)
owo can include recent command history from your shell to provide better context for command generation. This feature is disabled by default but can be enabled. When enabled, owo includes the raw last N lines from your shell history (e.g., bash, zsh, fish), preserving any extra metadata your shell records:
```json
{
  "type": "OpenAI",
  "apiKey": "sk-your_api_key",
  "model": "gpt-4.1",
  "context": {
    "enabled": true,
    "maxHistoryCommands": 10
  }
}
```

- `enabled`: Whether to include command history context (default: `false`)
- `maxHistoryCommands`: Number of recent commands to include (default: `10`)

When enabled, `owo` automatically detects and parses history from bash, zsh, and fish shells.
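Conceptually, the attached context resembles what this sketch collects; `last_history` is an illustrative helper, not an owo command, and owo locates the history file itself:

```shell
# grab the last N raw lines from a shell history file,
# similar to the context owo builds when history is enabled
last_history() {
  hist_file="$1"
  count="${2:-10}"
  tail -n "$count" "$hist_file"
}
```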
### Notes on history scanning performance
- Chunk size: When scanning shell history files, `owo` reads from the end of the file in fixed-size chunks of 64 KiB. This is not currently configurable, but could be made configurable if there is demand.
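For illustration only, here is a rough shell sketch of one such tail-end chunk read. owo's actual scanner is internal; `read_chunk_from_end` is a made-up helper that a scanner would invoke repeatedly with growing offsets until enough history lines were collected:

```shell
CHUNK=65536  # 64 KiB, matching the chunk size described above

# read one CHUNK-sized block ending `offset_from_end` bytes before EOF
read_chunk_from_end() {
  file="$1"
  offset_from_end="${2:-0}"
  size=$(wc -c < "$file")
  start=$(( size - offset_from_end - CHUNK ))
  # clamp to the start of the file for small files
  [ "$start" -lt 0 ] && start=0
  dd if="$file" bs=1 skip="$start" count="$CHUNK" 2>/dev/null
}
```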
### Windows notes
- History detection: On Windows, `owo` searches for PowerShell PSReadLine history at:
  - `%APPDATA%\Microsoft\Windows\PowerShell\PSReadLine\ConsoleHost_history.txt` (Windows PowerShell 5.x)
  - `%APPDATA%\Microsoft\PowerShell\PSReadLine\ConsoleHost_history.txt` (PowerShell 7+)

  If neither is found, it falls back to Unix-like history files that may exist when using Git Bash/MSYS/Cygwin (e.g., `.bash_history`, `.zsh_history`).
- Directory listing: On Windows, directory listing uses `dir /b`; on Linux/macOS it uses `ls`.
### Clipboard Integration
owo can automatically copy generated commands to your system clipboard:
```json
{
  "type": "OpenAI",
  "apiKey": "sk-your_api_key",
  "model": "gpt-4.1",
  "clipboard": true
}
```

- `clipboard`: Whether to automatically copy generated commands to clipboard (default: `false`)
When enabled, every command generated by owo is automatically copied to your system clipboard, making it easy to paste commands elsewhere. The clipboard integration works cross-platform:
- macOS: Uses `pbcopy`
- Windows: Uses `clip`
- Linux: Uses `xclip` or `xsel` (falls back to `xsel` if `xclip` is not available)
Note: On Linux, you'll need either `xclip` or `xsel` installed for clipboard functionality to work.
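The Linux fallback amounts to picking the first available tool from a preference list, roughly like this sketch (`pick_first_tool` is illustrative, not owo's actual code):

```shell
# pick the first available command from a candidate list; owo's Linux
# clipboard fallback would call this as: pick_first_tool xclip xsel
pick_first_tool() {
  for tool in "$@"; do
    command -v "$tool" >/dev/null 2>&1 && { printf '%s\n' "$tool"; return 0; }
  done
  return 1  # none of the candidates is installed
}
```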
## Shell Helper Function
This function lets you type `owo <description>` and get an editable command preloaded in your shell.
### zsh
```shell
# ~/.zshrc
owo() {
  local cmd
  cmd="$(owo-cli "$@")" || return
  vared -p "" -c cmd
  print -s -- "$cmd"  # add to history
  eval "$cmd"
}
```

After editing ~/.zshrc, reload it:
```shell
source ~/.zshrc
```

### bash
```shell
# ~/.bashrc
owo() {
  local cmd
  cmd="$(owo-cli "$@")" || return
  # requires interactive shell and Bash 4+
  read -e -i "$cmd" -p "" cmd || return
  builtin history -s -- "$cmd"
  eval -- "$cmd"
}
```

### PowerShell / Conhost / Windows Terminal
Note: This only applies to Windows with PowerShell installed.

Add this snippet to your PowerShell profile:
```powershell
function owo {
  param(
    [Parameter(ValueFromRemainingArguments=$true)]
    $args
  )
  $Source = '
  using System;
  using System.Runtime.InteropServices;
  public class ConsoleInjector {
    [StructLayout(LayoutKind.Sequential)]
    public struct KEY_EVENT_RECORD {
      public bool bKeyDown;
      public ushort wRepeatCount;
      public ushort wVirtualKeyCode;
      public ushort wVirtualScanCode;
      public char UnicodeChar;
      public uint dwControlKeyState;
    }
    [StructLayout(LayoutKind.Sequential)]
    public struct INPUT_RECORD {
      public ushort EventType;
      public KEY_EVENT_RECORD KeyEvent;
    }
    [DllImport("kernel32.dll", SetLastError = true)]
    public static extern IntPtr GetStdHandle(int nStdHandle);
    [DllImport("kernel32.dll", SetLastError = true)]
    public static extern bool WriteConsoleInput(
      IntPtr hConsoleInput,
      INPUT_RECORD[] lpBuffer,
      int nLength,
      out int lpNumberOfEventsWritten
    );
    const int STD_INPUT_HANDLE = -10;
    const ushort KEY_EVENT = 0x0001;
    public static void SendCommand(string text) {
      IntPtr hIn = GetStdHandle(STD_INPUT_HANDLE);
      var records = new INPUT_RECORD[text.Length];
      int i = 0;
      for (; i < text.Length; i++) {
        records[i].EventType = KEY_EVENT;
        records[i].KeyEvent.bKeyDown = true;
        records[i].KeyEvent.wRepeatCount = 1;
        records[i].KeyEvent.UnicodeChar = text[i];
      }
      int written;
      WriteConsoleInput(hIn, records, i, out written);
    }
  }';
  $cmd = owo-cli @args;
  Add-Type -TypeDefinition $Source;
  [ConsoleInjector]::SendCommand($cmd)
}
```

This will work for PowerShell terminals. To add this functionality to Conhost / Windows Terminal, save the following as `owo.bat` somewhere on your `PATH` (you must do the PowerShell step as well). For example:
```batch
:: assumes that ECHO ON and CHCP 437 is user preference
@ECHO OFF
CHCP 437 >NUL
POWERSHELL owo %*
@ECHO ON
```

## Usage
Once installed and configured:
```shell
owo generate a new ssh key called owo-key and add it to the ssh agent
```

You'll see the generated command in your shell's input line. Press Enter to run it, or edit it first. Executed commands will show up in your shell's history just like any other command.
## CLI Flags
All flags can override config values for a single invocation without editing config.json.
### Provider Overrides
| Flag | Short | Description |
|------|-------|-------------|
| `--provider <type>` | `-p` | Override provider (`openai`, `claude`, `gemini`, `github`, `claudecode`, `custom`) |
| `--model <name>` | `-m` | Override model |
| `--api-key <key>` | `-k` | Override API key (highest precedence: flag > config > env var) |
| `--base-url <url>` | | Override base URL for custom/OpenAI-compatible providers |
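The key-precedence rule in the table can be sketched as a shell fallback chain; `FLAG_KEY` and `CONFIG_KEY` below are illustrative stand-ins for owo's internal values, not real variables it reads:

```shell
# resolve the API key using the documented precedence: flag > config > env var
resolve_api_key() {
  printf '%s\n' "${FLAG_KEY:-${CONFIG_KEY:-${OPENAI_API_KEY:-}}}"
}
```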
```shell
# Use Claude for a single query without changing config
owo -p claude -m claude-sonnet-4-20250514 find large log files

# Use a local Ollama model
owo -p custom --base-url http://localhost:11434/v1 -m llama3 show disk usage
```

### Behavior
| Flag | Short | Description |
|------|-------|-------------|
| `--copy` | `-c` | Copy generated command to clipboard |
| `--no-copy` | | Don't copy to clipboard |
| `--history` | | Include shell history context |
| `--no-history` | | Exclude shell history context |
| `--history-count <n>` | | Number of history commands to include (implies `--history`) |
```shell
# Copy command even if config has clipboard disabled
owo --copy find all zombie processes

# Use history context for a single query
owo --history redo that but with sudo
```

### Output
| Flag | Short | Description |
|------|-------|-------------|
| `--exec` | `-x` | Execute the generated command after an `Execute? [y/N]` confirmation |
| `--explain` | `-e` | Show a brief explanation of the command on stderr |
| `--raw` | `-r` | Suppress all non-command output (clean for piping) |
```shell
# Execute with confirmation
owo -x delete all .tmp files older than 7 days

# Get an explanation alongside the command
owo -e find all files larger than 100mb

# Pipe-safe output
result=$(owo -r show my public IP)
```

### Debugging
| Flag | Short | Description |
|------|-------|-------------|
| `--dry-run` | `-n` | Print the prompt that would be sent without making an API call |
| `--verbose` | `-V` | Print diagnostics (provider, model, latency) to stderr |
| `--retry <n>` | | Override retry count (default: 2) |
```shell
# See exactly what would be sent to the API
owo --dry-run find large files

# Debug with full diagnostics
owo -V convert all heic files to jpg

# Fail fast with no retries
owo --retry 0 list disk usage
```

## Stdin / Pipe Support
owo auto-detects piped input and reads the description from stdin:
```shell
echo "find files larger than 100mb" | owo
```

## The `--` Separator
Use -- to separate flags from the description when your description looks like a flag:
```shell
owo -- -rf delete these files
```

## License
## Contributing
Contributions are welcome! Please feel free to submit a pull request.
