hey-comma
v1.3.0
Run shell commands using natural language
About
Use natural language to turn tasks into shell commands.
Just say what you want to do and hey, will generate the command for you.
Features
- use natural language to run shell commands
- explain files, scripts, and piped input with your configured model
- caches successful commands to speed up future runs
Why?
Shell scripts are powerful, but only if you remember the exact commands. hey, is for the moments when you know what you want, but not the syntax.
Always forget the command to pack a directory into a tarball? Just say it:
```sh
hey, create a tarball with all files in the current directory, except javascript files
```
Install
Binary install
```sh
curl -fsSL https://raw.githubusercontent.com/TimoBechtel/hey-comma/main/install.sh | bash
```
Supported binaries:
- macOS arm64
- macOS x64
- Linux x64
- Windows x64
npm install
Requires Node.js 22 or later when installed via npm.
```sh
npm i -g hey-comma
```
> [!NOTE]
> pnpm does not like the comma, so only the `hey` alias is available. You can add it manually: `alias hey,=hey`
Setup
AI provider setup
hey, works with OpenAI, Anthropic, Google, and OpenRouter. Once you have an API key for your provider, run:
```sh
hey, setup
```
and follow the prompts. This creates `~/.hey-comma/config.toml`.
If you prefer environment variables, set the key and point config to it:
```sh
export OPENROUTER_API_KEY=...
hey, config set openrouter_api_key "env:OPENROUTER_API_KEY"
```
Usage
hey, currently has two modes: run and explain. Most of the time you don't need to specify the mode, as hey, detects it automatically based on whether you pipe data to it.
hey, run
hey, run is the default mode. It converts your instruction to a shell command and runs it, always asking for confirmation before executing the command.
```sh
hey, create a tarball with all files in the current dir, except js files
```
You can explicitly specify the mode:
```sh
hey, run: initialize a next.js project in ./my-app
```
(colon is optional)
hey, explain
hey, explain will explain the data you pipe to it.
> [!IMPORTANT]
> Piped data is sent to your configured provider. Do not pipe secrets you would not send to that service.
```sh
cat mysterious.sh | hey, is this safe to run
```
You can explicitly specify the mode:
```sh
cat script.sh | hey, explain: what does this do
```
(colon is optional)
Special characters
To pass special characters to hey,, wrap your instruction in quotes:
```sh
hey, "what is the most recent file in ~/Documents?"
```
Configuration
You can configure hey, using the `hey, config` command or by editing `config.toml` directly. To get the path to the config file, run:
```sh
hey, config path
```
For example, `~/.hey-comma/config.toml`.
Available options:
- `default_provider`: default provider (`openai`, `anthropic`, `google`, `openrouter`)
- `default_model`: default model name for your default provider
- `model_aliases`: alias map for model selectors (e.g. `smart = "anthropic/claude-sonnet-4-5"`)
- `openai_api_key`: OpenAI API key (or `env:OPENAI_API_KEY`)
- `anthropic_api_key`: Anthropic API key (or `env:ANTHROPIC_API_KEY`)
- `google_api_key`: Google API key (or `env:GOOGLE_API_KEY`)
- `openrouter_api_key`: OpenRouter API key (or `env:OPENROUTER_API_KEY`)
- `openrouter_base_url`: OpenRouter base URL (default: `https://openrouter.ai/api/v1`)
- `disable_thinking`: disable provider reasoning/thinking modes where supported (default: `false`)
- `temperature`: the temperature to use when generating commands (default: `0.2`)
- `max_tokens`: the maximum number of tokens to generate (default: `1200`)
- `run_prompt`: the prompt to use when generating commands (see Custom prompts)
- `explain_prompt`: the prompt to use when explaining data (see Custom prompts)
- `cache.max_entries`: the maximum number of entries to cache (default: `50`)
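Putting a few of these options together, a `config.toml` might look like the following (provider choice, model names, and values are illustrative, not recommendations):

```toml
# ~/.hey-comma/config.toml
default_provider = "openrouter"
default_model = "openai/gpt-4o-mini"
temperature = 0.2
max_tokens = 1200

# read the key from the environment instead of storing it in the file
openrouter_api_key = "env:OPENROUTER_API_KEY"

# keep up to 50 successful commands in the cache
cache.max_entries = 50

[model_aliases]
smart = "anthropic/claude-sonnet-4-5"
```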
Model selector
Use --model when you want to override your defaults:
```sh
hey, run --model anthropic/claude-sonnet-4-5 "create a tarball from this folder"
hey, explain --model openrouter/openai/gpt-4o "what does this script do?"
```
Accepted forms:
- full selector: `<provider>/<model>`
- alias from `model_aliases`
- bare model name (resolved with `default_provider`)
Example aliases:
```toml
[model_aliases]
fast = "openai/gpt-4o-mini"
smart = "anthropic/claude-sonnet-4-5"
cheap = "google/gemini-2.5-flash"
```
Custom prompts
You can customize the prompts used by hey, by setting the `run_prompt` and `explain_prompt` options. See prompts.ts for the default prompts.
> [!IMPORTANT]
> Make sure to add the placeholders (e.g. `%INSTRUCTION%`) to your custom prompts.
The following placeholders are available:
- `%INSTRUCTION%`: the instruction that is passed to `hey, run` or `hey, explain`
- `%SHELL%`: the current shell (e.g. `bash` or `zsh`; only available for `hey, run`)
- `%INPUT%`: the data that is piped to `hey, explain` (only available for `hey, explain`)
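For example, a custom `run_prompt` could be set in `config.toml` like this (the prompt wording is a sketch, not the shipped default from prompts.ts):

```toml
run_prompt = """
You are an expert in the %SHELL% shell.
Convert the following instruction into a single %SHELL% command
and output only the command, nothing else.

Instruction: %INSTRUCTION%
"""
```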
Data sent to providers
hey, sends this data to your configured provider:
- The command you want to run
- The data you pipe to `hey, explain`
- Your current shell (e.g. `bash` or `zsh`)
Migration to v2
v2 is intentionally breaking. There is no compatibility layer for old flags or old config keys.
Removed:
- `--gpt4`
- `openai_model`
- OpenAI-only runtime behavior
Use this instead:
- `--model <provider/model>` (or alias / bare model with default provider)
- `default_provider`, `default_model`, and `[model_aliases]` in config
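As an illustration, a v1 config that pinned an OpenAI model via the removed `openai_model` key would translate to the new keys like this (the model name here is hypothetical, and the exact v1 syntax is assumed):

```toml
# v1 (no longer read)
# openai_model = "gpt-4o-mini"

# v2 equivalent
default_provider = "openai"
default_model = "gpt-4o-mini"
```

On the command line, replace the removed `--gpt4` flag with an explicit selector such as `--model openai/gpt-4o`, or an alias from `[model_aliases]`.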
More usage examples
```sh
hey, what are the largest files in my download directory
cat salaries.csv | hey, what is the average salary of people with a PhD
cat script.sh | hey, explain
```
Contributing
Development
This project uses bun as its package manager and compiler.
If you don't have bun installed, run:
```sh
curl -fsSL https://bun.sh/install | bash
```
Install dependencies:
```sh
bun install
```
Build:
```sh
bun run build
```
To build all release binaries:
```sh
bun run build:all
```
Commit messages
This project uses semantic-release for automated release versioning, so commits in this project follow the Conventional Commits guidelines. I recommend using commitizen for automated commit messages.
Release publishing
Releases from main use semantic-release with npm Trusted Publishing (OIDC).
Before publishing, configure TimoBechtel/hey-comma as a trusted publisher in npm package settings.
