omni-agent-cli
v2.0.0
Universal AI CLI Agent — any provider, any model, full filesystem power
◈ OmniAgent — Universal AI CLI Agent
A powerful, beautiful CLI agent that works with any AI provider and has full filesystem + shell capabilities.
```
██████╗ ███╗ ███╗███╗ ██╗██╗ ███████╗ ██████╗ ███████╗███╗ ██╗████████╗
██╔═══██╗████╗ ████║████╗ ██║██║ ██╔════╝██╔════╝ ██╔════╝████╗ ██║╚══██╔══╝
██║ ██║██╔████╔██║██╔██╗ ██║██║ ███████╗██║ ███╗█████╗ ██╔██╗ ██║ ██║
██║ ██║██║╚██╔╝██║██║╚██╗██║██║ ╚════██║██║ ██║██╔══╝ ██║╚██╗██║ ██║
╚██████╔╝██║ ╚═╝ ██║██║ ╚████║██║ ███████║╚██████╔╝███████╗██║ ╚████║ ██║
╚═════╝ ╚═╝ ╚═╝╚═╝ ╚═══╝╚═╝ ╚══════╝ ╚═════╝ ╚══════╝╚═╝ ╚═══╝ ╚═╝
```

🚀 Quick Start
```bash
# Install dependencies
npm install

# First run — guided setup wizard
node index.js

# Or set up separately
node index.js --setup
```

🔌 Supported Providers
| Provider | Format | Notes |
|----------|--------|-------|
| Anthropic (Claude) | Native | claude-opus, sonnet, haiku |
| OpenAI | OpenAI | gpt-4o, o1, o3-mini |
| Groq | OpenAI | Ultra-fast inference |
| xAI (Grok) | OpenAI | grok-2, grok-beta |
| DeepSeek | OpenAI | deepseek-chat, reasoner |
| Mistral AI | OpenAI | mistral-large |
| Together AI | OpenAI | Llama, Qwen, etc. |
| OpenRouter | OpenAI | All models via one key |
| io.net | OpenAI | Decentralized inference |
| Ollama | OpenAI | Local models |
| Custom | OpenAI | Any OpenAI-compatible API (kiai.ai, etc.) |
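For example, since Ollama speaks the OpenAI-compatible format, a config for a local model could look like the sketch below. The field names mirror the Config File section; the model name here is a placeholder:

```json
{
  "providerKey": "ollama",
  "baseURL": "http://localhost:11434/v1",
  "apiKey": "ollama",
  "model": "llama3.2",
  "format": "openai",
  "maxTokens": 8192,
  "temperature": 0.3
}
```

Local servers typically ignore the API key, but OpenAI-format clients usually require a non-empty value.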
💻 Usage
```bash
# Start in current directory
node index.js

# Start in specific directory
node index.js /path/to/project

# Global install (optional)
npm link
omni-agent /path/to/project
```

⌨️ Commands
| Command | Description |
|---------|-------------|
| /help | Show all commands |
| /clear | Clear screen + conversation history |
| /stats | Show current session statistics |
| /cd <dir> | Change working directory |
| /model <name> | Switch model on the fly |
| /setup | Reconfigure provider/API key/model |
| /reset | Start new conversation (clear history) |
| /history | Show conversation history |
| /exit | Exit and show session statistics |
🛠️ Agent Tools
The agent has access to 13 filesystem and shell tools:
| Tool | Description |
|------|-------------|
| list_directory | List files with metadata |
| read_file | Read file content |
| write_file | Create/overwrite files |
| append_to_file | Append to files |
| patch_file | Surgically edit text in files |
| delete_path | Delete files/directories |
| copy_path | Copy files |
| move_path | Move/rename files |
| create_directory | Create directories recursively |
| get_file_info | Get file metadata |
| find_files | Find files by glob pattern |
| search_in_files | Grep search in files |
| execute_command | Run any shell command |
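OmniAgent's actual tool schemas are internal, but to illustrate, a tool such as read_file declared in the OpenAI function-calling format would look roughly like this sketch (parameter names are assumptions):

```javascript
// Hypothetical declaration of the read_file tool in OpenAI
// function-calling format. The name and description mirror the
// table above; the parameter schema is illustrative only.
const readFileTool = {
  type: "function",
  function: {
    name: "read_file",
    description: "Read file content",
    parameters: {
      type: "object",
      properties: {
        path: {
          type: "string",
          description: "File path relative to the working directory",
        },
      },
      required: ["path"],
    },
  },
};
```

An array of such declarations is passed to the provider on every request so the model knows which tools it may call.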
⚡ Token Efficiency
OmniAgent uses parallel tool calling — when the AI needs multiple independent pieces of information, it fetches them ALL in a single API call instead of making sequential requests.
Example: "List the src/ folder and read package.json"
- ❌ Naive: 3 API calls (list → read → answer)
- ✅ OmniAgent: 2 API calls (list+read simultaneously → answer)
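The dispatch side of this pattern can be sketched as follows: when one assistant message carries several tool_calls (OpenAI format), run them concurrently rather than one at a time. `runTool` here is a hypothetical dispatcher, not OmniAgent's actual function name:

```javascript
// Execute every tool call from one assistant message concurrently
// and return the tool-result messages to append to the conversation.
async function runToolCalls(toolCalls, runTool) {
  return Promise.all(
    toolCalls.map(async (call) => ({
      role: "tool",
      tool_call_id: call.id,
      // Arguments arrive as a JSON string in the OpenAI format.
      content: JSON.stringify(
        await runTool(call.function.name, JSON.parse(call.function.arguments))
      ),
    }))
  );
}
```

Because the calls are independent, `Promise.all` lets slow operations (e.g. a shell command and a file read) overlap instead of serializing.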
📊 Session Statistics
On exit (or /stats), OmniAgent shows:
- Session duration
- Messages sent/received
- API requests made
- Input/output token counts
- Estimated cost
- Tool usage breakdown with visual bars
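The cost estimate is a function of the token counts above and the model's per-token prices. As a sketch (prices vary by provider and model; the numbers in the comment are placeholders, not OmniAgent's actual price table):

```javascript
// Estimated cost in dollars from token counts, given prices quoted
// per million tokens (the common convention for hosted models).
function estimateCost(inputTokens, outputTokens, pricePerMIn, pricePerMOut) {
  return (inputTokens / 1e6) * pricePerMIn + (outputTokens / 1e6) * pricePerMOut;
}

// e.g. 120k input + 30k output at $0.50 / $1.50 per million ≈ $0.105
```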
⚙️ Config File
Config is saved to ~/.omni-agent.json:
```json
{
  "providerKey": "groq",
  "baseURL": "https://api.groq.com/openai/v1",
  "apiKey": "your-key-here",
  "model": "llama-3.3-70b-versatile",
  "format": "openai",
  "maxTokens": 8192,
  "temperature": 0.3
}
```

💡 Example Prompts
⟩ Refactor all the .js files in src/ to use async/await instead of callbacks
⟩ Find all TODO comments in this project and create a TODO.md file
⟩ Run the tests and fix any failures
⟩ Create a REST API with Express for a simple todo app
⟩ Analyze this codebase and suggest improvements
⟩ Set up a Python virtual environment and install requirements.txt

Built with Node.js · chalk · gradient-string · figlet · ora · inquirer · boxen
