# ZynOps-CLI

Claude Code opened to any LLM — OpenAI, Gemini, DeepSeek, Ollama, and 200+ models.
ZynOps-CLI is an open-source coding-agent CLI for cloud and local model providers.
Use OpenAI-compatible APIs, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported backends while keeping one terminal-first workflow: prompts, tools, agents, MCP, slash commands, and streaming output.
Quick Start | Setup Guides | Providers | Source Build | VS Code Extension | Community
## Why ZynOps
- Use one CLI across cloud APIs and local model backends
- Save provider profiles inside the app with `/provider`
- Run with OpenAI-compatible services, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported providers
- Keep coding-agent workflows in one place: bash, file tools, grep, glob, agents, tasks, MCP, and web tools
- Use the bundled VS Code extension for launch integration and theme support
## ⚡ Quick Start

### Install

```bash
npm install -g @zynopssolutions/zynops-cli
```

If the install later reports `ripgrep not found`, install ripgrep system-wide and confirm `rg --version` works in the same terminal before starting ZynOps.
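A quick way to check whether ripgrep is already on your `PATH` before launching. The package names in the comment are the usual ones for each platform, but confirm against your package manager:

```shell
# Check for ripgrep; print a hint if it is missing.
if rg --version >/dev/null 2>&1; then
  echo "ripgrep OK"
else
  echo "ripgrep missing - install it (e.g. 'brew install ripgrep' or 'sudo apt-get install ripgrep')"
fi
```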
### Launch

```bash
zynops
```

Inside ZynOps, configure a backend with `/provider`, or set the environment variables below before launching.
## 🚀 Provider Setup

### OpenAI

macOS/Linux:

```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_MODEL=gpt-4o
zynops
```

Windows (PowerShell):

```powershell
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"
zynops
```

### Local Ollama

macOS/Linux:

```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=qwen2.5-coder:7b
zynops
```

Windows (PowerShell):

```powershell
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:11434/v1"
$env:OPENAI_MODEL="qwen2.5-coder:7b"
zynops
```

## 📡 Supported Providers
| Provider | Setup | Notes |
|---|---|---|
| OpenAI-compatible | /provider or env vars | OpenAI, OpenRouter, DeepSeek, Groq, Mistral, LM Studio, and any /v1-compatible server |
| Gemini | /provider or env vars | API key, access token, or local ADC workflow |
| GitHub Models | /onboard-github | Interactive onboarding with saved credentials |
| Codex | /provider | Uses existing Codex credentials when available |
| Ollama | /provider or env vars | Local inference — no API key required |
| Atomic Chat | Advanced setup | Local Apple Silicon backend |
| Bedrock / Vertex / Foundry | Env vars | Additional integrations for supported environments |
**Provider note:** Tool quality and behavior vary across providers. Models with strong function-calling support produce the best results; smaller local models may struggle with long multi-step tool flows.
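For the local Ollama path, it can help to sanity-check that the OpenAI-compatible endpoint is reachable before pointing ZynOps at it. This is a hypothetical helper, not part of ZynOps; it assumes Ollama's default port 11434 and the standard `/v1/models` route:

```python
# Probe an OpenAI-compatible /models endpoint and list available model IDs.
# Hypothetical helper for pre-flight checks; not part of ZynOps itself.
import json
import urllib.request
import urllib.error

def list_models(base_url="http://localhost:11434/v1"):
    """Return model IDs from an OpenAI-compatible /models endpoint, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=3) as resp:
            data = json.load(resp)
        return [m["id"] for m in data.get("data", [])]
    except (urllib.error.URLError, OSError):
        return None  # server not running or unreachable

models = list_models()
print(models if models is not None else "endpoint unreachable")
```

If this prints `endpoint unreachable`, start Ollama (and pull your model) before launching ZynOps.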
## 🧠 Agent Routing
Route different agents to different models — useful for cost optimization or splitting work by model strength.
Add the following to `~/.claude/settings.json`:

```json
{
  "agentModels": {
    "deepseek-chat": {
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "sk-your-key"
    },
    "gpt-4o": {
      "base_url": "https://api.openai.com/v1",
      "api_key": "sk-your-key"
    }
  },
  "agentRouting": {
    "Explore": "deepseek-chat",
    "Plan": "gpt-4o",
    "general-purpose": "gpt-4o",
    "frontend-dev": "deepseek-chat",
    "default": "gpt-4o"
  }
}
```

When no routing match is found, the global provider is used as the fallback.
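The lookup order implied above (exact agent name, then the `"default"` entry, then the global provider) can be sketched as a small function. This is illustrative only, not ZynOps internals:

```python
# Illustrative sketch of the routing fallback order described above;
# not the actual ZynOps implementation.
def resolve_model(agent_name, agent_routing, global_provider):
    """Pick a model: exact agent match, then "default", then the global provider."""
    if agent_name in agent_routing:
        return agent_routing[agent_name]
    return agent_routing.get("default", global_provider)

routing = {"Explore": "deepseek-chat", "Plan": "gpt-4o", "default": "gpt-4o"}
print(resolve_model("Explore", routing, "global-model"))   # exact match
print(resolve_model("reviewer", routing, "global-model"))  # falls back to "default"
print(resolve_model("reviewer", {}, "global-model"))       # falls back to the global provider
```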
⚠️ **Security note:** `api_key` values in `settings.json` are stored in plaintext. Keep this file private and never commit it to version control.
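One common mitigation is to make the file readable only by your own user, e.g. `chmod 600 ~/.claude/settings.json`. The snippet below demonstrates this on a temporary file so it is safe to run anywhere:

```shell
# Tighten permissions so only the owning user can read/write the file.
# Run this against ~/.claude/settings.json for real; a temp file is used here.
f=$(mktemp)
chmod 600 "$f"
ls -l "$f"   # the mode column should start with -rw-------
rm -f "$f"
```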
## 🌐 Web Search and Fetch
`WebSearch` works on all non-Anthropic models via DuckDuckGo by default — no configuration required. For Anthropic-native backends and Codex, ZynOps preserves the provider's native web search behavior.
Note: The DuckDuckGo fallback scrapes search results and may be rate-limited or blocked. For a more reliable option, configure Firecrawl.
`WebFetch` is supported but may fail on JavaScript-rendered pages or sites that block plain HTTP requests. Firecrawl resolves this.
### Optional: Firecrawl Integration

```bash
export FIRECRAWL_API_KEY=your-key-here
```

With Firecrawl enabled:

- `WebSearch` can use Firecrawl's search API (DuckDuckGo remains the default free path)
- `WebFetch` uses Firecrawl's scrape endpoint — handles JS-rendered pages correctly

The free tier at firecrawl.dev includes 500 credits. The key is optional.
## 🏗️ Source Build & Local Development

```bash
bun install
bun run build
node dist/cli.mjs
```

### Useful Commands
| Command | Purpose |
|---|---|
| bun run dev | Development mode |
| bun run smoke | Smoke tests |
| bun run doctor:runtime | Runtime diagnostics |
| bun run verify:privacy | Privacy verification |
| bun test ... | Focused test runs for touched areas |
### Repository Structure

```
zynops/
├── src/                            # Core CLI and runtime
├── scripts/                        # Build, verification, and maintenance scripts
├── docs/                           # Setup, contributor, and project documentation
├── python/                         # Standalone Python helpers and tests
├── vscode-extension/zynops-vscode/ # VS Code extension
├── .github/                        # Repo automation, templates, and CI
└── bin/                            # CLI launcher entrypoints
```

## 🧩 VS Code Extension
The repo includes a VS Code extension in `vscode-extension/zynops-vscode` that provides:
- Launch integration — start ZynOps directly from VS Code
- Provider-aware control center — manage providers without leaving the editor
- Theme support — visual integration with your editor theme
## 📚 Setup Guides
Beginner-friendly:
Advanced:
## 🔒 Security
If you discover a security issue, please review SECURITY.md before disclosing.
## 💬 Community
- GitHub Discussions — Q&A, ideas, and community conversation
- GitHub Issues — confirmed bugs and actionable feature requests
## 🤝 Contributing
Contributions are welcome. For larger changes, open an issue first so scope is clear before implementation begins.
Before submitting, validate your changes with:
```bash
bun run build
bun run smoke
bun test ...   # focused runs for areas you touched
```

## ⚖️ Disclaimer
ZynOps is an independent community project. It is not affiliated with, endorsed by, or sponsored by Anthropic.
ZynOps originated from the Claude Code codebase and has since been substantially modified to support multiple providers and open use. "Claude" and "Claude Code" are trademarks of Anthropic PBC. See LICENSE for details.
## 📄 License
Released under the MIT License.
Made with ☕ by the ZynOps community
