Didim Agent CLI

Didim Agent CLI is an open-source AI agent that brings the power of multiple AI providers directly into your terminal. Built on the Gemini CLI foundation, it supports Gemini, Claude, OpenAI, and OpenAI-compatible (vLLM, Ollama, LM Studio) endpoints through a unified provider adapter architecture, giving you the most direct path from your prompt to your preferred model.
Learn all about Didim Agent CLI in our documentation.
🚀 Why Didim Agent CLI?
- 🧠 Multi-provider support: Use Gemini, Claude, OpenAI, or local models (vLLM/Ollama) — switch providers and models with /model or /auth login (see the sketch after this list).
- 🔧 Built-in tools: Google Search grounding, file operations, shell commands, web fetching — all tools work across providers.
- 🔌 Extensible: MCP (Model Context Protocol) support with deterministic tool naming and sLM-compatible parameter normalization.
- 🤖 Sub-agent support: Sub-agents work with all providers via the provider-independent llm* pipeline.
- 💻 Terminal-first: Designed for developers who live in the command line.
- 🛡️ Open source: Apache 2.0 licensed.
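A minimal interactive sketch of switching providers at runtime (the exact /model argument syntax is an assumption; the model IDs are the ones used later in this README):

didim
> /auth login                          # pick a provider and enter an API key
> /model claude-sonnet-4-5-20250929    # switch to a Claude model
> /model gpt-4.1                       # switch to an OpenAI model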
📦 Installation
Prerequisites
- Node.js version 20 or higher
- macOS, Linux, or Windows
Quick Install
Run instantly with npx
# Using npx (no installation required)
npx @didim365/agent-cli

Install globally with npm

npm install -g @didim365/agent-cli

Install globally with Homebrew (macOS/Linux)

brew install gemini-cli

Install globally with MacPorts (macOS)

sudo port install gemini-cli

Install with Anaconda (for restricted environments)
# Create and activate a new environment
conda create -y -n gemini_env -c conda-forge nodejs
conda activate gemini_env
# Install Gemini CLI globally via npm (inside the environment)
npm install -g @didim365/agent-cli

Release Cadence and Tags
See Releases for more details.
Preview
New preview releases will be published each week at 23:59 UTC on Tuesdays. These
releases will not have been fully vetted and may contain regressions or other
outstanding issues. Please help us test by installing with the preview tag.
npm install -g @didim365/agent-cli@preview

Stable

- New stable releases will be published each week at 20:00 UTC on Tuesdays. This is the full promotion of the previous week's preview release plus any bug fixes and validations. Use the latest tag.
npm install -g @didim365/agent-cli@latest

Nightly

- New releases will be published each day at 00:00 UTC. These include all changes from the main branch as of the time of release. It should be assumed there are pending validations and issues. Use the nightly tag.

npm install -g @didim365/agent-cli@nightly

📋 Key Features
Code Understanding & Generation
- Query and edit large codebases
- Generate new apps from PDFs, images, or sketches using multimodal capabilities
- Debug issues and troubleshoot with natural language
Automation & Integration
- Automate operational tasks like querying pull requests or handling complex rebases
- Use MCP servers to connect new capabilities, including media generation with Imagen, Veo or Lyria
- Run non-interactively in scripts for workflow automation
Advanced Capabilities
- Ground your queries with built-in Google Search for real-time information
- Conversation checkpointing to save and resume complex sessions
- Custom context files (AGENTS.md) to tailor behavior for your projects (see the sketch below)
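Context files are plain Markdown read from your project. A minimal, hypothetical sketch (the file contents are illustrative, not a required format):

cat > AGENTS.md <<'EOF'
# Project context for the agent
- This repo is a TypeScript monorepo; run tests with `npm test`.
- Prefer small, focused diffs and conventional commit messages.
EOF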
GitHub Integration
Integrate Gemini CLI directly into your GitHub workflows with Gemini CLI GitHub Action:
- Pull Request Reviews: Automated code review with contextual feedback and suggestions
- Issue Triage: Automated labeling and prioritization of GitHub issues based on content analysis
- On-demand Assistance: Mention @gemini-cli in issues and pull requests for help with debugging, explanations, or task delegation.
- Custom Workflows: Build automated, scheduled, and on-demand workflows tailored to your team's needs.
🔐 Authentication Options
Choose the authentication method that best fits your needs. You can also use
/auth login inside the CLI to interactively select a provider and enter your
API key.
Note: Both DIDIM_* and GEMINI_* environment variable prefixes are supported. The CLI uses a central resolveEnv() utility that checks DIDIM_* first, then falls back to GEMINI_* for backward compatibility.
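For example, when both prefixes are set, the DIDIM_* value should take effect under the precedence above (a sketch; the DIDIM_API_KEY / GEMINI_API_KEY pairing is inferred from this note, not from the provider docs):

# resolveEnv() checks DIDIM_* first, then falls back to GEMINI_*
export GEMINI_API_KEY="legacy-key"
export DIDIM_API_KEY="new-key"   # this value wins
didim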
Option 1: Login with Google (Gemini)
✨ Best for: Individual developers and Gemini Code Assist license holders.
didim
# Select "Login with Google" and follow the browser authentication flow

For organization accounts, set your Google Cloud project first:
export GOOGLE_CLOUD_PROJECT="YOUR_PROJECT_ID"
didim

Option 2: Gemini API Key
✨ Best for: Developers who need specific Gemini model control.
export GEMINI_API_KEY="YOUR_API_KEY"
didim

Option 3: Claude (Anthropic)
✨ Best for: Developers who prefer Claude models (Opus, Sonnet, Haiku).
export ANTHROPIC_API_KEY="YOUR_API_KEY"
didim

Option 4: OpenAI
✨ Best for: Developers who prefer OpenAI models (GPT-4.1, o3, o4-mini).
export OPENAI_API_KEY="YOUR_API_KEY"
didim

Option 5: Vertex AI
✨ Best for: Enterprise teams and production workloads.
export GOOGLE_API_KEY="YOUR_API_KEY"
export GOOGLE_GENAI_USE_VERTEXAI=true
didim

Option 6: OpenAI-compatible (vLLM, Ollama, LM Studio)
✨ Best for: Local/self-hosted models and privacy-sensitive environments.
export ENABLE_MULTI_PROVIDER=true
export LLM_PROVIDER=openai-compatible
export LLM_BASE_URL="http://localhost:8000/v1"
didim

For detailed setup for each provider, see the authentication guide and provider guide.
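Ollama and LM Studio speak the same OpenAI-compatible API, so only the base URL and model name change. A sketch for a local Ollama server (11434 is Ollama's default port; the model name is an example):

export ENABLE_MULTI_PROVIDER=true
export LLM_PROVIDER=openai-compatible
export LLM_BASE_URL="http://localhost:11434/v1"
didim -m llama3.1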
🚀 Getting Started
Basic Usage
Start in current directory
didim

Include multiple directories

didim --include-directories ../lib,../docs

Use specific model
didim -m gemini-2.5-flash # Gemini
didim -m claude-sonnet-4-5-20250929 # Claude
didim -m gpt-4.1 # OpenAI

Non-interactive mode for scripts
Get a simple text response:
didim -p "Explain the architecture of this codebase"For more advanced scripting, including how to parse JSON and handle errors, use
the --output-format json flag to get structured output:
didim -p "Explain the architecture of this codebase" --output-format jsonFor real-time event streaming (useful for monitoring long-running operations),
For real-time event streaming (useful for monitoring long-running operations), use --output-format stream-json to get newline-delimited JSON events:
didim -p "Run tests and deploy" --output-format stream-jsonQuick Examples
Start a new project
cd new-project/
didim
> Write me a Discord bot that answers questions using a FAQ.md file I will provide

Analyze existing code
git clone https://github.com/user/project
cd project
didim
> Give me a summary of all of the changes that went in yesterday

📚 Documentation
Getting Started
- Quickstart Guide - Get up and running quickly.
- Authentication Setup - Detailed auth configuration.
- Configuration Guide - Settings and customization.
- Keyboard Shortcuts - Productivity tips.
Core Features
- Commands Reference - All slash commands (/help, /chat, etc.).
- Custom Commands - Create your own reusable commands.
- Context Files (AGENTS.md) - Provide persistent context to the CLI.
- Checkpointing - Save and resume conversations.
- Token Caching - Optimize token usage.
Tools & Extensions
- Built-in Tools Overview
- MCP Server Integration - Extend with custom tools.
- Custom Extensions - Build and share your own commands.
Advanced Topics
- Headless Mode (Scripting) - Use Gemini CLI in automated workflows.
- Provider Guide - Multi-provider runtime usage (gemini, claude, openai, openai-compatible including vLLM).
- Multi-Provider Configuration - Environment variables and precedence for provider/model resolution.
- Migration Guide - Move from Gemini-only to provider-independent (llm*) call paths.
- Provider Adapter API - Adapter contract and streaming/event types overview.
- Architecture Overview - How Gemini CLI works.
- IDE Integration - VS Code companion.
- Sandboxing & Security - Safe execution environments.
- Trusted Folders - Control execution policies by folder.
- Enterprise Guide - Deploy and manage in a corporate environment.
- Telemetry & Monitoring - Usage tracking.
- Tools API Development - Create custom tools.
- Local development - Local development tooling.
vLLM Quick Start
export ENABLE_MULTI_PROVIDER=true
export LLM_PROVIDER=openai-compatible
export LLM_BASE_URL="http://localhost:8000/v1"
export LLM_MODEL="Qwen/Qwen2.5-7B-Instruct"
didim -m Qwen/Qwen2.5-7B-Instruct

Troubleshooting & Support
- Troubleshooting Guide - Common issues and solutions.
- FAQ - Frequently asked questions.
- Use the /bug command to report issues directly from the CLI.
Using MCP Servers
Configure MCP servers in ~/.didim/settings.json (or ~/.gemini/settings.json
for backward compatibility) to extend the CLI with custom tools:
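A minimal sketch of such a configuration (the server name, command, and token variable are hypothetical, following the common mcpServers settings shape):

{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "YOUR_TOKEN" }
    }
  }
}

Once a server is configured, address it directly from the prompt: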
> @github List my open pull requests
> @slack Send a summary of today's commits to #dev channel
> @database Run a query to find inactive users

MCP tool naming is deterministic — tools are registered with consistent names regardless of server discovery order. Tool parameters are automatically normalized via schema-based coercion, with enhanced tolerance for sLM (small Language Model) tool call formatting.
See the MCP Server Integration guide for setup instructions.
🤝 Contributing
We welcome contributions! Didim Agent CLI is fully open source (Apache 2.0), and we encourage the community to:
- Report bugs and suggest features.
- Improve documentation.
- Submit code improvements.
- Share your MCP servers and extensions.
See our Contributing Guide for development setup, coding standards, and how to submit pull requests.
Check our Official Roadmap for planned features and priorities.
📖 Resources
- Official Roadmap - See what's coming next.
- Changelog - See recent notable updates.
- NPM Package - Package registry.
- GitHub Issues - Report bugs or request features.
- Security Advisories - Security updates.
Uninstall
See the Uninstall Guide for removal instructions.
📄 Legal
- License: Apache License 2.0
- Terms of Service: Terms & Privacy
- Security: Security Policy
