eckra v1.0.5
AI-powered Git management CLI with multi-provider support (OpenAI, Anthropic, OpenRouter, Gemini, Ollama, LM Studio)
Eckra
💡 Overview
eckra is an interactive Git management tool designed for developers who value both speed and clarity. It integrates with multiple AI providers to analyze your staged changes and suggest context-aware commit messages, ensuring your project history remains professional and descriptive without the manual overhead.
🚀 Key Features
- 🤖 AI-Powered Suggestions: Automatically generates commit messages based on actual code diffs. Supports LM Studio, OpenAI, Anthropic, Ollama, OpenRouter, and Google Gemini.
- 📝 Select & Edit: Pick an AI suggestion and refine it instantly to match your specific needs.
- 🔍 Staged Diff Review: Inspect your changes in a beautiful, syntax-highlighted format directly before committing.
- 🎯 Interactive Dashboard: A comprehensive menu system for staging files, managing branches, stashing changes, and syncing with remotes.
- ⚡ Built for Speed: Zero-config required for standard Git operations. Fast, responsive, and intuitive.
📦 Installation
Install eckra globally using npm:

```bash
npm install -g eckra
```

🛠 Usage
Just type eckra in any Git repository to launch the interactive dashboard:

```bash
eckra
```

Quick Commands
Skip the menu and jump straight into action:
| Command | Action |
| :------------- | :-------------------------------------------- |
| eckra commit | Start the AI-assisted commit flow |
| eckra status | Check repository status and staged files |
| eckra push | Sync local commits with the remote repository |
| eckra branch | Open the interactive branch manager |
⚙️ AI Configuration
eckra supports multiple AI providers. You can switch between them using the built-in settings menu (Settings > Change Provider).
Supported Providers
| Provider | Type | Default Model |
| :---------------- | :-------------- | :--------------------------- |
| LM Studio | Local | — (user-configured) |
| Ollama | Local | llama3 |
| OpenAI | Cloud (API Key) | gpt-4o |
| Anthropic | Cloud (API Key) | claude-3-5-sonnet-20240620 |
| OpenRouter | Cloud (API Key) | openai/gpt-4o |
| Google Gemini | Cloud (API Key) | gemini-2.0-flash |
Default Setup (LM Studio)
By default, eckra connects to LM Studio's local server:
- URL: http://localhost:1234
- Requirement: Ensure LM Studio is running and the "Local Server" is started with a loaded model.
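Before launching eckra, you can confirm the local server is reachable. LM Studio serves an OpenAI-compatible HTTP API, so a quick probe of its /v1/models endpoint (which lists the loaded models) is one way to check:

```shell
# Probe LM Studio's local server; /v1/models lists currently loaded models.
if curl -sf http://localhost:1234/v1/models > /dev/null; then
  echo "LM Studio server is up"
else
  echo "LM Studio server not reachable - start the Local Server in LM Studio"
fi
```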
Configuration
You can configure your provider in two ways:
- Interactive: Run eckra, go to More > Settings, then select your provider and enter your credentials.
- Config file: Edit ~/.eckra/config.json directly:
```json
{
  "aiProvider": "openrouter",
  "openrouterApiKey": "sk-or-...",
  "openrouterModel": "anthropic/claude-3.5-sonnet"
}
```

You can also create a .eckrarc file in your project root to override global settings per-repository.
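For example, a hypothetical per-project override could point one repository at a local Ollama model. This sketch assumes .eckrarc accepts the same keys as ~/.eckra/config.json, and the `ollamaModel` key name is inferred from the openrouter naming pattern shown above, so the exact key may differ:

```json
{
  "aiProvider": "ollama",
  "ollamaModel": "llama3"
}
```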
🤝 Contributing
Contributions make the open-source community an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
Please see CONTRIBUTING.md for details on our code of conduct and the process for submitting pull requests.
📄 License
Distributed under the MIT License. See LICENSE for more information.
