usegenii
v1.0.5
CLI and daemon for managing AI agents and communication channels
Genii
An autonomous AI agent platform that runs in the background, maintaining persistent conversations across multiple channels (Telegram, Discord, etc.). A digital companion that lives on your machine, learns your preferences, and can be reached through various messaging platforms.
Features
- Persistent Agents: AI agents that maintain conversation history and context across sessions
- Multi-Channel Support: Connect to Telegram, Discord, and other messaging platforms
- Configurable Models: Support for Anthropic, OpenAI, and Google AI models
- Guidance System: Customize agent personality and behavior through markdown files
- Background Daemon: Runs quietly in the background, always available
- CLI Control: Full control over agents, channels, and configuration via command line
Quick Start
1. Install Dependencies
```shell
pnpm install
```

2. Configure Providers and Models
Create the configuration directory:
```shell
mkdir -p ~/.config/genii/guidance
```

Create ~/.config/genii/providers.toml:

```toml
[anthropic]
type = "anthropic"
base-url = "https://api.anthropic.com"
credential = "secret:anthropic-api-key"
```

Create ~/.config/genii/models.toml:
```toml
[sonnet]
provider = "anthropic"
model-id = "claude-sonnet-4-20250514"
thinking-level = "low"

[opus]
provider = "anthropic"
model-id = "claude-opus-4-5-20251101"
thinking-level = "medium"
```

3. Store Your API Key
On macOS, store your API key in the system keychain:
```shell
security add-generic-password -s genii -a anthropic-api-key -w "sk-ant-your-key-here"
```

On Linux, create a secrets file at ~/.config/genii/secrets.json:
```json
{
  "anthropic-api-key": "sk-ant-your-key-here"
}
```

4. Create Minimal Guidance
Create ~/.config/genii/guidance/SOUL.md:
```
You are a helpful assistant.
```

5. Start the Daemon
```shell
# Start in foreground for debugging
cd apps/daemon && pnpm tsx src/index.ts --log-level debug

# Or start via CLI (runs in background)
cd apps/cli && pnpm tsx bin/genii.ts daemon start
```

6. Spawn an Agent
```shell
cd apps/cli && pnpm tsx bin/genii.ts agent spawn --model anthropic/sonnet "Hello, world!"
```

7. List Agents
```shell
cd apps/cli && pnpm tsx bin/genii.ts agent list
```

Project Structure
```
genii/
├── apps/
│   ├── cli/          # @genii/cli - Command-line interface
│   ├── daemon/       # @genii/daemon - Background daemon process
│   └── desktop/      # Tauri desktop application (WIP)
└── shared/
    ├── comms/        # @genii/comms - Communication channels
    ├── config/       # @genii/config - Configuration and secrets
    ├── guidance/     # @genii/guidance - Template files
    ├── lib/          # @genii/lib - Shared utilities
    ├── models/       # @genii/models - Model factory
    └── orchestrator/ # @genii/orchestrator - Agent orchestration
```

Configuration
Model Identifiers
Models are referenced using the format provider/model-name. For example:
- anthropic/sonnet - References the sonnet model configured under the anthropic provider
- anthropic/opus - References the opus model configured under the anthropic provider
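For instance, the spawn command from the Quick Start can target the opus model instead of sonnet simply by changing the identifier:

```shell
cd apps/cli && pnpm tsx bin/genii.ts agent spawn --model anthropic/opus "Hello, world!"
```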
Configuration Files
All configuration files are stored in ~/.config/genii/ (Linux/macOS) or %APPDATA%/genii/ (Windows).
| File | Description |
|------|-------------|
| providers.toml | Provider configurations (API endpoints, credentials) |
| models.toml | Model configurations (provider reference, model ID, thinking level) |
| channels.toml | Communication channel configurations |
| preferences.toml | User preferences |
| guidance/SOUL.md | Default agent personality/instructions |
Thinking Levels
For Anthropic models, you can configure the thinking level:
- off - No extended thinking
- minimal - Minimal thinking
- low - Low thinking budget
- medium - Medium thinking budget (default for Anthropic)
- high - High thinking budget
OpenAI and Google models only support off.
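Changing the level is a one-line edit in models.toml. For example, to disable extended thinking for the sonnet model defined in the Quick Start:

```toml
[sonnet]
provider = "anthropic"
model-id = "claude-sonnet-4-20250514"
thinking-level = "off"   # no extended thinking for this model
```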
Prerequisites
- pnpm - Package manager
- Turbo - Build system
- Node.js - Runtime (v20+ recommended)
- Rust - For Tauri desktop app (optional)
Installation
```shell
# Install dependencies for all packages
pnpm install
```

Development Commands
All commands should be run from the root using pnpm or turbo.
Run all checks (linting + formatting)
```shell
pnpm check
# or
turbo run check
```

Auto-fix linting and formatting issues
```shell
pnpm check:fix
# or
turbo run check:fix
```

Development mode
Starts development servers for all packages that support it:
```shell
pnpm dev
# or
turbo run dev
```

Build all packages
```shell
pnpm build
# or
turbo run build
```

Code Quality
This project uses Biome for:
- Linting
- Formatting
- Import organization
Configuration is in biome.json at the root. All packages use this single config.
Publishing to npm
Prerequisites
- npm account: Create an account at https://www.npmjs.com
- npm org: Create the @genii organization at https://www.npmjs.com/org/create
- Login: Run npm login to authenticate
Package Overview
| Package | Description |
|---------|-------------|
| usegenii | Meta-package (installs CLI + daemon) |
| @genii/cli | CLI binary (genii command) |
| @genii/daemon | Daemon binary (genii-daemon command) |
| @genii/config | Configuration management |
| @genii/models | Model factory |
| @genii/orchestrator | Agent orchestration |
| @genii/comms | Messaging adapters |
| @genii/guidance | Template files |
| @genii/lib | Shared utilities |
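Once published, end users would presumably install the usegenii meta-package globally to get both binaries on their PATH (a sketch; the global-install flow is an assumption, and the command names come from the table above):

```shell
npm install -g usegenii   # pulls in @genii/cli and @genii/daemon
genii daemon start        # the CLI binary from @genii/cli
```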
Publish All Packages
The scripts/publish-all.sh script automates publishing all packages in the correct dependency order. It:
- Reads the version from the root package.json and syncs it to all nested packages
- Runs pre-publish checks (build, lint, test)
- Publishes shared packages first (lib, config, comms, orchestrator, guidance)
- Publishes models (depends on config + orchestrator)
- Publishes apps (cli, daemon)
- Publishes the root meta-package (auto-converts workspace:* to version numbers)
```shell
# Dry run (test without publishing)
pnpm publish:dry-run

# Publish for real
pnpm publish:all

# Skip pre-publish checks (build, lint, test)
./scripts/publish-all.sh --skip-checks
```

Version Management
All packages share the same version. To release a new version:
- Update the version in the root package.json
- Run pnpm publish:all (the script auto-syncs the version to all nested packages)
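As a sketch, a patch release could look like this (npm pkg set requires npm 7+; the version number is illustrative):

```shell
# Bump the shared version in the root package.json
npm pkg set version=1.0.6

# Publish in dependency order; the script syncs 1.0.6 to all nested packages
pnpm publish:all
```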
