@aman_asmuei/achannel
v0.3.1
The portable channel layer for AI companions.
Connect your AI identity to Telegram, Discord, WhatsApp, and webhooks — with full ecosystem context in every message.
Quick Start · Channels · LLM Providers · Deploy · Ecosystem
The Problem
Your AI companion lives in a terminal. But you're on Telegram, Discord, and other platforms all day. There's no bridge between your ecosystem and the platforms where you actually communicate.
The Solution
achannel connects your AI identity to messaging platforms. Full ecosystem context — identity, tools, workflows, rules, skills — in every response.
npx @aman_asmuei/achannel add telegram
npx @aman_asmuei/achannel serve
Same AI. Same personality. Same rules. Any platform.
Quick Start
# 1. Set up your AI identity first
npx @aman_asmuei/aman
# 2. Add a channel
npx @aman_asmuei/achannel add telegram
# 3. Start serving
npx @aman_asmuei/achannel serve
How It Works
Browser ──┐
Telegram ──┤
Discord ──┼──> achannel ──> LLM (with full ecosystem context)
Webhook ──┘
- A message arrives on any channel
- achannel loads the ecosystem (identity, tools, workflows, guardrails, skills)
- Sends the message + ecosystem context to your chosen LLM
- Returns the response through the same channel
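The four steps above can be sketched as a tiny pipeline. This is illustrative only: the `Channel`, `Ecosystem`, `loadEcosystem`, and `callLLM` names below are hypothetical stand-ins, not achannel's actual API.

```typescript
// Minimal sketch of the achannel message flow (hypothetical types).
type Ecosystem = { identity: string; tools: string[]; rules: string[] };

interface Channel {
  name: string;
  send(chatId: string, text: string): void;
}

// Step 2: load the ecosystem context (stubbed here).
function loadEcosystem(): Ecosystem {
  return { identity: "companion", tools: ["search"], rules: ["be safe"] };
}

// Step 3: combine message + ecosystem context and call the LLM (stubbed).
function callLLM(message: string, eco: Ecosystem): string {
  const prompt = `[identity: ${eco.identity}] ${message}`;
  return `echo: ${prompt}`; // a real provider call would go here
}

// Steps 1 and 4: handle an incoming message, reply on the same channel.
function handleMessage(channel: Channel, chatId: string, message: string): string {
  const eco = loadEcosystem();
  const reply = callLLM(message, eco);
  channel.send(chatId, reply);
  return reply;
}
```

Whatever channel the message came in on is the one the reply goes back out on; the LLM call itself is channel-agnostic.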
Commands
| Command | What it does |
|:--------|:-------------|
| achannel add <channel> | Set up a channel (telegram, discord, whatsapp, webhook) |
| achannel remove <channel> | Remove a channel |
| achannel list | List configured channels |
| achannel serve | Start all channels |
| achannel doctor | Health check |
Web UI (New in v0.3.0)
achannel serve now includes a full web dashboard at http://localhost:3000:
- Chat with SSE streaming responses
- Plans — view active plans with progress bars
- Profiles — see available agent profiles
- Teams — view agent teams
- Memory — search your AI's memory
- Settings — provider, model, ecosystem status
Works on phone, tablet, and desktop. Dark and light theme.
achannel serve # → open http://localhost:3000
# Or via Docker
docker compose up -d # → same URL
REST API endpoints:
| Endpoint | Description |
|:---|:---|
| GET /chat/stream | SSE streaming chat |
| GET /api/status | Ecosystem status |
| GET /api/plans | Plan list |
| GET /api/profiles | Available profiles |
| GET /api/teams | Team list |
| GET /api/memory?q= | Memory search |
| POST /chat | JSON chat |
| DELETE /chat | Clear session |
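A client for these endpoints can be built with plain `fetch`. The helper below only constructs the requests; the JSON body field `message` and the `?message=` query parameter on the SSE endpoint are assumptions, not documented contracts, so check them against your running server.

```typescript
// Sketch of request builders for the local dashboard API
// (assumes `achannel serve` is listening on port 3000).
const BASE = "http://localhost:3000";

type ChatRequest = {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
};

// Build the POST /chat request (JSON chat).
function buildChatRequest(message: string): ChatRequest {
  return {
    url: `${BASE}/chat`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message }),
    },
  };
}

// Build the GET /chat/stream URL for SSE (query parameter name assumed).
function buildStreamUrl(message: string): string {
  return `${BASE}/chat/stream?message=${encodeURIComponent(message)}`;
}

// Usage (requires a running server):
//   const { url, init } = buildChatRequest("Hello!");
//   const res = await fetch(url, init);
```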
Channels
Telegram
Set up a bot via @BotFather, get the token, then:
achannel add telegram
Features:
- Personal mode (only responds to you) or public mode
- Conversation memory (last 20 messages per chat)
- /clear to reset conversation
- Auto-splits long messages (4000 char limit)
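Both the Telegram and Discord adapters split replies that exceed the platform limit. A minimal splitter might look like this (an illustrative sketch, not achannel's actual implementation):

```typescript
// Split a long reply into chunks that fit a platform's message limit,
// preferring to break at the last newline or space inside each chunk.
function splitMessage(text: string, limit = 4000): string[] {
  const chunks: string[] = [];
  let rest = text;
  while (rest.length > limit) {
    const window = rest.slice(0, limit);
    // Find the friendliest break point inside the window.
    let cut = Math.max(window.lastIndexOf("\n"), window.lastIndexOf(" "));
    if (cut <= 0) cut = limit; // no break point: hard split
    chunks.push(rest.slice(0, cut));
    rest = rest.slice(cut).trimStart();
  }
  if (rest.length > 0) chunks.push(rest);
  return chunks;
}
```

The same function serves Discord by passing `limit = 1900`.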
Discord
Create a bot at the Discord Developer Portal, get the token, then:
achannel add discord
Features:
- Responds when @mentioned or in DMs
- Per-channel conversation memory
- Auto-splits long messages (1900 char limit)
Webhook
HTTP endpoint for custom integrations:
achannel add webhook
Endpoints:
| Method | Path | Description |
|:-------|:-----|:------------|
| GET | /status | Health check |
| POST | /chat | Send a message |
curl -X POST http://localhost:3000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello!"}'
LLM Providers
| Provider | Setup | Cost |
|:---------|:------|:-----|
| Anthropic (Claude) | API key from console.anthropic.com | Pay per token |
| OpenAI (GPT) | API key from platform.openai.com | Pay per token |
| Ollama (local) | Install Ollama, pull a model | Free |
Deploy on Raspberry Pi
Run your AI companion locally, fully offline:
# 1. Install Node.js and Ollama on your Pi
# 2. Pull a small model
ollama pull llama3.2
# 3. Set up your identity
npx @aman_asmuei/aman
# 4. Add Telegram and choose Ollama as provider
npx @aman_asmuei/achannel add telegram
# 5. Start
npx @aman_asmuei/achannel serve
Your AI is now on Telegram, running locally, fully offline.
Configuration
All config lives in ~/.achannel/:
| File | Purpose |
|:-----|:--------|
| config.json | Channel credentials and settings |
| channels.md | Human-readable summary of configured channels |
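A config.json might look roughly like this. The field names below are entirely hypothetical, shown only to convey the idea of per-channel credentials plus provider settings; inspect your own ~/.achannel/config.json for the real schema.

```json
{
  "provider": "ollama",
  "model": "llama3.2",
  "channels": {
    "telegram": {
      "token": "...",
      "mode": "personal"
    }
  }
}
```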
The Ecosystem
aman
├── acore → identity → who your AI IS
├── amem → memory → what your AI KNOWS
├── akit → tools → what your AI CAN DO
├── aflow → workflows → HOW your AI works
├── arules → guardrails → what your AI WON'T do
├── askill → skills → what your AI MASTERS
├── aeval → evaluation → how GOOD your AI is
└── achannel → channels → WHERE your AI lives ← YOU ARE HERE

| Layer | Package | What it does |
|:------|:--------|:-------------|
| Identity | acore | Personality, values, relationship memory |
| Memory | amem | Automated knowledge storage (MCP) |
| Tools | akit | 15 portable AI tools (MCP + manual fallback) |
| Workflows | aflow | Reusable AI workflows |
| Guardrails | arules | Safety boundaries and permissions |
| Skills | askill | Domain expertise |
| Evaluation | aeval | Relationship tracking |
| Unified | aman | One command to set up everything |
Contributing
Contributions welcome! Add channel adapters, improve platform support, or suggest features.
License
One identity. Every platform. Your AI everywhere.
