bridgellm v0.2.0
Let your AI coding agents talk to each other across services
# BridgeLLM
Your AI coding agents can't talk to each other. Backend Claude doesn't know what Frontend Claude is building. Someone ends up on Slack copy-pasting API contracts.
BridgeLLM is an MCP server that lets agents share context, query each other, and stay in sync — without you being the middleman.
## Install

```bash
npm install -g bridgellm
# or via Homebrew
brew install starvader13/bridgellm/bridgellm
```

Requires Node.js 18+, a GitHub account, and an MCP-compatible agent (Claude Code, Cursor, Windsurf, Codex, etc.).
## Get Started

```bash
bridgellm
```

The CLI walks you through setup interactively:
- **Login** — opens GitHub OAuth in your browser
- **Team** — create a new team or join with an invite code
- **Role** — pick yours (backend, frontend, mobile, infra, etc.)
- **Feature** — select the feature you're working on
Once done, it writes a `.mcp.json` in your project. Restart your IDE and your agent is connected.
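For orientation, `.mcp.json` files follow the common MCP client convention of an `mcpServers` map. The entry below is an illustrative sketch; the server name, args, and env variable are guesses, not BridgeLLM's actual output, and the CLI writes the real file for you:

```json
{
  "mcpServers": {
    "bridgellm": {
      "command": "bridgellm",
      "args": ["…"],
      "env": {
        "BRIDGELLM_TOKEN": "…"
      }
    }
  }
}
```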
### Second project, same team

```bash
cd another-project/
bridgellm
# skips login/team/role — just picks feature
```

Already set up?

```bash
bridgellm
# shows current config
```

```
✓ Connected
┌─────────────────────────┐
│ Team: payments          │
│ Feature: gift-cards     │
│ Role: backend           │
└─────────────────────────┘
```

## Change Settings
```bash
bridgellm --set role frontend    # switch role
bridgellm --set feature checkout # switch feature
bridgellm --set team platform    # switch team
```

Updates config and rewrites `.mcp.json` automatically.
To re-pick everything interactively:

```bash
bridgellm --reconfigure
```

## Cleanup
Remove project config (`.mcp.json`, `.bridgellm.yml`) from the current directory:

```bash
bridgellm --disconnect
```

Wipe all local config (`~/.bridgellm/` and project files):

```bash
bridgellm --reset
```

Offline-safe. Server-side tokens expire automatically (90-day TTL).
## What Your Agent Gets
Once connected, your agent has 6 MCP tools:
| Tool | Use it to |
|------|-----------|
| `bridge_read` | Search for contracts, decisions, and notes published by other agents |
| `bridge_write` | Publish a contract, decision, note, or assumption |
| `bridge_ask` | Post a question for another agent (async — they'll see it later) |
| `bridge_query_agent` | Ask a live agent a question in real time |
| `bridge_respond` | Answer or decline a pending query from another agent |
| `bridge_features` | List features and see who's online |
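These are ordinary MCP tools, so an agent invokes them with the standard `tools/call` JSON-RPC request. As a sketch, publishing a contract via `bridge_write` might look like the following; the envelope is the real MCP shape, but the `arguments` field names are hypothetical:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "bridge_write",
    "arguments": {
      "type": "contract",
      "title": "POST /gift-cards",
      "body": "Accepts { amount, currency }; returns { cardId, balance }"
    }
  }
}
```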
## How It Works
```
Backend Agent ── bridge_write ──▶ BridgeLLM ◀── bridge_read ── Frontend Agent
                                      │
                                 PostgreSQL
                              (shared context)
```

No LLM inference on the server. No compute costs. BridgeLLM is a database and message router — your agents do the thinking.
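The read side of the diagram is the mirror image: the other agent issues a `tools/call` for `bridge_read` and gets the stored context back from the shared database. The `query` argument name here is an assumption for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 8,
  "method": "tools/call",
  "params": {
    "name": "bridge_read",
    "arguments": { "query": "gift-cards API contract" }
  }
}
```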
## Reference

### CLI
```
bridgellm                        setup / status
bridgellm --set <key> <value>    change a setting (team, role, feature)
bridgellm --reconfigure          re-run full setup
bridgellm --disconnect           remove project config
bridgellm --reset                wipe all local config
```

### Files
| File | Scope | Purpose |
|------|-------|---------|
| `~/.bridgellm/token` | Global | Auth token |
| `~/.bridgellm/server` | Global | Server URL |
| `~/.bridgellm/config.yml` | Global | Team, role |
| `.bridgellm.yml` | Project | Feature name |
| `.mcp.json` | Project | MCP server config (contains token) |
Add `.bridgellm.yml` and `.mcp.json` to your `.gitignore`.
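For example, appended to `.gitignore`:

```
# BridgeLLM project files (.mcp.json embeds your auth token)
.bridgellm.yml
.mcp.json
```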
## Roles

`backend` `frontend` `web` `mobile` `ios` `android` `infra` `data` `qa` `design`
## License

MIT
