@toolkit-cli/tk-helper
v0.2.1
Local bridge for Toolkit-LLM — connects your ChatGPT subscription to the web app
tk_ helper
Use your own AI subscriptions through Toolkit-LLM — one CLI, two superpowers.
```sh
npx @toolkit-cli/tk-helper
```

Why tk-helper?
You're paying for AI twice. You have a ChatGPT subscription AND you're paying per-token for Claude Code. tk-helper fixes both:
| Without tk-helper | With tk-helper |
|---|---|
| Claude Code → Anthropic API ($$$) | Claude Code → Toolkit self-hosted models (your API key) |
| ChatGPT subscription sits unused in browser | ChatGPT subscription powers Toolkit's web app |
| Two separate AI products | One unified system |
Quick Start
```sh
npx @toolkit-cli/tk-helper
```

First run opens the setup wizard:
```
╔════════════════════════════════════════╗
║                                        ║
║   tk_ helper                           ║
║   v0.2.0                               ║
║                                        ║
║   Connect Claude Code & ChatGPT to     ║
║   Toolkit-LLM's self-hosted models     ║
║                                        ║
╚════════════════════════════════════════╝

Step 1: Toolkit-LLM API Key
Get your key at https://toolkit-llm.com/dashboard/keys
API Key (tk_live_...): ████████
Validating... ✓ Valid

Step 2: Configure Claude Code
Map Claude Code → Toolkit-LLM? (Y/n): Y
✓ Claude Code configured

Step 3: Connect ChatGPT (optional)
Connect ChatGPT? (y/N): y
Opening browser for OpenAI authentication...
✓ ChatGPT connected

✓ Setup complete!
```

Use with Claude Code
Route Claude Code through Toolkit-LLM's self-hosted models. Same Claude Code UX, different backend.
Setup
```sh
npx @toolkit-cli/tk-helper init
```

Enter your API key. Say yes to "Configure Claude Code". Done.
What happens
tk-helper writes ~/.claude/settings.json:
```json
{
  "env": {
    "ANTHROPIC_AUTH_TOKEN": "tk_live_YOUR_KEY",
    "ANTHROPIC_BASE_URL": "https://api.toolkit-llm.com/v1",
    "ANTHROPIC_DEFAULT_OPUS_MODEL": "toolkit-code-backend",
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "toolkit-base",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "toolkit-code-fast"
  }
}
```

Model mapping
| When Claude Code asks for | Toolkit sends it to | What it is |
|---|---|---|
| Opus (heavy tasks) | toolkit-code-backend | 27B dense, architecture & APIs |
| Sonnet (default) | toolkit-base | 30B GLM, business & analysis |
| Haiku (fast tasks) | toolkit-code-fast | 9B, quick edits & autocomplete |
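The configuration step above is effectively a merge: Toolkit's env vars (which also encode the tier mapping) are added to `~/.claude/settings.json`, and any settings you already had are preserved. A minimal sketch of that behavior in Python; `apply_toolkit_env` is an illustrative helper, not tk-helper's actual internals:

```python
# Sketch of the merge tk-helper performs on ~/.claude/settings.json.
# TOOLKIT_ENV mirrors the snippet above; the function is hypothetical.
TOOLKIT_ENV = {
    "ANTHROPIC_AUTH_TOKEN": "tk_live_YOUR_KEY",
    "ANTHROPIC_BASE_URL": "https://api.toolkit-llm.com/v1",
    "ANTHROPIC_DEFAULT_OPUS_MODEL": "toolkit-code-backend",
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "toolkit-base",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "toolkit-code-fast",
}

def apply_toolkit_env(settings: dict) -> dict:
    """Merge Toolkit env vars into a Claude Code settings dict,
    leaving unrelated keys and existing env vars untouched."""
    merged = dict(settings)
    merged["env"] = {**merged.get("env", {}), **TOOLKIT_ENV}
    return merged
```

Because it merges rather than overwrites, custom env vars or other Claude Code settings survive the setup step.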
Use it
```sh
claude "build me a landing page"
```

That's it. Claude Code works exactly the same — it just uses Toolkit's models.
Change model profile
```sh
npx @toolkit-cli/tk-helper
# Select "4. Select Model" from the menu
```

Undo it
```sh
npx @toolkit-cli/tk-helper uninstall
```

Removes only Toolkit's env vars from Claude Code. Everything else untouched.
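The selective removal can be sketched as the inverse of the setup merge, assuming the key names from the settings.json snippet earlier; `remove_toolkit_env` is illustrative, not tk-helper's actual code:

```python
# Strip only Toolkit-related keys from "env", leaving everything else intact.
TOOLKIT_KEYS = {
    "ANTHROPIC_AUTH_TOKEN",
    "ANTHROPIC_BASE_URL",
    "ANTHROPIC_DEFAULT_OPUS_MODEL",
    "ANTHROPIC_DEFAULT_SONNET_MODEL",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL",
}

def remove_toolkit_env(settings: dict) -> dict:
    """Return a settings dict with Toolkit's env vars removed.
    Drops the "env" key entirely if nothing else remains in it."""
    env = {k: v for k, v in settings.get("env", {}).items()
           if k not in TOOLKIT_KEYS}
    cleaned = dict(settings)
    if env:
        cleaned["env"] = env
    else:
        cleaned.pop("env", None)
    return cleaned
```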
Use with ChatGPT Subscription
Already paying $20/mo for ChatGPT Plus? Use that subscription through Toolkit's web app.
Connect
```sh
npx @toolkit-cli/tk-helper connect
```

Opens your browser → sign into ChatGPT → authorize → done.
Start the bridge
```sh
npx @toolkit-cli/tk-helper start
```

Runs a local server on localhost:1455. The web app at toolkit-llm.com auto-detects it.
How it works
```
toolkit-llm.com → detects localhost:1455 → routes through your ChatGPT token → GPT-5.4
```

You get GPT-5.4 in Toolkit's workspace at your flat subscription rate. No per-token billing.
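Detection on the web side amounts to checking whether something is listening on that local port. A plain TCP probe sketch (the web app's real detection mechanism may differ, e.g. it may issue an HTTP request instead):

```python
import socket

def bridge_running(port: int = 1455, timeout: float = 0.5) -> bool:
    """Return True if something accepts TCP connections on localhost:port."""
    try:
        with socket.create_connection(("127.0.0.1", port), timeout=timeout):
            return True
    except OSError:
        return False
```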
Disconnect
```sh
npx @toolkit-cli/tk-helper disconnect
```

All Commands
```
tk-helper               Interactive menu (wizard on first run)
tk-helper init          Re-run setup wizard
tk-helper start         Start bridge server (localhost:1455)
tk-helper connect       Authenticate with ChatGPT
tk-helper disconnect    Clear ChatGPT tokens
tk-helper claude-code   Configure Claude Code → Toolkit-LLM
tk-helper uninstall     Remove from Claude Code + clear tokens
tk-helper doctor        Health check
tk-helper status        Show current config
tk-helper help          Show help
```

Health Check
```sh
npx @toolkit-cli/tk-helper doctor
```

```
Doctor — Health Check

1. API Key............. ✓ Valid
2. Claude Code......... ✓ Installed
3. Claude Settings..... ✓ Toolkit configured
4. ChatGPT OAuth....... ✓ Connected
5. Bridge Server....... ○ Not running

Result: 4 passed, 0 issues
```

Files
| Path | What | Permissions |
|---|---|---|
| ~/.tk-helper/config.json | API key + model | 0600 |
| ~/.tk-helper/tokens.json | ChatGPT OAuth tokens | 0600 |
| ~/.claude/settings.json | Claude Code env vars | Default |
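You can verify those 0600 permissions yourself. A small illustrative check (not a tk-helper command):

```python
import os
import stat

def is_owner_only(path: str) -> bool:
    """True if the file mode is exactly 0600 (owner read/write only),
    the permission tk-helper sets on its config and token files."""
    return stat.S_IMODE(os.stat(path).st_mode) == 0o600
```

For example, `is_owner_only(os.path.expanduser("~/.tk-helper/tokens.json"))` should return `True` on a correctly installed setup.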
Full cleanup
```sh
npx @toolkit-cli/tk-helper uninstall
rm -rf ~/.tk-helper
```

Models
| Name | ID | Best for | Context |
|---|---|---|---|
| Default | toolkit-chat | Fast everyday chat | 32K |
| Pro Chat | toolkit-chat-pro | Deeper reasoning | 32K |
| Assistant | toolkit-base | Business & analysis | 200K |
| Builder | toolkit-code | Full-stack coding | 131K |
| Backend | toolkit-code-backend | Architecture & APIs | 65K |
| Voice | toolkit-voice | Real-time speech | 512 |
| Camera | toolkit-cam | Visual understanding | 1K |
API
Toolkit-LLM is OpenAI-compatible. Works with any SDK:
```python
from openai import OpenAI

client = OpenAI(
    api_key="tk_live_YOUR_KEY",
    base_url="https://api.toolkit-llm.com/v1"
)

response = client.chat.completions.create(
    model="toolkit-code",
    messages=[{"role": "user", "content": "Hello"}]
)
```

```sh
curl https://api.toolkit-llm.com/v1/chat/completions \
  -H "Authorization: Bearer tk_live_YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"toolkit-code","messages":[{"role":"user","content":"Hello"}],"stream":true}'
```

toolkit-llm.com · Quickstart · Pricing · Models · MIT License
