clawapi
v1.1.2
An unofficial OpenAI-compatible API gateway and CLI tool powered by Playwright browser automation.
ClawAPI
Browser-based OpenAI-compatible AI Gateway — in Node.js
Run AI models like Claude through an automated, headless Playwright session. No API keys needed! A 1:1 cross-platform Node.js rewrite of the GhostAPI Python gateway.
Features
- OpenAI-compatible HTTP API: Serves POST /v1/chat/completions directly from your local browser.
- Native HTTP Engine: Optimized for Claude to bypass 403 blocks with real browser fingerprinting.
- Web Dashboard: Modern glassmorphism UI to manage your providers visually.
- Session Portability: Robust Export/Import system with built-in ZIP compression for moving sessions between machines/VPS.
- Cross-platform: Runs seamlessly as a background service on Windows, Mac, and Linux.
- Colorful CLI: Native custom CLI with real-time status and log tailing.
Quick Start
1. Install via NPM
npm install -g clawapi
This command installs ClawAPI and runs a postinstall hook that downloads the Playwright Chromium binary needed for web automation.
2. Launch the Web Dashboard
clawapi ui start
Open http://localhost:3001 to manage your sessions visually!
3. CLI Usage (Optional)
ClawAPI can also be driven entirely from the command line; see the Usage section below.
Usage
1. Install and Authenticate an AI Provider
ClawAPI manages an internal registry of LLM providers that are accessed via web scraping.
# Add the Claude provider to your registry
clawapi add claude
# Authenticate Claude
clawapi auth claude
A browser window will open. Log in to Claude with your Google account or email, and wait for the web chat UI to fully load. Once it has loaded, close the browser window; the CLI will save your session!
2. Start the Server
Start ClawAPI in the background as a detached REST API server:
clawapi start
# Or start on a custom port
clawapi start --port 8080
Note: the server detaches and runs headless in the background, so no terminal window will block your screen!
3. Send OpenAI-compatible HTTP Requests
Now that the server is active, you can interact with Claude.ai (or any installed provider) using exactly the same schema as OpenAI's official chat/completions specification!
curl -X POST http://localhost:8855/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "clawapi/claude",
"messages": [
{
"role": "user",
"content": "Tell me a short joke"
}
]
}'
Example response:
{
"id": "chatcmpl-df8a...699",
"object": "chat.completion",
"model": "clawapi/claude",
"choices": [{"message": {"role": "assistant", "content": "Because they make up everything! 😄"}}]
}
CLI Commands Reference
clawapi list: View all authenticated models
clawapi available: View the registry of all available LLMs
clawapi status: View the port and active/background server statuses
clawapi stop: Halt the detached HTTP server daemon
clawapi restart: Safely restart the server
clawapi logs: Tail the background error/activity streams
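The curl request above can also be issued from Node.js. A minimal sketch, assuming the gateway runs on port 8855 as in the example; `buildChatRequest` and `extractReply` are hypothetical helpers, not part of ClawAPI:

```javascript
// Builds an OpenAI-style chat/completions payload for the gateway.
function buildChatRequest(model, userText) {
  return {
    model,
    messages: [{ role: "user", content: userText }],
  };
}

// Pulls the assistant reply out of an OpenAI-style completion object.
function extractReply(completion) {
  return completion.choices[0].message.content;
}

// Usage against a running gateway:
// const res = await fetch("http://localhost:8855/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildChatRequest("clawapi/claude", "Tell me a short joke")),
// });
// console.log(extractReply(await res.json()));
```

Because the schema is plain OpenAI chat/completions, the same helpers work unchanged against any installed provider.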
Example (Connecting Picoclaw)
ClawAPI acts identically to OpenAI infrastructure! You can plug it into any OpenAI-compatible client effortlessly:
clawapi picoclaw
{
"api_key": "sk-clawapi",
"api_base": "http://localhost:8855/v1",
"model": "clawapi/claude"
}
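Before handing a config like the one above to a client, a quick sanity check can catch typos. A hypothetical validation sketch; `validateGatewayConfig` is illustrative only, not part of ClawAPI:

```javascript
// Checks an OpenAI-compatible client config for common mistakes.
// Returns an array of error messages (empty means the shape looks right).
function validateGatewayConfig(cfg) {
  const errors = [];
  if (!cfg.api_key) errors.push("api_key is required by most clients; use the value shown above");
  if (!/^https?:\/\/.+\/v1$/.test(cfg.api_base || "")) errors.push("api_base should end in /v1");
  if (!(cfg.model || "").startsWith("clawapi/")) errors.push("model should use the clawapi/ prefix");
  return errors;
}

// Usage:
// validateGatewayConfig({
//   api_key: "sk-clawapi",
//   api_base: "http://localhost:8855/v1",
//   model: "clawapi/claude",
// }); // returns [] when the config matches the shape above
```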