# lottobot-server

v1.0.0
Local proxy for LottoBot — connects your AI chat app to LottoBot's prediction engine.
Exposes an Ollama-compatible API on localhost:11434 so you can use LottoBot from any app that supports Ollama or the OpenAI API format.
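As a sketch of what the OpenAI-format surface looks like, the snippet below builds a standard chat-completions request against the proxy. It assumes the proxy accepts the usual `/v1/chat/completions` route under the `http://localhost:11434/v1` base URL listed below; nothing is sent over the network, so it runs without the server.

```javascript
// Build a standard OpenAI-format chat request for the local proxy.
// Assumes the proxy serves the conventional /v1/chat/completions route
// under the base URL http://localhost:11434/v1 (see Compatible Apps).
function buildChatRequest(prompt) {
  return {
    url: "http://localhost:11434/v1/chat/completions",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: "lottobot", // model name from this README
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

const req = buildChatRequest("generate");
console.log(req.url);
```

With the server running, `fetch(req.url, req.options)` would send it; any OpenAI-compatible client library pointed at the same base URL works the same way.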
## Requirements
- Node.js 18+ (nodejs.org)
- A LottoBot API key (lottobot.ai/dashboard)
## Usage

```
npx lottobot-server --key lb_YOUR_API_KEY
```

Then open your AI chat app and connect to http://localhost:11434. Select `lottobot` as the model.
## Compatible Apps
| App | Connection |
|---|---|
| Open WebUI | Ollama URL: http://localhost:11434 |
| AnythingLLM | Ollama endpoint: http://localhost:11434 |
| LM Studio | OpenAI base URL: http://localhost:11434/v1 |
| Continue.dev | Ollama provider: http://localhost:11434 |
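Ollama-style clients typically discover models with a GET to `/api/tags`, which returns `{ "models": [{ "name": ... }] }`. Assuming the proxy mirrors that route (an assumption — this README only states Ollama compatibility), a response listing `lottobot` confirms the connection. The helper below just parses such a payload; the sample object stands in for a live response.

```javascript
// Extract model names from an Ollama-style /api/tags response.
// The sample payload is illustrative, not captured from a real server.
function listModelNames(tagsResponse) {
  return tagsResponse.models.map((m) => m.name);
}

const sample = { models: [{ name: "lottobot" }] }; // hypothetical response
console.log(listModelNames(sample));
```

If an app fails to connect, fetching `http://localhost:11434/api/tags` by hand and checking for `lottobot` in the list is a quick way to tell whether the proxy or the app is at fault.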
## Commands in Chat

- `generate` - get 6 prediction sets
- `generate 8 sets` - get 8 prediction sets
- `give me lucky numbers` - same as `generate`
Each prediction uses one draw from your API key.
## Platform Support
Works on Mac, Linux, and Windows (Command Prompt or PowerShell).
