lemonade-interactive-loader
v1.0.4
Interactive CLI tool to launch Lemonade Server with custom arguments and download llama.cpp releases - Cross-platform (Windows/Linux)
🍋 Lemonade Interactive Loader
The easiest way to manage llama.cpp builds and run Lemonade Server
Lemonade Interactive Loader is a professional, cross-platform CLI tool that simplifies downloading llama.cpp builds and launching Lemonade Server with an intuitive interactive interface.
🚀 Quick Start
Installation
Option 1: Install globally via npm

```bash
npm install -g lemonade-interactive-loader
```

Option 2: Run without installing (via npx)

```bash
npx lemonade-interactive-loader
```

Option 3: Install from source

```bash
git clone https://github.com/yourusername/lemonade-interactive-loader.git
cd lemonade-interactive-loader
npm install
```

Running the Tool

```bash
# If installed globally
lemonade-loader

# Or via npx (no installation needed)
npx lemonade-interactive-loader

# From source
npm start
# or
node index.js
```

That's it! You'll be presented with a friendly menu to configure and run your server.
🎯 What You Can Do
Interactive Menu Options
The main menu adapts based on whether you have a configuration saved:
When NO configuration exists:

```
? What would you like to do?
❯ 🚀 Setup - Configure Lemonade Server
  📦 Download Custom llama.cpp Builds
```

When configuration EXISTS:

```
? What would you like to do?
❯ ▶️ Start Server with Current Config
  ✏️ Edit Configuration
  👁️ View Configuration
  🔄 Reset Configuration
  ──────────────────────────────────────
  🚀 Setup - Configure Lemonade Server
  📦 Download Custom llama.cpp Builds
```

| Command | Description |
|---------|-------------|
| ▶️ Start Server | Launch Lemonade Server with saved config (when config exists) |
| ✏️ Edit Configuration | Update your saved settings interactively |
| 👁️ View Configuration | See your current configuration and installed builds |
| 🔄 Reset Configuration | Start fresh by resetting all settings |
| 🚀 Setup | Run the 9-question setup wizard to configure everything |
| 📦 Download Builds | View, delete, or download custom llama.cpp builds |
The Setup Wizard
Just answer 9 simple questions:
- Network access? Should the server be accessible from other devices?
- Port number? Which port should it run on? (default: 8080)
- Logging level? Choose from info, debug, warning, or error
- Context window size? Choose from 4K, 8K, 16K, 32K, 64K, 128K, or 256K tokens (default: 4K)
- Model directory? Point to existing models (like LM Studio) if needed
- Interface type? System tray or headless mode
- Custom arguments? Any additional llama.cpp parameters?
- Custom build? Use a specific llama.cpp build from GitHub?
- Backend? Choose auto, vulkan, rocm, or cpu
✨ Key Features
- 🎨 User-Friendly Interface - Interactive menus, no command-line expertise needed
- 💾 Smart Configuration - Save settings once, use them forever
- 📦 Build Management - Browse, download, and manage multiple llama.cpp builds
- 🖥️ Backend Flexibility - Support for CPU, CUDA, ROCm, Vulkan, and more
- 🌐 Network Ready - Easily configure localhost or network access
- 🔄 Cross-Platform - Works seamlessly on Windows, Linux, and macOS
- 🎯 Auto-Detection - Automatically suggests the best builds for your system
- ⚡ Quick Launch - Start your server with a single command
📖 Documentation
- 📚 Usage Guide - Get started quickly
- 🔧 Technical Documentation - Deep dive into architecture and API
- 🛠️ Troubleshooting - Common issues and solutions
🎬 Usage Examples
First-Time Setup
```
$ node index.js

╔════════════════════════════════════════════════════════╗
║           🍋 Lemonade Interactive Launcher             ║
╚════════════════════════════════════════════════════════╝

⚠️ No configuration found. Please run Setup first.

? What would you like to do?
❯ 🚀 Setup - Configure Lemonade Server
  📦 Download Custom llama.cpp Builds
```

After Configuration
```
$ node index.js

╔════════════════════════════════════════════════════════╗
║           🍋 Lemonade Interactive Launcher             ║
╚════════════════════════════════════════════════════════╝

? What would you like to do?
❯ ▶️ Start Server with Current Config
  ✏️ Edit Configuration
  👁️ View Configuration
  🔄 Reset Configuration
  ──────────────────────────────────────
  🚀 Setup - Configure Lemonade Server
  📦 Download Custom llama.cpp Builds
```

Downloading a Custom Build
- Select 📦 Download Custom llama.cpp Builds
- Choose ⬇️ Download new build
- Pick a release from the list
- Select the asset for your platform
- Sit back while it downloads and extracts automatically
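Under the hood, picking the right asset comes down to matching an asset name against your platform. A minimal sketch of that matching step (the asset names below are hypothetical examples mimicking llama.cpp's release naming; the loader's own matching logic may differ):

```javascript
// Illustrative helper: pick the release asset whose filename
// matches the given OS name and architecture.
function pickAsset(assets, osName, arch) {
  return (
    assets.find(
      (a) => a.name.includes(osName) && a.name.includes(arch)
    ) || null
  );
}

// Hypothetical asset list, shaped like a GitHub release's assets array
const assets = [
  { name: 'llama-b4000-bin-win-vulkan-x64.zip' },
  { name: 'llama-b4000-bin-ubuntu-x64.zip' },
  { name: 'llama-b4000-bin-macos-arm64.zip' },
];

console.log(pickAsset(assets, 'ubuntu', 'x64').name);
// → llama-b4000-bin-ubuntu-x64.zip
```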
Running with Custom Models
Point Lemonade Server to your existing model directory (like LM Studio's):
```
? Is there another model directory to use? (example: LM Studio) Yes
? Enter the model directory path: /home/user/.local/share/lmstudio/models
```

🛠️ Configuration
Configuration is automatically saved and loaded:
- Location: `~/.lemonade-interactive-launcher/config.json` (Linux/macOS) or `%USERPROFILE%\.lemonade-interactive-launcher\config.json` (Windows)
- Format: JSON
- Auto-saved: After every setup or edit
Example Configuration
```json
{
  "exposeToNetwork": false,
  "host": "127.0.0.1",
  "port": 8080,
  "logLevel": "info",
  "backend": "auto",
  "modelDir": "None",
  "runMode": "headless",
  "llamacppArgs": "",
  "customLlamacppPath": ""
}
```

🌍 Cross-Platform Support
Supported Systems
| Platform | Versions | Architecture |
|----------|----------|--------------|
| Windows | 10, 11 | x64, arm64 |
| Linux | Ubuntu, Debian, CentOS, etc. | x64, arm64, armv7l |
| macOS | 10.15+ (Catalina+) | x64, arm64 (Apple Silicon) |
Automatic Platform Detection
Lemonade Launcher automatically:
- Detects your operating system and architecture
- Suggests the best matching llama.cpp builds
- Uses the correct file paths and command syntax
- Handles platform-specific quirks
📦 Programmatic Usage
Use Lemonade Launcher as a library in your own projects:
```javascript
const {
  fetchAllReleases,
  downloadAndExtractLlamaCpp,
  loadConfig,
  saveConfig
} = require('./src/index');

// Wrap in an async function: top-level await is not available
// in CommonJS modules.
(async () => {
  // Fetch available releases
  const releases = await fetchAllReleases(10);

  // Download a specific build (asset and version come from a release)
  await downloadAndExtractLlamaCpp(asset, version);

  // Manage configuration
  const config = loadConfig();
  config.port = 9090;
  saveConfig(config);
})();
```

See the Technical Documentation for the full API reference.
🐛 Troubleshooting
Common Issues
"Command not found" or "npm: command not found"
Solution: Ensure Node.js and npm are installed and in your PATH.
Permission denied errors (Linux/macOS)
```bash
chmod +x index.js
```

Permission denied errors (Windows)
Run Command Prompt or PowerShell as Administrator.
Build download fails
- Check your internet connection
- Ensure you have write permissions to `~/.lemonade-interactive-launcher/`
- Try downloading the asset manually from GitHub
Server won't start
- Verify lemonade-server is installed: lemonade-server.ai
- Check that the configuration is correct: run `node index.js` → View Configuration
- Try running with debug logging: set the log level to "debug" in setup
Getting Help
- Check the Technical Documentation
- Review the Troubleshooting Guide
- Open an issue on GitHub
🤝 Contributing
Contributions are welcome! Here's how you can help:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Test thoroughly on your platform
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Code of Conduct
- Be respectful and inclusive
- Follow the existing code style
- Write clear commit messages
- Document your changes
📚 Resources
- Technical Documentation - Architecture, API reference, and development guide
- llama.cpp - The underlying inference engine
- Lemonade Server - The server being launched
📄 License
This project is licensed under the ISC License. See the LICENSE file for details.
🙏 Acknowledgments
- llama.cpp team for the amazing inference engine
- Lemonade Server for the server implementation
- The Node.js community for the fantastic ecosystem
Made with 🍋 by Nxtmind
Happy prompting!
