# MindStudio Local Model Tunnel

Run local models with MindStudio.
Providers supported so far:

- Text Generation
- Image Generation
## Prerequisites
- Node.js 18+
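
A quick way to confirm your environment meets this requirement (assumes `node` is already on your PATH):

```bash
# Should print v18.0.0 or newer
node --version
```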
## Installation

```bash
npm install -g @mindstudio-ai/local-model-tunnel
```

## Quick Start
```bash
# Launch the interactive menu
mindstudio-local
```

This opens an interactive home screen where you can:
- **Setup**: Install and configure local AI providers (Ollama, LM Studio, Stable Diffusion)
- **Authenticate**: Log in to MindStudio
- **Register Models**: Register your local models with MindStudio
- **Start Tunnel**: Launch the local model tunnel
- **View Models**: See available local models
- **Configuration**: View current settings
## Manual Commands
If you prefer command-line usage:
```bash
# Run the setup wizard
mindstudio-local setup

# Authenticate with MindStudio
mindstudio-local auth

# Register your local models
mindstudio-local register

# Start the tunnel
mindstudio-local start
```
## Setup Wizard

The setup wizard (`mindstudio-local setup`) helps you install and configure providers:
**Ollama:**

- Auto-install Ollama (Linux/macOS)
- Start/stop the Ollama server
- Download models from ollama.com/library

**LM Studio:**

- Opens the download page in your browser
- Guides you through enabling the local server

**Stable Diffusion Forge:**

- Clones the repository to your chosen location
- Provides setup instructions
- Tip: Download models from civitai.com (filter by "SDXL 1.0")
## Provider Setup (Manual)
### Ollama

1. Download Ollama
2. Pull a model (see all models at ollama.com/library):

   ```bash
   ollama pull llama3.2
   ```

3. Start the server:

   ```bash
   ollama serve
   ```
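
Before registering models, it can help to confirm the server is reachable. Ollama serves a small HTTP API on port 11434 by default (the same URL used in the configuration section below), so listing installed models is a quick smoke test:

```bash
# Lists the models the local Ollama server can serve
curl http://localhost:11434/api/tags
```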
### LM Studio

1. Download LM Studio
2. Download a model through the app
3. Enable the Local Server
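
LM Studio's local server exposes an OpenAI-compatible API at the base URL this tool expects (`http://localhost:1234/v1` by default, per the configuration section below). A quick check that it is running:

```bash
# Lists models exposed by LM Studio's OpenAI-compatible server
curl http://localhost:1234/v1/models
```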
### Stable Diffusion (Forge)

First-time setup:

```bash
git clone https://github.com/lllyasviel/stable-diffusion-webui-forge.git
cd stable-diffusion-webui-forge
./webui.sh --api
```

Subsequent runs:

```bash
cd stable-diffusion-webui-forge
./webui.sh --api
```
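
The `--api` flag enables the standard WebUI HTTP API on port 7860. Assuming the default address shown in the configuration section below, you can verify it is up with:

```bash
# Lists checkpoints known to the Forge/WebUI API (requires --api)
curl http://127.0.0.1:7860/sdapi/v1/sd-models
```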
## Commands

| Command | Description |
| ------------ | ----------------------------------------- |
| (none) | Open interactive home screen |
| setup | Interactive setup wizard for providers |
| auth | Authenticate with MindStudio via browser |
| register | Register all local models with MindStudio |
| start | Start the local model tunnel |
| models | List available local models |
| status | Check connection status |
| config | Show current configuration |
| set-config | Set configuration |
| logout | Remove stored credentials |
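
A typical pre-flight sequence using only the commands above:

```bash
# Confirm your local models are visible and the connection is healthy
mindstudio-local models
mindstudio-local status

# Then launch the tunnel
mindstudio-local start
```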
## Configuration Options
```bash
# Use custom provider URLs
mindstudio-local set-config --ollama-url http://localhost:11434
mindstudio-local set-config --lmstudio-url http://localhost:1234/v1
mindstudio-local set-config --sd-url http://127.0.0.1:7860
```
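
The same flags accept non-default hosts; for example, if Ollama runs on another machine on your network (the address below is purely illustrative):

```bash
# Point the tunnel at a remote Ollama instance (example address)
mindstudio-local set-config --ollama-url http://192.168.1.50:11434
```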
## How It Works

While running, the tunnel:

1. Authenticates with your MindStudio account
2. Discovers your local models
3. Polls MindStudio for inference requests
4. Routes each request to the matching local server and streams the response back
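
Conceptually, the core loop resembles the sketch below. The MindStudio endpoints and `$TOKEN` are hypothetical placeholders for illustration; the real tunnel protocol is internal to the CLI. Only the Ollama endpoint shown is a documented API.

```bash
#!/usr/bin/env bash
# Illustrative poll-and-forward loop. POLL_URL and RESULT_URL are
# hypothetical stand-ins for MindStudio's internal tunnel API.
POLL_URL="https://api.mindstudio.ai/tunnel/poll"      # hypothetical
RESULT_URL="https://api.mindstudio.ai/tunnel/result"  # hypothetical
OLLAMA_URL="http://localhost:11434"                   # default from this README

while true; do
  # 1. Ask MindStudio whether an inference request is waiting
  request=$(curl -s -H "Authorization: Bearer $TOKEN" "$POLL_URL")

  if [ -n "$request" ]; then
    # 2. Forward it to the local model server (real Ollama endpoint)
    response=$(curl -s "$OLLAMA_URL/api/generate" -d "$request")

    # 3. Return the model output to MindStudio
    curl -s -X POST -H "Authorization: Bearer $TOKEN" \
      -d "$response" "$RESULT_URL"
  fi

  sleep 1  # the real CLI manages its own poll timing
done
```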
## License
MIT
