ollama-export v1.1.1
Ollama Export CLI Tool
A simple Node.js CLI utility to export Ollama models as .tar or tar streams.
You can use this tool to archive and transport models managed by Ollama, especially when moving between systems or performing backups.
Note: This tool works with Ollama's internal model storage. You must set the
`OLLAMA_PATH` environment variable to your `.ollama` directory path.
🔧 Features
- Exports any local Ollama model to a `.tar` file.
- Supports streaming the `.tar` archive to `stdout`.
- Works on Linux, macOS (probably), and Windows.
- Can be called via two interchangeable commands: `ollama-export` or `oexport`.
🧩 Installation
Install globally from npm:

```
npm i -g ollama-export
```

🌍 Environment Setup
You must set the OLLAMA_PATH environment variable to point to the .ollama directory on your system. This is where Ollama stores downloaded models.
Linux / macOS
If your .ollama folder is in your home directory:
```
export OLLAMA_PATH="$HOME/.ollama"
```

You can add this to your shell configuration file (~/.bashrc, ~/.zshrc, etc.) to make it permanent.
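To confirm that the variable points at a real Ollama store, you can check for the `models` subdirectory that Ollama normally creates inside `.ollama`. The `check_ollama_path` helper below is a sketch of my own, not part of this tool, and the exact store layout may vary between Ollama versions:

```shell
#!/bin/sh
# Sanity-check an OLLAMA_PATH candidate: a real .ollama store
# normally contains a "models" directory (blobs and manifests).
check_ollama_path() {
    dir="$1"
    if [ -d "$dir/models" ]; then
        echo "ok: $dir looks like an Ollama store"
        return 0
    else
        echo "warning: $dir has no models/ subdirectory"
        return 1
    fi
}

# Check the configured path, falling back to the default location.
check_ollama_path "${OLLAMA_PATH:-$HOME/.ollama}" || true
```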
Windows (CMD)
```
set OLLAMA_PATH=C:\Users\YourUsername\.ollama
```

Windows (PowerShell)

```
$env:OLLAMA_PATH = "C:\Users\YourUsername\.ollama"
```

To make it permanent, add it to your system's environment variables:
- Open System Properties → Environment Variables
- Add a new User Variable named `OLLAMA_PATH` with the path to `.ollama`
🧪 Usage
Syntax
```
ollama-export <model-name:version> [-o]
```

Or using the shorthand:

```
oexport <model-name:version> [-o]
```

- `<model-name:version>` – required, the model and its version. Example: `mistral:latest`
- `-o` – optional flag. If used, exports to `stdout` as a tar stream. This must be the second argument. Order matters!
📦 Examples
1. Export model to file (default behavior)
This will create a mistral_latest.tar file in the current directory:
Linux / macOS
```
ollama-export mistral:latest
```

Windows (CMD or PowerShell)

```
ollama-export mistral:latest
```

2. Pipe model tar to another process or output (stream mode)
This exports the tar archive directly to stdout.
Linux
```
ollama-export mistral:latest -o | tar -tvf -
```
Exporting and Compressing with zstd via Pipe
Linux / macOS
```
ollama-export mistral:latest -o | zstd -o mistral_latest.tar.zst
```

⚠️ The `-o` flag must be the second argument. If the order is wrong, the tool will not work.
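The compressed archive can later be unpacked in one pipeline, decompressing with `zstd` and extracting straight into the Ollama store without writing an intermediate `.tar`. The `import_zst` helper below is a sketch of my own (not part of this tool) and assumes `zstd` and `tar` are on your `PATH`:

```shell
#!/bin/sh
# Unpack a zstd-compressed export directly into an Ollama store.
# Usage: import_zst mistral_latest.tar.zst "$OLLAMA_PATH"
import_zst() {
    archive="$1"
    dest="$2"
    # -d decompresses, -c writes to stdout; tar reads the stream
    # from stdin and extracts into the destination directory.
    zstd -d -c "$archive" | tar -xf - -C "$dest"
}
```

For example: `import_zst mistral_latest.tar.zst "$OLLAMA_PATH"`.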
🧩 Notes
- Make sure the model exists in your local `.ollama` directory.
- If you mistype the argument order (e.g., `-o` first), the tool will not work.
- You can use either `ollama-export` or `oexport`; both work identically.
✅ Example Output
Exporting llama2:7b to a file:
```
oexport llama2:7b
```

🛠 Troubleshooting
"Cannot find .ollama directory" error
Ensure OLLAMA_PATH is set correctly. You can test it:
Linux/macOS
```
echo $OLLAMA_PATH
```

Windows (PowerShell)

```
echo $env:OLLAMA_PATH
```

If it doesn't point to the right folder, update it accordingly.
Potential Ollama Model Directory Paths:
Linux
Home directory (default)
If installed per user, Ollama’s model files are usually stored in a hidden directory inside the user’s home folder: `~/.ollama`

System-wide (root)

In case of a system-wide installation (for all users), it might be in: `/opt/ollama/.ollama`
macOS
Home directory (default)
Similar to Linux, the default path for individual user installations is: `~/.ollama`

Homebrew Installation

If you installed Ollama using Homebrew, the models might be found here: `/usr/local/opt/ollama/.ollama`
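A quick way to probe these candidate locations is to loop over them and print the first one that exists. The `find_ollama_dir` helper below is a sketch of my own, not part of this tool; the candidate list mirrors the paths above, and you can append your own:

```shell
#!/bin/sh
# Print the first existing directory from a list of candidates.
find_ollama_dir() {
    for dir in "$@"; do
        if [ -d "$dir" ]; then
            echo "$dir"
            return 0
        fi
    done
    return 1
}

# Probe the common locations listed above.
find_ollama_dir "$HOME/.ollama" /opt/ollama/.ollama /usr/local/opt/ollama/.ollama \
    || echo "no .ollama directory found"
```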
📥 How to Import
To import a model that was previously exported:
1. Extract the `.tar` file into your `.ollama` directory (this should be the directory you set via `OLLAMA_PATH`).

Linux / macOS:

```
tar -xf mistral_latest.tar -C "$OLLAMA_PATH"
```

Windows (PowerShell):

```
tar -xf .\mistral_latest.tar -C $env:OLLAMA_PATH
```

2. Restart the Ollama service to make it recognize the new model:
Linux:
```
systemctl restart ollama
```

macOS (Homebrew):

```
brew services restart ollama
```

Windows (if running as a service):
Restart via Services panel or with PowerShell:
```
Restart-Service -Name "Ollama"
```
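Put together, a minimal Linux/macOS import routine might look like the sketch below. The `import_model` function is my own illustration, not part of this tool, and the service restart is left as a comment because it varies by platform as shown above:

```shell
#!/bin/sh
# Import a previously exported model archive into an Ollama store.
# Usage: import_model mistral_latest.tar "$OLLAMA_PATH"
import_model() {
    archive="$1"
    store="$2"
    if [ ! -f "$archive" ]; then
        echo "error: archive not found: $archive" >&2
        return 1
    fi
    mkdir -p "$store"
    tar -xf "$archive" -C "$store"
    echo "extracted $archive into $store"
    # Afterwards, restart Ollama so it picks up the new files, e.g.:
    #   systemctl restart ollama        (Linux)
    #   brew services restart ollama    (macOS / Homebrew)
}
```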
❗ Model doesn't show up in ollama list?
In some cases, Ollama may not list the imported model right away. To fix this:
```
ollama pull mistral:latest
```

If the files are already present, Ollama will not re-download them; it will just verify their integrity and register the model locally.
📃 License
MIT License
Copyright (c) 2025 efeArdaYildirim
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.