
ollama-export

v1.1.1

Ollama Export CLI Tool

A simple Node.js CLI utility to export Ollama models as .tar or tar streams.
You can use this tool to archive and transport models managed by Ollama, especially when moving between systems or performing backups.

Note: This tool works with Ollama's internal model storage. You must set the OLLAMA_PATH environment variable to your .ollama directory path.


🔧 Features

  • Exports any local Ollama model to a .tar file.
  • Supports streaming the .tar archive to stdout.
  • Works on Linux and Windows; macOS is expected to work but is untested.
  • Can be called via two interchangeable commands: ollama-export or oexport.

🧩 Installation

Install globally from npm:

npm i -g ollama-export

🌍 Environment Setup

You must set the OLLAMA_PATH environment variable to point to the .ollama directory on your system. This is where Ollama stores downloaded models.

Linux / macOS

If your .ollama folder is in your home directory:

export OLLAMA_PATH="$HOME/.ollama"

You can add this to your shell configuration file (~/.bashrc, ~/.zshrc, etc.) to make it permanent.
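To sanity-check the variable, you can verify that the directory actually contains the models store Ollama writes to (a real .ollama directory has a models/ subdirectory with blobs and manifests). The sketch below uses a throwaway directory purely for demonstration — substitute your real path:

```shell
# Demonstration only: a throwaway directory stands in for a real .ollama.
demo=$(mktemp -d)
mkdir -p "$demo/models/blobs" "$demo/models/manifests"
export OLLAMA_PATH="$demo"

# A usable .ollama directory contains a models/ store; check for it.
if [ -d "$OLLAMA_PATH/models" ]; then
  echo "OLLAMA_PATH looks valid"
fi

rm -rf "$demo"
```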

Windows (CMD)

set OLLAMA_PATH=C:\Users\YourUsername\.ollama

Windows (PowerShell)

$env:OLLAMA_PATH = "C:\Users\YourUsername\.ollama"

To make it permanent, add it to your system's environment variables:

  1. Open System Properties → Environment Variables
  2. Add a new User Variable named OLLAMA_PATH with the path to .ollama

🧪 Usage

Syntax

ollama-export <model-name:version> [-o]

Or using the shorthand:

oexport <model-name:version> [-o]

  • <model-name:version> – required; the model name and tag, e.g. mistral:latest
  • -o – optional; streams the .tar archive to stdout instead of writing a file. It must be the second argument – order matters.

📦 Examples

1. Export model to file (default behavior)

This creates a mistral_latest.tar file in the current directory. The command is the same on Linux, macOS, and Windows:

ollama-export mistral:latest

2. Pipe model tar to another process or output (stream mode)

This exports the tar archive directly to stdout.

Linux

ollama-export mistral:latest -o | tar -tvf -

Exporting and Compressing with zstd via Pipe

Linux / macOS

ollama-export mistral:latest -o | zstd -o mistral_latest.tar.zst

⚠️ The -o flag must be the second argument. If the order is wrong, the tool will not work.
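Stream mode simply writes a tar archive to stdout, so any tar-aware consumer can sit on the other side of the pipe. As a sketch you can run without any model installed, plain tar stands in for `ollama-export <model> -o` below — the pipe mechanics are identical:

```shell
# Stand-in for `ollama-export <model> -o`: any process writing a tar stream
# to stdout pipes the same way. Here plain tar plays that role.
work=$(mktemp -d)
echo "dummy blob" > "$work/blob.bin"

# Left side emits a tar stream; right side lists its entries,
# just like `ollama-export mistral:latest -o | tar -tvf -` above.
tar -cf - -C "$work" blob.bin | tar -tf -

rm -rf "$work"
```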


🧩 Notes

  • Make sure the model exists in your local .ollama directory.
  • If you mistype the argument order (e.g., -o first), the tool will not work.
  • You can use either ollama-export or oexport; they behave identically.

✅ Example

Exporting llama2:7b to a file:

oexport llama2:7b 

🛠 Troubleshooting

"Cannot find .ollama directory" error

Ensure OLLAMA_PATH is set correctly. You can test it:

Linux/macOS

echo $OLLAMA_PATH

Windows (PowerShell)

echo $env:OLLAMA_PATH

If it doesn't point to the right folder, update it accordingly.

Potential Ollama Model Directory Paths:

Linux

  1. Home directory (default)
    If installed per user, Ollama’s model files are usually stored in a hidden directory inside the user’s home folder:

    ~/.ollama
  2. System-wide (root)
    In case of a system-wide installation (for all users), it might be in:

    /opt/ollama/.ollama

macOS

  1. Home directory (default)
    Similar to Linux, the default path for individual user installations is:

    ~/.ollama
  2. Homebrew Installation
    If you installed Ollama using Homebrew, the models might be found here:

    /usr/local/opt/ollama/.ollama

📥 How to Import

To import a model that was previously exported:

  1. Extract the .tar file into your .ollama directory
    (This should be the directory you set via OLLAMA_PATH.)

    Linux / macOS:

    tar -xf mistral_latest.tar -C "$OLLAMA_PATH"

    Windows (PowerShell):

    tar -xf .\mistral_latest.tar -C $env:OLLAMA_PATH
  2. Restart the Ollama service to make it recognize the new model:

    Linux:

    systemctl restart ollama

    macOS (Homebrew):

    brew services restart ollama

    Windows (if running as a service):

    Restart via Services panel or with PowerShell:

    Restart-Service -Name "Ollama"
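Before extracting an export into $OLLAMA_PATH, it can be worth listing the archive to confirm it contains the models/ tree you expect. The snippet below builds a hypothetical export file purely to demonstrate the check:

```shell
# Build a hypothetical export file purely for demonstration.
work=$(mktemp -d)
mkdir -p "$work/models/blobs"
echo data > "$work/models/blobs/sha256-demo"
tar -cf "$work/mistral_latest.tar" -C "$work" models

# List the archive's entries before extracting it into $OLLAMA_PATH;
# you should see a models/ tree.
tar -tf "$work/mistral_latest.tar"

rm -rf "$work"
```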

❗ Model doesn't show up in ollama list?

In some cases, Ollama may not list the imported model right away. To fix this:

ollama pull mistral:latest

If the files are already present, Ollama will not re-download them — it will just verify their integrity and register the model locally.

📃 License

MIT License

Copyright (c) 2025 efeArdaYildirim

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.