
electron-ollama v0.1.25

Bundle Ollama with your Electron.js app for seamless user experience

electron-ollama

A TypeScript library for integrating Ollama with Electron.js applications. This library provides a seamless way to bundle and manage Ollama within your Electron app for a better user experience.


Why

Every extra installation step creates friction, so bundling Ollama ensures a smooth, seamless experience. With electron-ollama, users skip the hassle of finding installers or running commands; no separate Ollama setup is required. It detects an existing Ollama instance, or installs one automatically if missing, so users simply open your app and it works.

Features

  • 🛡️ No conflict: Works well with a standalone Ollama server (skips installation if Ollama is already running)
  • 🤝 Maximum compatibility: Can be imported from both ESM and CommonJS packages
  • 🚀 TypeScript Support: Full TypeScript support with type definitions
  • 🔧 Easy Integration: Simple API for integrating Ollama with Electron apps
  • 📦 Binaries Management: Automatically find and manage Ollama executables
  • 🌐 Cross-Platform: Tested on Windows, macOS, and Linux

Installation

npm install electron-ollama

Quick Start - Serve the latest version if a standalone Ollama server is not running

import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // on electron app replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  if (!(await eo.isRunning())) {
    const metadata = await eo.getMetadata('latest')
    await eo.serve(metadata.version, {
      serverLog: (message) => console.log('[Ollama]', message),
      downloadLog: (percent, message) => console.log('[Ollama Download]', `${percent}%`, message)
    })
  } else {
    console.log('Ollama server is already running')
  }
}

main()

Try it

npm run build
npx tsx examples/serve-latest.ts

Configuration

| Option | Type | Required | Default | Description |
|--------|------|----------|---------|-------------|
| basePath | string | Yes | - | The base directory where Ollama binaries will be downloaded and stored. Typically app.getPath('userData') in Electron apps. |
| directory | string | No | 'electron-ollama' | Subdirectory name within basePath where Ollama versions will be organized. Final path structure: {basePath}/{directory}/{version}/{os}/{arch}/ |
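The final on-disk layout described in the table can be sketched as a small path helper. This is for illustration only; resolveBinDir is a hypothetical name, not part of the library's API:

```typescript
import { join } from 'node:path'

// Hypothetical helper mirroring the documented layout:
// {basePath}/{directory}/{version}/{os}/{arch}/
function resolveBinDir(
  basePath: string,
  version: string,
  os: string,
  arch: string,
  directory = 'electron-ollama', // the documented default subdirectory
): string {
  return join(basePath, directory, version, os, arch)
}

// e.g. on Linux: /home/me/.config/MyApp/electron-ollama/v0.11.4/linux/arm64
console.log(resolveBinDir('/home/me/.config/MyApp', 'v0.11.4', 'linux', 'arm64'))
```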

Examples

Serve specific version of Ollama

import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // on electron app replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  if (!(await eo.isRunning())) {
    // Welcome OpenAI's gpt-oss models
    await eo.serve('v0.11.0', {
      serverLog: (message) => console.log('[Ollama]', message),
      downloadLog: (percent, message) => console.log('[Ollama Download]', `${percent}%`, message)
    })

    const liveVersion = await fetch('http://localhost:11434/api/version').then(res => res.json())
    console.log('Currently running Ollama', liveVersion)

    await eo.getServer()?.stop() // gracefully stop the server with 5s timeout
  } else {
    console.log('Ollama server is already running')
  }
}

main()

Try it

npm run build
npx tsx examples/serve-version.ts

Download for multiple platforms

import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // on electron app replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  const metadata = await eo.getMetadata('latest')

  await eo.download(metadata.version, { os: 'windows', arch: 'arm64' })
  await eo.download(metadata.version, { os: 'darwin', arch: 'arm64' })
  await eo.download(metadata.version, { os: 'linux', arch: 'arm64' })

  console.log('Downloaded', metadata.version, 'for windows, mac and linux')
}

main()

Try it

npm run build
npx tsx examples/download-windows-mac-linux.ts

List downloaded versions

import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // on electron app replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  const currentVersions = await eo.downloadedVersions()
  console.log('current platform versions', currentVersions) // [ 'v0.11.0', 'v0.11.2', 'v0.11.4' ]
  const windowsVersions = await eo.downloadedVersions({ os: 'windows', arch: 'arm64' })
  console.log('windows versions', windowsVersions) // [ 'v0.11.4' ]
}

main()

Try it

npm run build
npx tsx examples/list-downloaded.ts

API Reference

import { ElectronOllamaConfig, OllamaServerConfig, PlatformConfig, OllamaAssetMetadata, SpecificVersion, Version } from './types';
import { ElectronOllamaServer } from './server';
export type { ElectronOllamaConfig, OllamaServerConfig, PlatformConfig, OllamaAssetMetadata, SpecificVersion, Version };
export { ElectronOllamaServer };
export declare class ElectronOllama {
    private config;
    private server;
    constructor(config: ElectronOllamaConfig);
    /**
     * Get the current platform configuration
     */
    currentPlatformConfig(): PlatformConfig;
    /**
     * Get the name of the asset for the given platform configuration (e.g. "ollama-windows-amd64.zip" or "ollama-darwin.tgz")
     */
    getAssetName(platformConfig: PlatformConfig): string;
    /**
     * Get metadata for a specific version ('latest' by default) and platform
     */
    getMetadata(version?: Version, platformConfig?: PlatformConfig): Promise<OllamaAssetMetadata>;
    /**
     * Download Ollama for the specified version ('latest' by default) and platform
     */
    download(version?: Version, platformConfig?: PlatformConfig, { log }?: {
        log?: (percent: number, message: string) => void;
    }): Promise<void>;
    /**
     * Check if a version is downloaded for the given platform configuration
     */
    isDownloaded(version: SpecificVersion, platformConfig?: PlatformConfig): Promise<boolean>;
    /**
     * List all downloaded versions for the given platform configuration
     */
    downloadedVersions(platformConfig?: PlatformConfig): Promise<string[]>;
    /**
     * Get the path to the directory for the given version and platform configuration
     */
    getBinPath(version: SpecificVersion, platformConfig?: PlatformConfig): string;
    /**
     * Get the name of the executable for the given platform configuration
     */
    getExecutableName(platformConfig: PlatformConfig): string;
    /**
     * Start serving Ollama with the specified version and wait until it is running
     */
    serve(version: SpecificVersion, { serverLog, downloadLog, timeoutSec }?: {
        serverLog?: (message: string) => void;
        downloadLog?: (percent: number, message: string) => void;
        timeoutSec?: number;
    }): Promise<void>;
    /**
     * Get the server instance started by serve()
     */
    getServer(): ElectronOllamaServer | null;
    /**
     * Check if Ollama is running
     */
    isRunning(): Promise<boolean>;
}
export default ElectronOllama;
//# sourceMappingURL=index.d.ts.map
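The isRunning() check in the API above can be approximated by probing Ollama's default HTTP endpoint. A minimal sketch, assuming the default port 11434 and Node 18+ with global fetch; isOllamaRunning is an illustrative name, not the library's internal implementation:

```typescript
// Probe the Ollama HTTP API; resolves false if the server is unreachable.
async function isOllamaRunning(baseUrl = 'http://127.0.0.1:11434'): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/version`, {
      signal: AbortSignal.timeout(2000), // don't hang if the port is filtered
    })
    return res.ok
  } catch {
    return false // connection refused, timeout, etc.
  }
}

isOllamaRunning().then((running) => console.log('Ollama running:', running))
```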

Ollama Clients

Notes

  • While the primary use case of this package is to seamlessly integrate Ollama with an Electron app, it intentionally has no dependency on Electron itself. By providing a basePath other than Electron's app.getPath('userData'), you can manage the Ollama process in virtually any Node.js app.
  • This library does not modify Ollama binaries; the Ollama server is provided as is. electron-ollama is merely a convenience library that picks the appropriate binary for your os/arch and starts the server if needed.
  • You can use electron-ollama as a runtime dependency to manage the LLM backend in your app, or as part of your prebuild script to ship Ollama binaries with your app.
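Picking the binary for the current os/arch, as the notes describe, boils down to mapping Node's process.platform and process.arch onto the os/arch names used in this README's examples. A hedged sketch; the exact mapping inside electron-ollama may differ:

```typescript
// Map Node.js platform/arch identifiers to the names used in this
// README's examples ('windows' | 'darwin' | 'linux', 'amd64' | 'arm64').
function currentPlatform(): { os: string; arch: string } {
  const os =
    process.platform === 'win32' ? 'windows' :
    process.platform === 'darwin' ? 'darwin' : 'linux'
  const arch = process.arch === 'arm64' ? 'arm64' : 'amd64'
  return { os, arch }
}

console.log(currentPlatform()) // e.g. { os: 'linux', arch: 'amd64' }
```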

TODO

  • Detect AMD ROCm support and support additional platform variants like JetPack
  • Investigate whether any prerequisites need to be installed first, such as vc_redist.exe on Windows

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Run the test suite
  6. Submit a pull request

License

MIT License - see LICENSE file for details.

Support

If you encounter any issues or have questions, please open an issue on GitHub.