electron-ollama
A TypeScript library for integrating Ollama with Electron.js applications. This library provides a seamless way to bundle and manage Ollama within your Electron app for a better user experience.
Why
Every extra installation step creates friction, so bundling Ollama ensures a smooth, seamless experience. With electron-ollama, users skip the hassle of finding installers or running commands: the library detects an existing Ollama instance, or installs one automatically if missing, so users simply open your app and it works.
Features
- 🛡️ No conflict: Works alongside a standalone Ollama server (skips installation if Ollama is already running)
- 🤝 Maximum compatibility: Can be imported from both ESM and CommonJS packages (see the sketch after this list)
- 🚀 TypeScript Support: Full TypeScript support with type definitions
- 🔧 Easy Integration: Simple API for integrating Ollama with Electron apps
- 📦 Binary Management: Automatically finds and manages Ollama executables
- 🌐 Cross-Platform: Tested on Windows, macOS, and Linux
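For example, a CommonJS consumer can require the package directly, with no ESM setup needed (a minimal sketch; the base path is an arbitrary writable directory):
// CommonJS
const { ElectronOllama } = require('electron-ollama')

const eo = new ElectronOllama({ basePath: '/tmp/eo' }) // any writable directory works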
Installation
npm install electron-ollama
Quick Start
Serve the latest version if a standalone Ollama is not running
import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // in an Electron app, replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  if (!(await eo.isRunning())) {
    const metadata = await eo.getMetadata('latest')
    await eo.serve(metadata.version, {
      serverLog: (message) => console.log('[Ollama]', message),
      downloadLog: (percent, message) => console.log('[Ollama Download]', `${percent}%`, message)
    })
  } else {
    console.log('Ollama server is already running')
  }
}
main()
Try it
npm run build
npx tsx examples/serve-latest.ts
Configuration
| Option | Type | Required | Default | Description |
|--------|------|----------|---------|-------------|
| basePath | string | Yes | - | The base directory where Ollama binaries will be downloaded and stored. Typically app.getPath('userData') in Electron apps. |
| directory | string | No | 'electron-ollama' | Subdirectory name within basePath where Ollama versions will be organized. Final path structure: {basePath}/{directory}/{version}/{os}/{arch}/ |
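For example, with both options set, a download of v0.11.4 on Apple Silicon would land under the layout above (a minimal sketch; 'ollama-runtime' is an arbitrary name chosen for illustration):
import { ElectronOllama } from 'electron-ollama'
import { app } from 'electron'

const eo = new ElectronOllama({
  basePath: app.getPath('userData'),  // e.g. ~/Library/Application Support/MyApp on macOS
  directory: 'ollama-runtime',        // hypothetical custom subdirectory; defaults to 'electron-ollama'
})
// Resulting path: {userData}/ollama-runtime/v0.11.4/darwin/arm64/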
Examples
Serve a specific version of Ollama
import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // in an Electron app, replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  if (!(await eo.isRunning())) {
    // Welcome OpenAI's gpt-oss models
    await eo.serve('v0.11.0', {
      serverLog: (message) => console.log('[Ollama]', message),
      downloadLog: (percent, message) => console.log('[Ollama Download]', `${percent}%`, message)
    })

    const liveVersion = await fetch('http://localhost:11434/api/version').then(res => res.json())
    console.log('Currently running Ollama', liveVersion)

    await eo.getServer()?.stop() // gracefully stop the server with a 5s timeout
  } else {
    console.log('Ollama server is already running')
  }
}
main()
Try it
npm run build
npx tsx examples/serve-version.ts
Download for multiple platforms
import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // in an Electron app, replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  const metadata = await eo.getMetadata('latest')
  await eo.download(metadata.version, { os: 'windows', arch: 'arm64' })
  await eo.download(metadata.version, { os: 'darwin', arch: 'arm64' })
  await eo.download(metadata.version, { os: 'linux', arch: 'arm64' })
  console.log('Downloaded', metadata.version, 'for Windows, macOS, and Linux')
}
main()
Try it
npm run build
npx tsx examples/download-windows-mac-linux.ts
List downloaded versions
import { ElectronOllama } from '../dist' // replace with: import { ElectronOllama } from 'electron-ollama'
import { app } from './mock/electron' // in an Electron app, replace with: import { app } from 'electron'

async function main() {
  const eo = new ElectronOllama({
    basePath: app.getPath('userData'),
  })

  const currentVersions = await eo.downloadedVersions()
  console.log('current platform versions', currentVersions) // [ 'v0.11.0', 'v0.11.2', 'v0.11.4' ]

  const windowsVersions = await eo.downloadedVersions({ os: 'windows', arch: 'arm64' })
  console.log('windows versions', windowsVersions) // [ 'v0.11.4' ]
}
main()
Try it
npm run build
npx tsx examples/list-downloaded.ts
API Reference
import { ElectronOllamaConfig, OllamaServerConfig, PlatformConfig, OllamaAssetMetadata, SpecificVersion, Version } from './types';
import { ElectronOllamaServer } from './server';
export type { ElectronOllamaConfig, OllamaServerConfig, PlatformConfig, OllamaAssetMetadata, SpecificVersion, Version };
export { ElectronOllamaServer };
export declare class ElectronOllama {
    private config;
    private server;
    constructor(config: ElectronOllamaConfig);
    /**
     * Get the current platform configuration
     */
    currentPlatformConfig(): PlatformConfig;
    /**
     * Get the name of the asset for the given platform configuration (e.g. "ollama-windows-amd64.zip" or "ollama-darwin.tgz")
     */
    getAssetName(platformConfig: PlatformConfig): string;
    /**
     * Get metadata for a specific version ('latest' by default) and platform
     */
    getMetadata(version?: Version, platformConfig?: PlatformConfig): Promise<OllamaAssetMetadata>;
    /**
     * Download Ollama for the specified version ('latest' by default) and platform
     */
    download(version?: Version, platformConfig?: PlatformConfig, { log }?: {
        log?: (percent: number, message: string) => void;
    }): Promise<void>;
    /**
     * Check if a version is downloaded for the given platform configuration
     */
    isDownloaded(version: SpecificVersion, platformConfig?: PlatformConfig): Promise<boolean>;
    /**
     * List all downloaded versions for the given platform configuration
     */
    downloadedVersions(platformConfig?: PlatformConfig): Promise<string[]>;
    /**
     * Get the path to the directory for the given version and platform configuration
     */
    getBinPath(version: SpecificVersion, platformConfig?: PlatformConfig): string;
    /**
     * Get the name of the executable for the given platform configuration
     */
    getExecutableName(platformConfig: PlatformConfig): string;
    /**
     * Start serving Ollama with the specified version and wait until it is running
     */
    serve(version: SpecificVersion, { serverLog, downloadLog, timeoutSec }?: {
        serverLog?: (message: string) => void;
        downloadLog?: (percent: number, message: string) => void;
        timeoutSec?: number;
    }): Promise<void>;
    /**
     * Get the server instance started by serve()
     */
    getServer(): ElectronOllamaServer | null;
    /**
     * Check if Ollama is running
     */
    isRunning(): Promise<boolean>;
}
export default ElectronOllama;
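As an illustration of the API above, isDownloaded() and the download() progress callback can be combined to avoid re-fetching a version (a minimal sketch; the version string and base path are example values):
import { ElectronOllama } from 'electron-ollama'

const eo = new ElectronOllama({ basePath: '/tmp/eo-demo' }) // any writable directory works

async function ensure() {
  if (!(await eo.isDownloaded('v0.11.4'))) {
    // platformConfig is optional per the signatures above; omitting it should target the current platform
    await eo.download('v0.11.4', undefined, {
      log: (percent, message) => console.log(`${percent}%`, message),
    })
  }
  console.log('binaries at', eo.getBinPath('v0.11.4'))
}

ensure()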
Ollama Clients
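For example, once the server is running you can talk to it with the official ollama JavaScript client (an illustrative sketch; the ollama package is a separate install, and it assumes the model has already been pulled):
import ollama from 'ollama' // npm install ollama

const response = await ollama.chat({
  model: 'llama3.2', // example model name; pull it first via the Ollama CLI or API
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)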
Notes
- While the primary use case of this package is to seamlessly integrate Ollama with an Electron app, it intentionally has no dependency on Electron itself. By simply providing a basePath other than the Electron app.getPath('userData'), you can manage an Ollama process in virtually any Node.js app (see the sketch after these notes).
- This library does not modify Ollama binaries. The Ollama server is provided as is; electron-ollama is merely a convenience library that picks the appropriate binary for the OS/arch and starts the server if needed.
- You can use electron-ollama as a runtime dependency to manage the LLM backend in your app, or as part of your prebuild script to ship Ollama binaries with your app.
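For instance, outside Electron you can point basePath at any writable directory (a minimal sketch; the path below is arbitrary):
import { ElectronOllama } from 'electron-ollama'
import os from 'node:os'
import path from 'node:path'

// Any writable directory can stand in for Electron's app.getPath('userData'):
const eo = new ElectronOllama({
  basePath: path.join(os.homedir(), '.my-node-service'), // hypothetical location
})

eo.isRunning().then(async (running) => {
  if (!running) {
    const { version } = await eo.getMetadata('latest')
    await eo.serve(version)
  }
})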
TODO
- Detect AMD ROCm support and support additional platform variants like Jetpack
- Investigate whether any prerequisites need to be installed first, such as vc_redist.exe on Windows
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Run the test suite
- Submit a pull request
License
MIT License - see LICENSE file for details.
Support
If you encounter any issues or have questions, please open an issue on GitHub.
