
@mmmbuto/llama-cpp-termux-tensor

v0.8012.8-termux.338085c6-tensor

Prebuilt llama.cpp for Android Termux (arm64), Pixel/Tensor-optimized build with bundled ggml/llama shared libs.

Downloads: 46

Readme

llama.cpp - Termux (Tensor / Pixel)

Upstream llama.cpp, built for Android Termux (arm64) with bundled .so, tuned for Google Pixel / Tensor devices.

Prebuilt llama.cpp binaries for Android Termux (arm64) with bundled shared libraries for maximum CPU performance on Pixel/Tensor.

What This Is

  • Prebuilt llama.cpp binaries for Termux on Android (arm64).
  • Optimized build profile intended for Google Pixel / Tensor devices.
  • This package bundles libllama/libggml* shared libs under lib/ and runs via thin wrappers.

The exact upstream commit and build flags for this release are recorded in docs/build_meta.txt.

Commands (No Clobbering Termux PKG)

This package intentionally installs namespaced commands so you can keep the Termux package llama-cpp installed for benchmarks:

  • llama-cli-tensor
  • llama-bench-tensor
  • llama-server-tensor

Installation

pkg update && pkg upgrade -y
pkg install -y nodejs-lts openssl
npm install -g @mmmbuto/llama-cpp-termux-tensor

Automatic Runtime Deps (postinstall)

On Termux, the npm postinstall step will attempt to install missing runtime packages automatically via pkg:

  • openssl (for libssl.so.3 / libcrypto.so.3)
  • libc++ (for libc++_shared.so)

To skip (CI/dry runs):

export LLAMA_CPP_TERMUX_SKIP_PKG_INSTALL=1
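The postinstall check can be sketched roughly as follows. This is an assumed shape, not the package's actual script: it honors the documented opt-out variable, then installs any missing runtime packages via Termux's pkg.

```shell
# Hypothetical sketch of the postinstall dependency logic (assumed;
# see the package's actual postinstall script for the real behavior).
install_runtime_deps() {
    # Honor the documented opt-out for CI and dry runs.
    if [ -n "$LLAMA_CPP_TERMUX_SKIP_PKG_INSTALL" ]; then
        echo "skipped"
        return 0
    fi
    # Install any missing runtime packages via Termux's pkg.
    for dep in openssl libc++; do
        pkg list-installed 2>/dev/null | grep -q "^$dep" || pkg install -y "$dep"
    done
}
```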

Verify

llama-cli-tensor --version
llama-bench-tensor -h

Benchmarks (Pixel 9 Pro)

Bench parameters: threads=6 batch=256 ubatch=256 mmap=1 prompt=512 gen=256 reps=3

| Model (Q4_K_M) | PKG pp512 | TENSOR pp512 | PKG tg256 | TENSOR tg256 |
|---|---:|---:|---:|---:|
| Llama 3.2 3B | 10.46 | 25.42 | 4.33 | 4.82 |
| Gemma3n E2B-it | 12.47 | 30.18 | 5.49 | 6.10 |
| Phi 3.5 mini | 6.19 | 17.27 | 4.31 | 4.83 |
| SmolLM2 1.7B | 16.62 | 48.79 | 6.98 | 7.88 |
| Qwen3 1.7B | 12.85 | 40.59 | 6.55 | 8.05 |

Throughput is in tokens/s as reported by llama-bench; higher is better.

These numbers are a real-world example; your results will vary with thermal throttling, Android build, and Termux version.
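The table's configuration corresponds to an invocation along these lines with the namespaced binary. The model path is illustrative, and the flag spellings follow llama-bench and may differ across versions:

```shell
llama-bench-tensor \
  -m ~/models/Llama-3.2-3B-Instruct-Q4_K_M.gguf \
  -t 6 -b 256 -ub 256 -mmp 1 \
  -p 512 -n 256 -r 3
```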

Build (Repro)

See docs/build_meta.txt for the exact build metadata.

High-level build steps:

pkg install -y git cmake ninja clang make openssl

git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
git checkout 338085c69e486b7155e5b03d7b5087e02c0e2528

cmake -S . -B build-android-tensor -DCMAKE_BUILD_TYPE=Release \
  -DGGML_OPENMP=ON -DLLAMA_OPENSSL=ON \
  -DLLAMA_BUILD_SERVER=ON -DLLAMA_BUILD_TOOLS=OFF -DLLAMA_BUILD_TESTS=OFF \
  -DGGML_BACKEND_DL=OFF \
  -DGGML_NATIVE=ON

cmake --build build-android-tensor -j"$(nproc)"

Packaging Details (.so)

This package bundles these libraries under lib/:

  • libggml*.so*, libllama*.so*, libmtmd*.so*

The wrappers in bin/ set LD_LIBRARY_PATH so the binaries use the packaged .so.
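A wrapper of that shape might look roughly like the sketch below. This is a hedged illustration, not the shipped script: the `native/llama-cli` path and the `lib_path` helper are assumptions made for the example.

```shell
#!/data/data/com.termux/files/usr/bin/sh
# Hypothetical sketch of a bin/ wrapper; the shipped wrappers may differ.

# Compose the library search path: the bundled lib/ directory first,
# then any pre-existing LD_LIBRARY_PATH entries.
lib_path() {
    printf '%s' "$1/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
}

# Resolve the package root relative to this wrapper, then exec the
# real binary so the bundled .so files shadow system copies.
PKG_ROOT="$(CDPATH= cd -- "$(dirname -- "$0")/.." && pwd)"
if [ -x "$PKG_ROOT/native/llama-cli" ]; then
    LD_LIBRARY_PATH="$(lib_path "$PKG_ROOT")" exec "$PKG_ROOT/native/llama-cli" "$@"
fi
```

Prepending rather than replacing `LD_LIBRARY_PATH` keeps any libraries the caller already exposed visible to the binary.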

License

Upstream llama.cpp is MIT licensed. This package redistributes binaries compiled from upstream source and includes the upstream LICENSE.