
# cgpu

v0.1.4

A CLI enabling free cloud GPU access in your terminal for learning CUDA.

*nvcc demo*

```sh
# Install cgpu
npm i -g cgpu

# The first run launches an interactive setup wizard;
# after that, connect to a cloud GPU instance with no further setup
cgpu connect

# Run a command on a cloud GPU instance without a persistent terminal
# (while maintaining file system state)
cgpu run nvidia-smi
```
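
To give a feel for the learning loop, here is an illustrative session; it assumes the instance provides the CUDA toolkit (as the nvcc demo above suggests) and that hello.cu is a CUDA C++ file you have created on the instance, so treat it as a sketch rather than documented behaviour.

```sh
# Inside a `cgpu connect` session (illustrative sketch; assumes the instance
# ships with the CUDA toolkit and that hello.cu already exists there):
nvidia-smi              # check which GPU you were allocated
nvcc hello.cu -o hello  # compile your CUDA C++ source on the cloud GPU machine
./hello                 # run the resulting binary on the GPU
```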

### Serve Gemini for Free as an OpenAI-compatible API

You can start a local server that proxies requests to Google Gemini using the cgpu serve command. This allows you to use Gemini with tools that expect an OpenAI-compatible API.

```sh
# Start the server on port 8080
cgpu serve

# Specify port and model
cgpu serve --port 3000 --default-model gemini-2.0-flash
```

For an example of using this with the OpenAI client, check out python_example. This requires that you have the Gemini CLI installed.
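
As a rough sketch of what a request to the local proxy can look like, the example below uses the standard OpenAI chat-completions wire format; the /v1/chat/completions path and port 8080 are assumptions based on typical OpenAI-compatible servers and the comment above, not a documented guarantee.

```sh
# Illustrative request to the local proxy started with `cgpu serve`.
# Assumes the standard OpenAI-compatible route and the port shown above.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.0-flash",
    "messages": [{"role": "user", "content": "Explain what a CUDA kernel is in one sentence."}]
  }'
```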

### Vision

https://github.com/user-attachments/assets/93158031-24fd-4a63-a4cb-1164bea383c3

The primary goal of this project is to facilitate a high-quality developer experience for those without GPUs who would like to learn CUDA C++.
This means three main things:
1. Free: avoid having to pay while learning.
2. Highly available: run quickly instead of waiting in a queue, so users can compile quickly and learn faster.
3. In your terminal: developers can use their own devtools/IDEs (Neovim, Cursor, etc.) so they can be most productive.

### Next Steps
I will continue to add to the CLI as I find more free compute sources and developer experience improvements.
To see what I am currently planning to add, check out the Issues tab on GitHub.
Feel free to create new issues for suggestions or problems you run into while learning!