
infinicode v1.0.0 · 81 downloads

⚡ Infinicode

AI coding agent powered by your local Ollama. Fork of OpenCode optimized for self-hosted LLMs.

npm install -g infinicode

# Connect to your Ollama master
infinicode connect 192.168.1.100

# Start coding
infinicode

Why Infinicode?

  • 100% Local — All inference on YOUR hardware
  • Zero Cloud — No API keys, no subscriptions, no data leaving your network
  • One Command Setup — infinicode connect <ip> and you're done
  • Multi-Node — Run on any machine, point to one Ollama master

Quick Start

1. Install

npm install -g infinicode
# or
pnpm add -g infinicode

2. Setup Ollama Master

On your GPU machine:

# Allow network connections
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Pull a coding model
ollama pull qwen2.5-coder:14b
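Before connecting clients, it can help to confirm the master is actually reachable over the network. A minimal check, assuming the master sits at 192.168.1.100 (the IP used in these examples); /api/tags is Ollama's model-listing endpoint:

```shell
# Hypothetical master address; replace with your GPU machine's IP
MASTER_URL="http://192.168.1.100:11434"

# A JSON response from /api/tags means the master is up and serving
if curl -s --max-time 3 "${MASTER_URL}/api/tags"; then
  echo "Ollama master is reachable"
else
  echo "Could not reach ${MASTER_URL} (check OLLAMA_HOST and your firewall)"
fi
```

If this fails from a remote machine but works on the GPU box itself, the usual culprit is Ollama listening only on localhost rather than 0.0.0.0.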

3. Connect & Code

# From any machine on your network
infinicode connect 192.168.1.100   # Your Ollama IP
infinicode                          # Start coding!

Commands

| Command | Alias | Description |
|---------|-------|-------------|
| infinicode connect <ip> | ic c | Quick connect to Ollama master |
| infinicode setup | ic s | Interactive setup wizard |
| infinicode run | ic | Start the coding agent (default) |
| infinicode models | ic m | List available models |
| infinicode status | | Show config & connection status |
| infinicode config --list | | View all configuration |
| infinicode config --reset | | Reset to defaults |

Configuration

Config is stored automatically. Override with:

# Set default model
infinicode config --set defaultModel=codestral:22b

# Set master URL
infinicode config --set masterUrl=http://192.168.1.100:11434

# View all
infinicode config --list

Recommended Models

For coding tasks on Ollama:

| Model | Size | Best For |
|-------|------|----------|
| qwen2.5-coder:14b | 9GB | Great balance |
| qwen2.5-coder:32b | 20GB | Best quality |
| deepseek-coder-v2:16b | 10GB | Strong reasoning |
| codestral:22b | 13GB | Fast completion |
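Putting the table to use: pick the tag that fits your GPU's VRAM, pull it on the master, then set it as infinicode's default. A sketch assuming the 14b tag (the guards just skip steps on machines where the CLIs aren't installed):

```shell
# Pick the model that fits your VRAM (see the table above)
MODEL="qwen2.5-coder:14b"

# Pull it on the Ollama master (skipped if ollama isn't on this machine)
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"
fi

# Make it infinicode's default model
if command -v infinicode >/dev/null 2>&1; then
  infinicode config --set "defaultModel=$MODEL"
fi
```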

Architecture

┌─────────────────┐
│   Laptop        │
│  (infinicode)   │──┐
└─────────────────┘  │
                     │    ┌─────────────────┐
┌─────────────────┐  │    │   GPU Server    │
│   Desktop       │──┼───▶│    (Ollama)     │
│  (infinicode)   │  │    │                 │
└─────────────────┘  │    │ qwen2.5-coder   │
                     │    │ codestral       │
┌─────────────────┐  │    └─────────────────┘
│   Server        │──┘
│  (infinicode)   │
└─────────────────┘

Security

Exposing Ollama to your network means anyone on that network can use it.

Secure options:

  1. VPN/Tailscale — Only accessible on private network
  2. SSH Tunnel — ssh -L 11434:localhost:11434 gpu-server
  3. Firewall — Allow only specific IPs
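Option 2 can be sketched end to end; gpu-server here is a hypothetical SSH host alias for the machine running Ollama:

```shell
GPU_HOST="gpu-server"   # hypothetical SSH alias for the Ollama machine

# -f: background, -N: no remote command, -L: forward local port 11434
ssh -f -N -L 11434:localhost:11434 "$GPU_HOST" 2>/dev/null \
  || echo "could not open tunnel to $GPU_HOST"

# With the tunnel up, Ollama appears local, so clients connect to loopback
if command -v infinicode >/dev/null 2>&1; then
  infinicode connect 127.0.0.1
fi
```

Nothing then listens on the network at all: traffic between client and master rides the encrypted SSH connection.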

Requirements

  • Node.js 20+
  • OpenCode installed (npm install -g opencode)
  • Ollama running somewhere on your network

License

MIT — Based on OpenCode


⚡ Built for sovereign computing. Your code, your models, your hardware.