frog-ai

v1.0.2

A powerful terminal-based AI assistant for developers, providing intelligent coding assistance directly in your terminal.

Frog

Features

  • Multi-Model: choose from a wide range of LLMs or add your own via OpenAI- or Anthropic-compatible APIs
  • Flexible: switch LLMs mid-session while preserving context
  • Session-Based: maintain multiple work sessions and contexts per project
  • LSP-Enhanced: Frog uses LSPs for additional context, just like you do
  • Extensible: add capabilities via MCPs (http, stdio, and sse)
  • Works Everywhere: first-class support in every terminal on macOS, Linux, Windows (PowerShell and WSL), FreeBSD, OpenBSD, and NetBSD

Installation

Use a package manager:

# Homebrew (coming soon)
# brew install colygon/tap/frog

# NPM (coming soon)
# npm install -g @colygon/frog

# Arch Linux (btw)
yay -S frog-bin

# Nix
nix run github:numtide/nix-ai-tools#frog

Windows users:

# Winget (coming soon)
# winget install colygon.frog

# Scoop (coming soon)
# scoop bucket add frog https://github.com/colygon/scoop-bucket.git
# scoop install frog

Frog is available via NUR in nur.repos.colygon.frog (coming soon).

You can also try out Frog via nix-shell:

# Add the NUR channel.
nix-channel --add https://github.com/nix-community/NUR/archive/main.tar.gz nur
nix-channel --update

# Get Frog in a Nix shell (coming soon).
# nix-shell -p '(import <nur> { pkgs = import <nixpkgs> {}; }).repos.colygon.frog'

NixOS & Home Manager Module Usage via NUR

Frog provides NixOS and Home Manager modules via NUR. You can use these modules directly in your flake by importing them from NUR. Since the module auto-detects whether it's running in a Home Manager or NixOS context, you can import it the same way in either case.

{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    nur.url = "github:nix-community/NUR";
  };

  outputs = { self, nixpkgs, nur, ... }: {
    nixosConfigurations.your-hostname = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        nur.modules.nixos.default
        nur.repos.colygon.modules.frog  # coming soon
        {
          programs.frog = {
            enable = true;
            settings = {
              providers = {
                openai = {
                  id = "openai";
                  name = "OpenAI";
                  base_url = "https://api.openai.com/v1";
                  type = "openai";
                  api_key = "sk-fake123456789abcdef...";
                  models = [
                    {
                      id = "gpt-4";
                      name = "GPT-4";
                    }
                  ];
                };
              };
              lsp = {
                go = { command = "gopls"; enabled = true; };
                nix = { command = "nil"; enabled = true; };
              };
              options = {
                context_paths = [ "/etc/nixos/configuration.nix" ];
                tui = { compact_mode = true; };
                debug = false;
              };
            };
          };
        }
      ];
    };
  };
}

Debian/Ubuntu:

sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg
echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list
sudo apt update && sudo apt install frog

Fedora/RHEL:

echo '[charm]
name=Charm
baseurl=https://repo.charm.sh/yum/
enabled=1
gpgcheck=1
gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo
sudo yum install frog

Or, download it:

  • Packages are available in Debian and RPM formats
  • Binaries are available for Linux, macOS, Windows, FreeBSD, OpenBSD, and NetBSD

Or just install it with Go:

go install github.com/colygon/frog@latest

[!WARNING] Productivity may increase when using Frog, and you may find yourself nerd sniped when first using the application. If the symptoms persist, join the Discord and nerd snipe the rest of us.

Getting Started

The quickest way to get started is to grab an API key for your preferred provider, such as Anthropic, OpenAI, Groq, or OpenRouter, and start Frog. You'll be prompted to enter your API key.

That said, you can also set environment variables for preferred providers.

| Environment Variable       | Provider                                           |
| -------------------------- | -------------------------------------------------- |
| ANTHROPIC_API_KEY          | Anthropic                                          |
| OPENAI_API_KEY             | OpenAI                                             |
| OPENROUTER_API_KEY         | OpenRouter                                         |
| GEMINI_API_KEY             | Google Gemini                                      |
| CEREBRAS_API_KEY           | Cerebras                                           |
| HF_TOKEN                   | Huggingface Inference                              |
| VERTEXAI_PROJECT           | Google Cloud VertexAI (Gemini)                     |
| VERTEXAI_LOCATION          | Google Cloud VertexAI (Gemini)                     |
| GROQ_API_KEY               | Groq                                               |
| AWS_ACCESS_KEY_ID          | Amazon Bedrock (Claude)                            |
| AWS_SECRET_ACCESS_KEY      | Amazon Bedrock (Claude)                            |
| AWS_REGION                 | Amazon Bedrock (Claude)                            |
| AWS_PROFILE                | Amazon Bedrock (Custom Profile)                    |
| AWS_BEARER_TOKEN_BEDROCK   | Amazon Bedrock                                     |
| AZURE_OPENAI_API_ENDPOINT  | Azure OpenAI models                                |
| AZURE_OPENAI_API_KEY       | Azure OpenAI models (optional when using Entra ID) |
| AZURE_OPENAI_API_VERSION   | Azure OpenAI models                                |
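
For example, a minimal sketch assuming you use Anthropic (the key value below is a placeholder):

# Set the provider key for your shell session, then launch Frog.
export ANTHROPIC_API_KEY="sk-ant-your-key-here"  # placeholder
frog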

By the Way

Is there a provider you'd like to see in Frog? Is there an existing model that needs an update?

Frog's default model listing is managed in Catwalk, a community-supported, open source repository maintained by Charmbracelet. This is a shared resource that benefits the entire AI tooling ecosystem, and you're welcome to contribute.

Configuration

Frog runs great with no configuration. That said, if you do need or want to customize Frog, configuration can be added either locally to the project or globally, with the following priority:

  1. .frog.json
  2. frog.json
  3. $HOME/.config/frog/frog.json

Configuration itself is stored as a JSON object:

{
  "this-setting": { "this": "that" },
  "that-setting": ["ceci", "cela"]
}

As an additional note, Frog also stores ephemeral data, such as application state, in one additional location:

# Unix
$HOME/.local/share/frog/frog.json

# Windows
%LOCALAPPDATA%\frog\frog.json

LSPs

Frog can use LSPs for additional context to help inform its decisions, just like you would. LSPs can be added manually like so:

{
  "$schema": "https://charm.land/frog.json",
  "lsp": {
    "go": {
      "command": "gopls",
      "env": {
        "GOTOOLCHAIN": "go1.24.5"
      }
    },
    "typescript": {
      "command": "typescript-language-server",
      "args": ["--stdio"]
    },
    "nix": {
      "command": "nil"
    }
  }
}

MCPs

Frog also supports Model Context Protocol (MCP) servers through three transport types: stdio for command-line servers, http for HTTP endpoints, and sse for Server-Sent Events. Environment variable expansion is supported using $(echo $VAR) syntax.

{
  "$schema": "https://charm.land/frog.json",
  "mcp": {
    "filesystem": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/mcp-server.js"],
      "timeout": 120,
      "disabled": false,
      "disabled_tools": ["some-tool-name"],
      "env": {
        "NODE_ENV": "production"
      }
    },
    "github": {
      "type": "http",
      "url": "https://api.githubcopilot.com/mcp/",
      "timeout": 120,
      "disabled": false,
      "disabled_tools": ["create_issue", "create_pull_request"],
      "headers": {
        "Authorization": "Bearer $GH_PAT"
      }
    },
    "streaming-service": {
      "type": "sse",
      "url": "https://example.com/mcp/sse",
      "timeout": 120,
      "disabled": false,
      "headers": {
        "API-Key": "$(echo $API_KEY)"
      }
    }
  }
}

Ignoring Files

Frog respects .gitignore files by default, but you can also create a .frogignore file to specify additional files and directories that Frog should ignore. This is useful for excluding files that you want in version control but don't want Frog to consider when providing context.

The .frogignore file uses the same syntax as .gitignore and can be placed in the root of your project or in subdirectories.
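
For illustration, a hypothetical .frogignore might exclude generated output and large fixtures that are tracked in Git but aren't useful as context:

# Build output and generated files
dist/
coverage/

# Large fixtures kept in version control
testdata/fixtures/
*.snap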

Allowing Tools

By default, Frog will ask you for permission before running tool calls. If you'd like, you can allow tools to be executed without prompting you for permissions. Use this with care.

{
  "$schema": "https://charm.land/frog.json",
  "permissions": {
    "allowed_tools": [
      "view",
      "ls",
      "grep",
      "edit",
      "mcp_context7_get-library-doc"
    ]
  }
}

You can also skip all permission prompts entirely by running Frog with the --yolo flag. Be very, very careful with this feature.
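
For example, to start a session with every permission prompt skipped:

frog --yolo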

Disabling Built-In Tools

If you'd like to prevent Frog from using certain built-in tools entirely, you can disable them via the options.disabled_tools list. Disabled tools are completely hidden from the agent.

{
  "$schema": "https://charm.land/frog.json",
  "options": {
    "disabled_tools": [
      "bash",
      "sourcegraph"
    ]
  }
}

To disable tools from MCP servers, see the MCP config section.

Initialization

When you initialize a project, Frog analyzes your codebase and creates a context file that helps it work more effectively in future sessions. By default, this file is named AGENTS.md, but you can customize the name and location with the initialize_as option:

{
  "$schema": "https://charm.land/frog.json",
  "options": {
    "initialize_as": "AGENTS.md"
  }
}

This is useful if you prefer a different naming convention or want to place the file in a specific directory (e.g., FROG.md or docs/LLMs.md). Frog will fill the file with project-specific context like build commands, code patterns, and conventions it discovered during initialization.

Attribution Settings

By default, Frog adds attribution information to Git commits and pull requests it creates. You can customize this behavior with the attribution option:

{
  "$schema": "https://charm.land/frog.json",
  "options": {
    "attribution": {
      "trailer_style": "co-authored-by",
      "generated_with": true
    }
  }
}

  • trailer_style: Controls the attribution trailer added to commit messages (default: assisted-by)
    • assisted-by: Adds Assisted-by: [Model Name] via Frog <[email protected]> (includes the model name)
    • co-authored-by: Adds Co-Authored-By: Frog <[email protected]>
    • none: No attribution trailer
  • generated_with: When true (default), adds 💘 Generated with Frog line to commit messages and PR descriptions
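
With the configuration above (co-authored-by trailer plus generated_with enabled), for example, a commit message created by Frog would end with something like:

💘 Generated with Frog

Co-Authored-By: Frog <[email protected]>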

Custom Providers

Frog supports custom provider configurations for both OpenAI-compatible and Anthropic-compatible APIs.

[!NOTE] Note that we support two "types" for OpenAI. Make sure to choose the right one to ensure the best experience!

  • openai should be used when proxying or routing requests through OpenAI.
  • openai-compat should be used when using non-OpenAI providers that have OpenAI-compatible APIs.

OpenAI-Compatible APIs

Here’s an example configuration for Deepseek, which uses an OpenAI-compatible API. Don't forget to set DEEPSEEK_API_KEY in your environment.

{
  "$schema": "https://charm.land/frog.json",
  "providers": {
    "deepseek": {
      "type": "openai-compat",
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "$DEEPSEEK_API_KEY",
      "models": [
        {
          "id": "deepseek-chat",
          "name": "Deepseek V3",
          "cost_per_1m_in": 0.27,
          "cost_per_1m_out": 1.1,
          "cost_per_1m_in_cached": 0.07,
          "cost_per_1m_out_cached": 1.1,
          "context_window": 64000,
          "default_max_tokens": 5000
        }
      ]
    }
  }
}

Anthropic-Compatible APIs

Custom Anthropic-compatible providers follow this format:

{
  "$schema": "https://charm.land/frog.json",
  "providers": {
    "custom-anthropic": {
      "type": "anthropic",
      "base_url": "https://api.anthropic.com/v1",
      "api_key": "$ANTHROPIC_API_KEY",
      "extra_headers": {
        "anthropic-version": "2023-06-01"
      },
      "models": [
        {
          "id": "claude-sonnet-4-20250514",
          "name": "Claude Sonnet 4",
          "cost_per_1m_in": 3,
          "cost_per_1m_out": 15,
          "cost_per_1m_in_cached": 3.75,
          "cost_per_1m_out_cached": 0.3,
          "context_window": 200000,
          "default_max_tokens": 50000,
          "can_reason": true,
          "supports_attachments": true
        }
      ]
    }
  }
}

Amazon Bedrock

Frog currently supports running Anthropic models through Bedrock, with caching disabled.

  • A Bedrock provider will appear once you have AWS configured, i.e. aws configure
  • Frog also expects the AWS_REGION or AWS_DEFAULT_REGION to be set
  • To use a specific AWS profile set AWS_PROFILE in your environment, i.e. AWS_PROFILE=myprofile frog
  • Alternatively to aws configure, you can also just set AWS_BEARER_TOKEN_BEDROCK
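
Putting the points above together, a minimal sketch for launching Frog against Bedrock with a named profile (the profile name and region are placeholders):

# Assumes credentials were already set up with `aws configure`.
export AWS_PROFILE=myprofile   # placeholder profile
export AWS_REGION=us-east-1    # placeholder region
frog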

Vertex AI Platform

Vertex AI will appear in the list of available providers when VERTEXAI_PROJECT and VERTEXAI_LOCATION are set. You will also need to be authenticated:

gcloud auth application-default login
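
As a sketch, assuming a hypothetical project ID and location (substitute your own values):

export VERTEXAI_PROJECT=my-project   # placeholder project ID
export VERTEXAI_LOCATION=us-east5    # placeholder location
frog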

To add specific models to the configuration, configure as such:

{
  "$schema": "https://charm.land/frog.json",
  "providers": {
    "vertexai": {
      "models": [
        {
          "id": "claude-sonnet-4@20250514",
          "name": "VertexAI Sonnet 4",
          "cost_per_1m_in": 3,
          "cost_per_1m_out": 15,
          "cost_per_1m_in_cached": 3.75,
          "cost_per_1m_out_cached": 0.3,
          "context_window": 200000,
          "default_max_tokens": 50000,
          "can_reason": true,
          "supports_attachments": true
        }
      ]
    }
  }
}

Local Models

Local models can also be configured via an OpenAI-compatible API. Here are two common examples:

Ollama

{
  "providers": {
    "ollama": {
      "name": "Ollama",
      "base_url": "http://localhost:11434/v1/",
      "type": "openai-compat",
      "models": [
        {
          "name": "Qwen 3 30B",
          "id": "qwen3:30b",
          "context_window": 256000,
          "default_max_tokens": 20000
        }
      ]
    }
  }
}

LM Studio

{
  "providers": {
    "lmstudio": {
      "name": "LM Studio",
      "base_url": "http://localhost:1234/v1/",
      "type": "openai-compat",
      "models": [
        {
          "name": "Qwen 3 30B",
          "id": "qwen/qwen3-30b-a3b-2507",
          "context_window": 256000,
          "default_max_tokens": 20000
        }
      ]
    }
  }
}

Logging

Sometimes you need to look at logs. Luckily, Frog logs all sorts of stuff. Logs are stored in ./.frog/logs/frog.log relative to the project.

The CLI also contains some helper commands to make perusing recent logs easier:

# Print the last 1000 lines
frog logs

# Print the last 500 lines
frog logs --tail 500

# Follow logs in real time
frog logs --follow

Want more logging? Run frog with the --debug flag, or enable it in the config:

{
  "$schema": "https://charm.land/frog.json",
  "options": {
    "debug": true,
    "debug_lsp": true
  }
}

Provider Auto-Updates

By default, Frog automatically checks for the latest and greatest list of providers and models from Catwalk, the open source model database maintained by Charmbracelet. This means that when new providers and models are available, or when model metadata changes, Frog automatically updates your local configuration.

Disabling automatic provider updates

For those with restricted internet access, or those who prefer to work in air-gapped environments, this might not be what you want, and the feature can be disabled.

To disable automatic provider updates, set disable_provider_auto_update in your frog.json config:

{
  "$schema": "https://charm.land/frog.json",
  "options": {
    "disable_provider_auto_update": true
  }
}

Or set the FROG_DISABLE_PROVIDER_AUTO_UPDATE environment variable:

export FROG_DISABLE_PROVIDER_AUTO_UPDATE=1

Manually updating providers

Manually updating providers is possible with the frog update-providers command:

# Update providers remotely from Catwalk.
frog update-providers

# Update providers from a custom Catwalk base URL.
frog update-providers https://example.com/

# Update providers from a local file.
frog update-providers /path/to/local-providers.json

# Reset providers to the version embedded in frog at build time.
frog update-providers embedded

# For more info:
frog update-providers --help

Metrics

Frog records pseudonymous usage metrics (tied to a device-specific hash), which maintainers rely on to inform development and support priorities. The metrics include solely usage metadata; prompts and responses are NEVER collected.

Details on exactly what's collected are in the source code.

You can opt out of metrics collection at any time by setting the following environment variable:

export FROG_DISABLE_METRICS=1

Or by setting the following in your config:

{
  "options": {
    "disable_metrics": true
  }
}

Frog also respects the DO_NOT_TRACK convention which can be enabled via export DO_NOT_TRACK=1.

Contributing

See the contributing guide.

Whatcha think?

We'd love to hear your thoughts on this project. Feel free to open an issue or discussion on GitHub.

License

FSL-1.1-MIT


Acknowledgments

Frog builds upon the excellent work of Charmbracelet and uses many of their fantastic terminal UI libraries. Special thanks to the Charm team for their contributions to the open source community.