
@luizguil99/prismium

v1.0.2

Published

Prismium - A powerful terminal-based AI assistant for developers

Downloads: 20

Readme

Crush

Features

  • Multi-Model: choose from a wide range of LLMs or add your own via OpenAI- or Anthropic-compatible APIs
  • Flexible: switch LLMs mid-session while preserving context
  • Session-Based: maintain multiple work sessions and contexts per project
  • LSP-Enhanced: Crush uses LSPs for additional context, just like you do
  • Native Terminal: execute shell commands directly with !command syntax, without sending them to the AI
  • Extensible: add capabilities via MCPs (http, stdio, and sse)
  • Works Everywhere: first-class support in every terminal on macOS, Linux, Windows (PowerShell and WSL), FreeBSD, OpenBSD, and NetBSD

Installation

Use a package manager:

# Homebrew
brew install charmbracelet/tap/crush

# NPM
npm install -g @charmland/crush

# Arch Linux (btw)
yay -S crush-bin

# Nix
nix run github:numtide/nix-ai-tools#crush

Windows users:

# Winget
winget install charmbracelet.crush

# Scoop
scoop bucket add charm https://github.com/charmbracelet/scoop-bucket.git
scoop install crush

Crush is available via NUR in nur.repos.charmbracelet.crush.

You can also try out Crush via nix-shell:

# Add the NUR channel.
nix-channel --add https://github.com/nix-community/NUR/archive/main.tar.gz nur
nix-channel --update

# Get Crush in a Nix shell.
nix-shell -p '(import <nur> { pkgs = import <nixpkgs> {}; }).repos.charmbracelet.crush'

Debian/Ubuntu:

sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg
echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list
sudo apt update && sudo apt install crush

Fedora/RHEL:

echo '[charm]
name=Charm
baseurl=https://repo.charm.sh/yum/
enabled=1
gpgcheck=1
gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo
sudo yum install crush

Or, download it:

  • Packages are available in Debian and RPM formats
  • Binaries are available for Linux, macOS, Windows, FreeBSD, OpenBSD, and NetBSD

Or just install it with Go:

go install github.com/charmbracelet/crush@latest

Warning: Productivity may increase when using Crush and you may find yourself nerd sniped when first using the application. If the symptoms persist, join the Discord and nerd snipe the rest of us.

Getting Started

The quickest way to get started is to grab an API key for your preferred provider such as Anthropic, OpenAI, Groq, or OpenRouter and just start Crush. You'll be prompted to enter your API key.

That said, you can also set environment variables for preferred providers.

| Environment Variable     | Provider                                           |
| ------------------------ | -------------------------------------------------- |
| ANTHROPIC_API_KEY        | Anthropic                                          |
| OPENAI_API_KEY           | OpenAI                                             |
| OPENROUTER_API_KEY       | OpenRouter                                         |
| GEMINI_API_KEY           | Google Gemini                                      |
| VERTEXAI_PROJECT         | Google Cloud VertexAI (Gemini)                     |
| VERTEXAI_LOCATION        | Google Cloud VertexAI (Gemini)                     |
| GROQ_API_KEY             | Groq                                               |
| AWS_ACCESS_KEY_ID        | AWS Bedrock (Claude)                               |
| AWS_SECRET_ACCESS_KEY    | AWS Bedrock (Claude)                               |
| AWS_REGION               | AWS Bedrock (Claude)                               |
| AZURE_OPENAI_ENDPOINT    | Azure OpenAI models                                |
| AZURE_OPENAI_API_KEY     | Azure OpenAI models (optional when using Entra ID) |
| AZURE_OPENAI_API_VERSION | Azure OpenAI models                                |
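
For example, if you're using Anthropic you could export the key and launch Crush directly (the key value below is just a placeholder):

# Set your provider API key (placeholder value) and start Crush.
export ANTHROPIC_API_KEY="sk-ant-..."
crush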

By the Way

Is there a provider you’d like to see in Crush? Is there an existing model that needs an update?

Crush’s default model listing is managed in Catwalk, a community-supported, open source repository of Crush-compatible models, and you’re welcome to contribute.

Native Terminal Commands

Crush supports executing shell commands directly in the terminal without sending them to the AI. This is perfect for quick file operations, checking system status, or running development tools.

Usage

Simply prefix any command with ! to execute it natively:

!ls                    # List files and directories
!pwd                   # Show current directory  
!cat package.json      # Display file contents
!git status            # Check git status
!npm test              # Run tests
!ps aux | grep node    # Find Node processes

Keyboard Shortcuts

  • Ctrl+T: Quickly add ! prefix to the current input
  • Ctrl+P: Open commands dialog (existing functionality)
  • Enter: Execute the command when prefixed with !

Built-in Help

Type !help to see a list of common commands and usage instructions.

Benefits

  • Fast: No AI processing delay for simple commands
  • Integrated: Output appears directly in your chat history
  • Contextual: Commands run in your current working directory
  • Familiar: Works with any shell command available on your system

Configuration

Crush runs great with no configuration. That said, if you do need or want to customize Crush, configuration can be added either local to the project itself, or globally, with the following priority:

  1. .crush.json
  2. crush.json
  3. $HOME/.config/crush/crush.json (Windows: %USERPROFILE%\AppData\Local\crush\crush.json)

Configuration itself is stored as a JSON object:

{
   "this-setting": {"this": "that"},
   "that-setting": ["ceci", "cela"]
}
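
For instance, a minimal project-local crush.json that just turns on debug logging (an option covered in the Logging section below) might look like:

{
  "$schema": "https://charm.land/crush.json",
  "options": {
    "debug": true
  }
}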

As an additional note, Crush also stores ephemeral data, such as application state, in one additional location:

# Unix
$HOME/.local/share/crush/crush.json

# Windows
%LOCALAPPDATA%\crush\crush.json

LSPs

Crush can use LSPs for additional context to help inform its decisions, just like you would. LSPs can be added manually like so:

{
  "$schema": "https://charm.land/crush.json",
  "lsp": {
    "go": {
      "command": "gopls",
      "env": {
        "GOTOOLCHAIN": "go1.24.5"
      }
    },
    "typescript": {
      "command": "typescript-language-server",
      "args": ["--stdio"]
    },
    "nix": {
      "command": "nil"
    }
  }
}

MCPs

Crush also supports Model Context Protocol (MCP) servers through three transport types: stdio for command-line servers, http for HTTP endpoints, and sse for Server-Sent Events. Environment variable expansion is supported using $(echo $VAR) syntax.

{
  "$schema": "https://charm.land/crush.json",
  "mcp": {
    "filesystem": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/mcp-server.js"],
      "env": {
        "NODE_ENV": "production"
      }
    },
    "github": {
      "type": "http",
      "url": "https://example.com/mcp/",
      "headers": {
        "Authorization": "$(echo Bearer $EXAMPLE_MCP_TOKEN)"
      }
    },
    "streaming-service": {
      "type": "sse",
      "url": "https://example.com/mcp/sse",
      "headers": {
        "API-Key": "$(echo $API_KEY)"
      }
    }
  }
}

Ignoring Files

Crush respects .gitignore files by default, but you can also create a .crushignore file to specify additional files and directories that Crush should ignore. This is useful for excluding files that you want in version control but don't want Crush to consider when providing context.

The .crushignore file uses the same syntax as .gitignore and can be placed in the root of your project or in subdirectories.
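
For example, a .crushignore at the project root might look like this (the patterns are purely illustrative):

# Keep these in version control, but out of Crush's context.
dist/
coverage/
*.log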

Allowing Tools

By default, Crush will ask you for permission before running tool calls. If you'd like, you can allow tools to be executed without prompting you for permissions. Use this with care.

{
  "$schema": "https://charm.land/crush.json",
  "permissions": {
    "allowed_tools": [
      "view",
      "ls",
      "grep",
      "edit",
      "mcp_context7_get-library-doc"
    ]
  }
}

You can also skip all permission prompts entirely by running Crush with the --yolo flag. Be very, very careful with this feature.

Local Models

Local models can also be configured via an OpenAI-compatible API. Here are two common examples:

Ollama

{
  "providers": {
    "ollama": {
      "name": "Ollama",
      "base_url": "http://localhost:11434/v1/",
      "type": "openai",
      "models": [
        {
          "name": "Qwen 3 30B",
          "id": "qwen3:30b",
          "context_window": 256000,
          "default_max_tokens": 20000
        }
      ]
    }
  }
}

LM Studio

{
  "providers": {
    "lmstudio": {
      "name": "LM Studio",
      "base_url": "http://localhost:1234/v1/",
      "type": "openai",
      "models": [
        {
          "name": "Qwen 3 30B",
          "id": "qwen/qwen3-30b-a3b-2507",
          "context_window": 256000,
          "default_max_tokens": 20000
        }
      ]
    }
  }
}

Custom Providers

Crush supports custom provider configurations for both OpenAI-compatible and Anthropic-compatible APIs.

OpenAI-Compatible APIs

Here’s an example configuration for Deepseek, which uses an OpenAI-compatible API. Don't forget to set DEEPSEEK_API_KEY in your environment.

{
  "$schema": "https://charm.land/crush.json",
  "providers": {
    "deepseek": {
      "type": "openai",
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "$DEEPSEEK_API_KEY",
      "models": [
        {
          "id": "deepseek-chat",
          "name": "Deepseek V3",
          "cost_per_1m_in": 0.27,
          "cost_per_1m_out": 1.1,
          "cost_per_1m_in_cached": 0.07,
          "cost_per_1m_out_cached": 1.1,
          "context_window": 64000,
          "default_max_tokens": 5000
        }
      ]
    }
  }
}

Anthropic-Compatible APIs

Custom Anthropic-compatible providers follow this format:

{
  "$schema": "https://charm.land/crush.json",
  "providers": {
    "custom-anthropic": {
      "type": "anthropic",
      "base_url": "https://api.anthropic.com/v1",
      "api_key": "$ANTHROPIC_API_KEY",
      "extra_headers": {
        "anthropic-version": "2023-06-01"
      },
      "models": [
        {
          "id": "claude-sonnet-4-20250514",
          "name": "Claude Sonnet 4",
          "cost_per_1m_in": 3,
          "cost_per_1m_out": 15,
          "cost_per_1m_in_cached": 3.75,
          "cost_per_1m_out_cached": 0.3,
          "context_window": 200000,
          "default_max_tokens": 50000,
          "can_reason": true,
          "supports_attachments": true
        }
      ]
    }
  }
}

Amazon Bedrock

Crush currently supports running Anthropic models through Bedrock, with caching disabled.

  • A Bedrock provider will appear once you have AWS configured, e.g. via aws configure
  • Crush also expects AWS_REGION or AWS_DEFAULT_REGION to be set
  • To use a specific AWS profile, set AWS_PROFILE in your environment, e.g. AWS_PROFILE=myprofile crush (see the example below)
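
Putting those together, a typical setup might look like this (the profile and region values are placeholders):

aws configure                    # one-time AWS credential setup
export AWS_REGION=us-east-1      # or AWS_DEFAULT_REGION
AWS_PROFILE=myprofile crush      # run Crush with a specific profile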

Vertex AI Platform

Vertex AI will appear in the list of available providers when VERTEXAI_PROJECT and VERTEXAI_LOCATION are set. You will also need to be authenticated:

gcloud auth application-default login
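
For example (the project and location values are placeholders):

export VERTEXAI_PROJECT=my-gcp-project
export VERTEXAI_LOCATION=us-east5
crush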

To add specific models to the configuration, configure as such:

{
  "$schema": "https://charm.land/crush.json",
  "providers": {
    "vertexai": {
      "models": [
        {
          "id": "claude-sonnet-4@20250514",
          "name": "VertexAI Sonnet 4",
          "cost_per_1m_in": 3,
          "cost_per_1m_out": 15,
          "cost_per_1m_in_cached": 3.75,
          "cost_per_1m_out_cached": 0.3,
          "context_window": 200000,
          "default_max_tokens": 50000,
          "can_reason": true,
          "supports_attachments": true
        }
      ]
    }
  }
}

A Note on Claude Max and GitHub Copilot

Crush only supports model providers through official, compliant APIs. We do not support or endorse any methods that rely on personal Claude Max and GitHub Copilot accounts or OAuth workarounds, which may violate Anthropic and Microsoft’s Terms of Service.

We’re committed to building sustainable, trusted integrations with model providers. If you’re a provider interested in working with us, reach out.

Logging

Sometimes you need to look at logs. Luckily, Crush logs all sorts of stuff. Logs are stored in ./.crush/logs/crush.log relative to the project.

The CLI also contains some helper commands to make perusing recent logs easier:

# Print the last 1000 lines
crush logs

# Print the last 500 lines
crush logs --tail 500

# Follow logs in real time
crush logs --follow

Want more logging? Run crush with the --debug flag, or enable it in the config:

{
  "$schema": "https://charm.land/crush.json",
  "options": {
    "debug": true,
    "debug_lsp": true
  }
}

Whatcha think?

We’d love to hear your thoughts on this project. Need help? We gotchu. Come find us on Discord.

License

Crush is open source and welcomes contributions.

FSL-1.1-MIT


Part of Charm.

Charm热爱开源 • Charm loves open source