
chatcn-cli v0.1.0

chatcn

Scaffold production-ready AI chatbot templates into your shadcn project.

  • npm: https://www.npmjs.com/package/chatcn-cli
  • GitHub: https://github.com/jeiwinfrey/chatcn

Quickstart

Initialize a chatbot in your existing shadcn project:

npx chatcn-cli init

Prefer a different package runner? Use the one that matches your setup:

pnpm dlx chatcn-cli init
yarn dlx chatcn-cli init
bunx chatcn-cli init

This will:

  1. Detect your framework and package manager
  2. Let you choose a chatbot template
  3. Let you choose an AI provider
  4. Let you choose a model, or use the provider's recommended default
  5. Install required shadcn components
  6. Generate all necessary files

Prerequisites

chatcn requires an existing project with shadcn initialized. If you haven't set up shadcn yet:

npx shadcn@latest init

Templates

chatcn provides 5 chatbot templates:

chatbot-basic

Pick this if you are building a simple chatbot and want the smallest starting point.

shadcn components: button, input, scroll-area

chatbot-ui

Pick this if you want a polished chatbot UI with message bubbles, markdown, and loading states.

shadcn components: button, input, scroll-area, card, avatar, skeleton

chatbot-assistant

Pick this if you are building a reusable assistant and want cleaner separation between UI, hook, and LLM logic.

shadcn components: button, input, scroll-area, card, separator

chatbot-support

Pick this if you are building a support or helpdesk chatbot with quick replies and a guided tone.

shadcn components: button, input, scroll-area, card, badge

chatbot-custom

Pick this if you want the simple chatbot starter with setup-time options for avatars, display names, and loading behavior.

shadcn components: button, input, scroll-area

Providers

chatcn supports 12 AI providers:

  • OpenAI - GPT models (gpt-5-mini default)
  • Anthropic (Claude) - Claude models (claude-3-5-haiku-latest default)
  • OpenRouter - Access to multiple models through one API
  • Google Gemini - Gemini models (gemini-2.5-flash-lite default)
  • AWS Bedrock - Claude and other models via AWS (anthropic.claude-haiku-4-5-20251001-v1:0 default)
  • Groq - Fast inference (meta-llama/llama-4-scout-17b-16e-instruct default)
  • Together AI - Open source models (deepseek-ai/DeepSeek-V3.1 default)
  • Mistral - Mistral models (mistral-small-latest default)
  • xAI (Grok) - Grok models (grok-4.20-beta-latest-non-reasoning default)
  • DeepSeek - DeepSeek models (deepseek-chat default)
  • Cerebras - Ultra-fast inference (gpt-oss-120b default)
  • Fireworks AI - Fast inference (accounts/fireworks/models/kimi-k2-thinking default)

CLI Flags

--template

Specify a template without interactive prompt:

npx chatcn-cli init --template chatbot-ui

Valid values: chatbot-basic, chatbot-ui, chatbot-assistant, chatbot-support, chatbot-custom

--provider

Specify a provider without interactive prompt:

npx chatcn-cli init --provider openai

Valid values: openai, anthropic, openrouter, google, aws-bedrock, groq, together, mistral, xai, deepseek, cerebras, fireworks

--model

Specify the model that chatcn writes into lib/llm.ts and the AI_MODEL environment variable:

npx chatcn-cli init --provider openai --model gpt-5.1

If you skip this flag, chatcn uses the provider's recommended default model.
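The default-model fallback described above can be pictured as a simple lookup over the defaults listed in the Providers section. This is an illustrative sketch only; resolveModel and the map are assumptions, not chatcn's actual internals:

```typescript
// Illustrative sketch: provider -> recommended default model,
// taken from the Providers section of this README.
const DEFAULT_MODELS: Record<string, string> = {
  openai: "gpt-5-mini",
  anthropic: "claude-3-5-haiku-latest",
  openrouter: "openrouter/auto",
  google: "gemini-2.5-flash-lite",
  "aws-bedrock": "anthropic.claude-haiku-4-5-20251001-v1:0",
  groq: "meta-llama/llama-4-scout-17b-16e-instruct",
  together: "deepseek-ai/DeepSeek-V3.1",
  mistral: "mistral-small-latest",
  xai: "grok-4.20-beta-latest-non-reasoning",
  deepseek: "deepseek-chat",
  cerebras: "gpt-oss-120b",
  fireworks: "accounts/fireworks/models/kimi-k2-thinking",
};

// An explicit --model value wins; otherwise the provider default is used.
export function resolveModel(provider: string, modelFlag?: string): string {
  return modelFlag ?? DEFAULT_MODELS[provider];
}
```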

--yes

Skip all prompts and use defaults:

npx chatcn-cli init --yes --template chatbot-basic --provider openai --model gpt-5-mini

--overwrite

Overwrite existing files:

npx chatcn-cli init --overwrite

By default, chatcn will skip files that already exist to protect your custom code.
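The skip-unless-overwrite rule amounts to a one-line check. This is a hypothetical sketch of that rule (shouldWrite is an assumed name, not chatcn's actual code):

```typescript
import { existsSync } from "node:fs";

// Hypothetical sketch of the rule above: write a file only if it does
// not exist yet, unless --overwrite was passed.
export function shouldWrite(path: string, overwrite: boolean): boolean {
  return overwrite || !existsSync(path);
}
```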

--cwd

Target a different directory:

npx chatcn-cli init --cwd ./my-project

Commands

init

Initialize a chatbot with interactive prompts:

npx chatcn-cli init

Or with other runners:

pnpm dlx chatcn-cli init
yarn dlx chatcn-cli init
bunx chatcn-cli init

With flags to skip prompts:

npx chatcn-cli init --template chatbot-ui --provider anthropic --yes --model claude-3-5-haiku-latest

add

Add a chatbot template (same as init, but more explicit):

npx chatcn-cli add --template chatbot-assistant --provider openai --model gpt-5-mini

Supported Frameworks

chatcn automatically detects your framework and generates appropriate code:

  • Next.js (App Router and Pages Router)
  • Vite + React
  • Remix
  • Astro
  • TanStack Start
  • React Router v7
  • Laravel (with Inertia)

Environment Variables

After running chatcn, you'll need to set up environment variables for your chosen provider.

OpenAI

OPENAI_API_KEY=your_api_key_here
AI_MODEL=gpt-5-mini

Anthropic

ANTHROPIC_API_KEY=your_api_key_here
AI_MODEL=claude-3-5-haiku-latest

OpenRouter

OPENROUTER_API_KEY=your_api_key_here
AI_MODEL=openrouter/auto

Google Gemini

GOOGLE_API_KEY=your_api_key_here
AI_MODEL=gemini-2.5-flash-lite

AWS Bedrock

AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your_access_key_here
AWS_SECRET_ACCESS_KEY=your_secret_key_here
AI_MODEL=anthropic.claude-haiku-4-5-20251001-v1:0

Note: AWS Bedrock requires installing @aws-sdk/client-bedrock-runtime:

npm install @aws-sdk/client-bedrock-runtime

Groq

GROQ_API_KEY=your_api_key_here
AI_MODEL=meta-llama/llama-4-scout-17b-16e-instruct

Together AI

TOGETHER_API_KEY=your_api_key_here
AI_MODEL=deepseek-ai/DeepSeek-V3.1

Mistral

MISTRAL_API_KEY=your_api_key_here
AI_MODEL=mistral-small-latest

xAI (Grok)

XAI_API_KEY=your_api_key_here
AI_MODEL=grok-4.20-beta-latest-non-reasoning

DeepSeek

DEEPSEEK_API_KEY=your_api_key_here
AI_MODEL=deepseek-chat

Cerebras

CEREBRAS_API_KEY=your_api_key_here
AI_MODEL=gpt-oss-120b

Fireworks AI

FIREWORKS_API_KEY=your_api_key_here
AI_MODEL=accounts/fireworks/models/kimi-k2-thinking

Create a .env.local file (Next.js) or .env file (other frameworks) in your project root with the appropriate variables.
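A generated integration typically fails fast when these variables are missing. As a hedged sketch of the OpenAI case (resolveConfig and the exact error text are assumptions, not the generated code):

```typescript
// Sketch only: read the OpenAI variables documented above and fail fast
// with a pointer to .env.local / .env when the key is missing.
export function resolveConfig(
  env: Record<string, string | undefined> = process.env
) {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error(
      "OPENAI_API_KEY is not set. Add it to .env.local (Next.js) or .env."
    );
  }
  // Fall back to the provider's recommended default when AI_MODEL is unset.
  return { apiKey, model: env.AI_MODEL ?? "gpt-5-mini" };
}
```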

Using the Generated Components

After running chatcn, you'll have a chat component ready to use in your application.

Next.js (App Router)

import { Chat } from "@/components/chat";

export default function Page() {
  return (
    <main className="container mx-auto p-4">
      <h1 className="text-2xl font-bold mb-4">AI Chatbot</h1>
      <Chat />
    </main>
  );
}

Next.js (Pages Router)

import { Chat } from "@/components/chat";

export default function Home() {
  return (
    <div className="container mx-auto p-4">
      <h1 className="text-2xl font-bold mb-4">AI Chatbot</h1>
      <Chat />
    </div>
  );
}

Vite

import { Chat } from "@/components/chat";

function App() {
  return (
    <div className="container mx-auto p-4">
      <h1 className="text-2xl font-bold mb-4">AI Chatbot</h1>
      <Chat />
    </div>
  );
}

export default App;

Remix

import { Chat } from "~/components/chat";

export default function Index() {
  return (
    <div className="container mx-auto p-4">
      <h1 className="text-2xl font-bold mb-4">AI Chatbot</h1>
      <Chat />
    </div>
  );
}

What Gets Generated

chatcn generates the following files:

  1. Component files - React components for the chatbot UI
  2. Hook files - React hooks for managing chat state and streaming
  3. LLM file - Provider-specific API integration (lib/llm.ts)
  4. API route - Backend endpoint for handling chat requests (framework-specific)

Example File Structure (Next.js)

your-project/
├── components/
│   ├── chat.tsx              # Main chat component
│   └── chat-message.tsx      # Message component (chatbot-ui only)
├── hooks/
│   └── use-chat.ts           # Chat state management hook
├── lib/
│   └── llm.ts                # Provider API integration
└── app/
    └── api/
        └── chat/
            └── route.ts      # API route handler
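The hook's job (item 2 above) is mostly bookkeeping over an ordered message list. A minimal, framework-free sketch of that state logic, where the names and shapes are assumptions rather than the generated hook's API:

```typescript
// Sketch only: the message bookkeeping a generated use-chat hook performs.
type Role = "user" | "assistant";
export interface Message {
  role: Role;
  content: string;
}

// Sending appends the user's message plus an empty assistant placeholder
// that streamed chunks are accumulated into.
export function appendUserMessage(
  messages: Message[],
  content: string
): Message[] {
  return [
    ...messages,
    { role: "user", content },
    { role: "assistant", content: "" },
  ];
}

// Each streamed chunk extends the last (assistant) message immutably,
// which keeps React state updates cheap and predictable.
export function appendChunk(messages: Message[], chunk: string): Message[] {
  const last = messages[messages.length - 1];
  return [...messages.slice(0, -1), { ...last, content: last.content + chunk }];
}
```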

Examples

Basic chatbot with OpenAI

npx chatcn-cli init --template chatbot-basic --provider openai --yes

Polished UI with Anthropic

npx chatcn-cli init --template chatbot-ui --provider anthropic --yes

Support chatbot with Groq

npx chatcn-cli init --template chatbot-support --provider groq --yes

Add another template to existing project

npx chatcn-cli add --template chatbot-assistant --provider google --overwrite

Troubleshooting

"shadcn is not initialized"

Run npx shadcn@latest init first to set up shadcn in your project.

"File already exists"

Use the --overwrite flag to replace existing files, or manually remove the files you want to regenerate.

API route not working

Make sure you've set the required environment variables for your provider. Check your .env.local or .env file.

TypeScript errors

Run npm install to ensure all dependencies are installed. For AWS Bedrock, you'll need to manually install @aws-sdk/client-bedrock-runtime.

Development and Testing

Development

Run the built-in checks before cutting a release:

npm run typecheck
npm test
npm run build

Contributing

If you want to help improve chatcn, start here:

Running Tests

# Run all automated tests
npm test

# Run tests in watch mode
npm run test:watch

# Build the CLI
npm run build

# Test global installation
./test-global-install.sh

License

MIT. See LICENSE.