@fyit/crouton-ai

AI integration layer for Nuxt Crouton applications. Provides multi-provider AI chat with streaming support, built on the Vercel AI SDK.

Features

  • Multi-Provider Support - OpenAI, Anthropic Claude via unified API
  • Streaming Chat - Real-time token streaming with useChat composable
  • UI Components - Ready-to-use chat components built with Nuxt UI
  • Server Utilities - Provider factory for server-side AI calls
  • Persistence Schema - Collection schema for the crouton generator

Installation

pnpm add @fyit/crouton-ai

Add to your nuxt.config.ts:

export default defineNuxtConfig({
  extends: ['@fyit/crouton-ai']
})

Configuration

Set your API keys in .env:

OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

# Optional: Set defaults
NUXT_PUBLIC_CROUTON_AI_DEFAULT_PROVIDER=openai
NUXT_PUBLIC_CROUTON_AI_DEFAULT_MODEL=gpt-4o

Quick Start

Basic Chat Interface

<template>
  <AIChatbox class="h-96" />
</template>

With Custom Endpoint

<template>
  <AIChatbox
    api="/api/my-assistant/chat"
    placeholder="Ask me anything..."
    empty-message="How can I help you today?"
  />
</template>

Create a Server Endpoint

// server/api/ai/chat.post.ts
// createAIProvider is auto-imported when extending the layer
// streamText comes from the 'ai' package (Vercel AI SDK)
import { streamText } from 'ai'

export default defineEventHandler(async (event) => {
  const { messages, model } = await readBody(event)

  const ai = createAIProvider(event)

  const result = await streamText({
    model: ai.model(model || 'gpt-4o'),
    messages,
    system: 'You are a helpful assistant.'
  })

  return result.toDataStreamResponse()
})

Components

AIChatbox

Full chat interface combining message list and input.

<AIChatbox
  api="/api/ai/chat"
  placeholder="Type a message..."
  empty-message="Start a conversation..."
/>

Props:

  • api - API endpoint for chat (default: /api/ai/chat)
  • placeholder - Input placeholder text
  • emptyMessage - Message shown when there are no messages yet
  • systemPrompt - System prompt to include with each request (see the sketch below)
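
For reference, a sketch combining all four props (prop names are taken from the list above; the kebab-case system-prompt binding is an assumption following normal Vue conventions):

<!-- Sketch: all AIChatbox props together; system-prompt binding assumed from the props list -->
<AIChatbox
  api="/api/ai/chat"
  placeholder="Ask about your project..."
  empty-message="No messages yet."
  system-prompt="You are a concise assistant."
/>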

AIMessage

Single message bubble component.

<AIMessage
  :message="{ id: '1', role: 'user', content: 'Hello!' }"
  :is-streaming="false"
/>

Props:

  • message - Message object with id, role, content
  • isStreaming - Show streaming indicator

AIInput

Message input with send button.

<AIInput
  v-model="input"
  :loading="isLoading"
  placeholder="Type here..."
  @submit="handleSubmit"
/>

Props:

  • modelValue - Input text (v-model)
  • loading - Show loading state
  • placeholder - Placeholder text
  • disabled - Disable input

Events:

  • update:modelValue - Input changed
  • submit - Send button clicked or Enter pressed (handled explicitly in the sketch below)
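
A minimal sketch of wiring both events explicitly instead of using the v-model shorthand (the ref and handler names here are illustrative, not part of the package):

<script setup lang="ts">
// Sketch only: explicit event handling for AIInput; draft/sending/send are illustrative names
const draft = ref('')
const sending = ref(false)

async function send() {
  if (!draft.value) return
  sending.value = true
  // ...post draft.value to your chat endpoint here...
  sending.value = false
  draft.value = ''
}
</script>

<template>
  <AIInput
    :model-value="draft"
    :loading="sending"
    @update:model-value="value => (draft = value)"
    @submit="send"
  />
</template>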

Composables

useChat

Wraps the Vercel AI SDK's useChat composable with crouton-specific features.

const {
  messages,
  input,
  handleSubmit,
  isLoading,
  error,
  stop,
  reload,
  setMessages,
  append,
  clearMessages,
  exportMessages,
  importMessages
} = useChat({
  api: '/api/ai/chat',
  provider: 'openai',
  model: 'gpt-4o',
  onFinish: (message) => console.log('Done:', message),
  onError: (error) => console.error('Error:', error)
})
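
The helpers beyond the standard AI SDK surface (append, clearMessages, exportMessages, importMessages) are used later in the persistence example; as a quick sketch, assuming append takes a message object and clearMessages resets the list (both assumptions based on their names):

// Sketch: the crouton-specific helpers (signatures assumed from their names and the persistence example below)
const { append, clearMessages, exportMessages } = useChat({ api: '/api/ai/chat' })

// Seed the conversation with a user message
append({ role: 'user', content: 'Summarize my last order.' })

// Snapshot the messages, e.g. before persisting them
const snapshot = exportMessages()

// Start a fresh conversation
clearMessages()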

useCompletion

For single-turn text generation.

const {
  completion,
  input,
  handleSubmit,
  isLoading
} = useCompletion({
  api: '/api/ai/completion'
})
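
The api option points at /api/ai/completion, which is not shown elsewhere in this README. If your app does not already provide it, here is a minimal sketch of such an endpoint, mirroring the chat endpoint above and assuming useCompletion posts a { prompt } body (the AI SDK default):

// server/api/ai/completion.post.ts
// Sketch only: mirrors the chat endpoint; assumes a { prompt } request body
import { streamText } from 'ai'

export default defineEventHandler(async (event) => {
  const { prompt, model } = await readBody(event)

  const ai = createAIProvider(event)

  const result = await streamText({
    model: ai.model(model || 'gpt-4o'),
    prompt
  })

  return result.toDataStreamResponse()
})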

useAIProvider

Access provider configuration.

const {
  defaultProvider,
  defaultModel,
  providers
} = useAIProvider()

Server Utilities

All server utilities are auto-imported when extending the layer.

createAIProvider

Factory function for creating AI provider instances.

// All utilities auto-imported - no import statement needed
export default defineEventHandler(async (event) => {
  const ai = createAIProvider(event)

  // Use OpenAI
  const openai = ai.openai()

  // Use Anthropic
  const anthropic = ai.anthropic()

  // Auto-detect from model name
  const model = ai.model('gpt-4o')  // Returns OpenAI
  const model2 = ai.model('claude-sonnet-4-20250514')  // Returns Anthropic
})

Available Providers

// AI_PROVIDERS and helpers are auto-imported
export default defineEventHandler(async (event) => {
  // Inspect the registry, e.g. the models registered for OpenAI
  console.log(AI_PROVIDERS.openai.models)

  // Get configured providers (with API keys)
  const available = getAvailableProviders(useRuntimeConfig())
})

Generating Collections for Persistence

The package includes a schema for generating a chat conversations collection using the crouton collection generator.

Option 1: CLI Command

# Generate in a new 'ai' layer
pnpm crouton ai chatConversations --fields-file=node_modules/@fyit/crouton-ai/schemas/chat-conversations.json

Option 2: Config File

// crouton.config.js
export default {
  collections: [
    {
      name: 'chatConversations',
      fieldsFile: 'node_modules/@fyit/crouton-ai/schemas/chat-conversations.json'
    }
  ],
  targets: [
    { layer: 'ai', collections: ['chatConversations'] }
  ],
  dialect: 'sqlite'
}

Then run:

pnpm crouton config ./crouton.config.js

Schema Fields

The generated collection includes:

| Field | Type | Description |
|-------|------|-------------|
| title | string | Optional conversation title |
| messages | json | Array of chat messages |
| provider | string | AI provider (openai, anthropic) |
| model | string | Model identifier |
| systemPrompt | text | System prompt used |
| metadata | json | Additional metadata |
| messageCount | number | Cached message count |
| lastMessageAt | date | Last message timestamp |

Plus auto-generated fields: id, teamId, userId, createdAt, updatedAt, createdBy, updatedBy.
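
As a rough guide, the table above maps to TypeScript roughly as below; this is a sketch, not the generated type (import the real types as described under TypeScript Types):

// Rough sketch of the record shape implied by the fields above (not the generated type)
interface ChatMessageSketch {
  role: 'user' | 'assistant' | 'system'
  content: string
}

interface ChatConversationSketch {
  id: string                      // id and date representations are guesses; the generated schema is authoritative
  title?: string
  messages: ChatMessageSketch[]   // json column
  provider: 'openai' | 'anthropic'
  model: string
  systemPrompt?: string
  metadata?: Record<string, unknown>
  messageCount: number
  lastMessageAt?: string          // date column, serialized as an ISO string over the API
  teamId: string
  userId: string
  createdAt: string
  updatedAt: string
  createdBy: string
  updatedBy: string
}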

Using with Persistence

After generating the collection:

<script setup lang="ts">
const route = useRoute()
const conversationId = route.params.id as string

// Load existing conversation
const { data: conversation } = await useFetch(`/api/ai-chatConversations/${conversationId}`)

const { messages, importMessages, exportMessages } = useChat()

// Load messages on mount
onMounted(() => {
  if (conversation.value?.messages) {
    importMessages(conversation.value.messages)
  }
})

// Auto-save on changes (watchDebounced comes from VueUse; requires @vueuse/nuxt or a manual import)
watchDebounced(messages, async () => {
  await $fetch(`/api/ai-chatConversations/${conversationId}`, {
    method: 'PATCH',
    body: {
      messages: exportMessages(),
      messageCount: messages.value.length,
      lastMessageAt: new Date().toISOString()
    }
  })
}, { debounce: 1000 })
</script>

<template>
  <AIChatbox class="h-full" />
</template>

TypeScript Types

Import types from the schema:

import type {
  ChatConversation,
  NewChatConversation,
  ChatMessage
} from '@fyit/crouton-ai/schemas/chat-conversations'
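
For example, these can be used to type the fetch from the persistence example above (the endpoint path comes from the generated collection; the route handling is the same as shown earlier):

import type { ChatConversation } from '@fyit/crouton-ai/schemas/chat-conversations'

// Type the conversation loaded from the generated collection endpoint
const conversationId = useRoute().params.id as string
const { data: conversation } = await useFetch<ChatConversation>(
  `/api/ai-chatConversations/${conversationId}`
)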

Examples

Basic Chat Page

<!-- pages/chat.vue -->
<template>
  <div class="container mx-auto p-4">
    <h1 class="text-2xl font-bold mb-4">AI Assistant</h1>
    <AIChatbox class="h-[600px]" />
  </div>
</template>

Chat with Model Selection

<script setup lang="ts">
const selectedModel = ref('gpt-4o')
const { providers } = useAIProvider()

const { messages, input, handleSubmit, isLoading } = useChat({
  api: '/api/ai/chat',
  model: selectedModel.value
})
</script>

<template>
  <div class="space-y-4">
    <USelect
      v-model="selectedModel"
      :options="providers.flatMap(p => p.models)"
    />
    <AIChatbox class="h-96" />
  </div>
</template>

Custom Message Rendering

<script setup lang="ts">
const { messages, input, handleSubmit, isLoading } = useChat()
</script>

<template>
  <div class="flex flex-col h-96">
    <div class="flex-1 overflow-y-auto p-4 space-y-4">
      <div
        v-for="message in messages"
        :key="message.id"
        :class="message.role === 'user' ? 'text-right' : 'text-left'"
      >
        <div
          class="inline-block px-4 py-2 rounded-lg"
          :class="message.role === 'user'
            ? 'bg-primary-500 text-white'
            : 'bg-gray-100 dark:bg-gray-800'"
        >
          {{ message.content }}
        </div>
      </div>
    </div>

    <AIInput
      v-model="input"
      :loading="isLoading"
      @submit="handleSubmit"
    />
  </div>
</template>

License

MIT