@llms-sdk/chat v2.2.0

Chat application using LLMS SDK

llms-sdk-chat

Interactive chat application demonstrating llms-sdk and llms-sdk-terminal integration with auto-generated CLI options.

Features

  • Interactive Terminal UI: Built on llms-sdk-terminal's differential rendering for smooth performance
  • Multi-provider Support: Anthropic Claude, OpenAI, and Google Gemini through unified llms-sdk interface
  • Auto-generated CLI: Command-line options automatically generated from TypeScript configuration interfaces
  • Persistent Defaults: Save your preferred settings and use them automatically
  • Image Support: Analyze images with multimodal models
  • Streaming & Thinking: Real-time streaming with visual thinking output for supported models
  • Context Management: Persistent conversation context across the session

Quick Start

# Set API keys
export ANTHROPIC_API_KEY="sk-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."

# Start chatting (no installation required)
npx @llms-sdk/chat

# Or set your defaults first
npx @llms-sdk/chat defaults anthropic -m claude-3-5-sonnet-20241022
npx @llms-sdk/chat

Commands

Interactive Chat Mode

Start an interactive chat session with persistent context:

# Using saved defaults
npx @llms-sdk/chat

# With specific provider and model
npx @llms-sdk/chat -p anthropic -m claude-3-5-sonnet-20241022
npx @llms-sdk/chat -p openai -m gpt-4o
npx @llms-sdk/chat -p google -m gemini-1.5-pro

Features:

  • Persistent conversation context
  • Real-time streaming output
  • Visual thinking display (gray text) for supported models
  • Token usage and cost tracking
  • Professional terminal interface with proper input handling

Defaults System

Save and manage your preferred settings:

# Set defaults (saves to ~/.llms-sdk-chat/defaults.json)
npx @llms-sdk/chat defaults anthropic -m claude-3-5-sonnet-20241022 --thinkingEnabled
npx @llms-sdk/chat defaults openai -m gpt-4o --reasoningEffort medium

# Show current defaults
npx @llms-sdk/chat defaults --show

# Clear saved defaults
npx @llms-sdk/chat defaults --clear

# Use defaults automatically
npx @llms-sdk/chat                         # Interactive mode with defaults
npx @llms-sdk/chat "Hello world"           # One-shot with defaults
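
Under the hood, the defaults subcommand presumably just reads and writes that JSON file. A minimal sketch of such persistence (the field names are assumptions, not the package's actual schema):

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Assumed shape of the saved defaults; the real file may hold more fields.
interface Defaults {
  provider?: string;
  model?: string;
  thinkingEnabled?: boolean;
}

const DEFAULTS_PATH = path.join(os.homedir(), ".llms-sdk-chat", "defaults.json");

function saveDefaults(d: Defaults, file: string = DEFAULTS_PATH): void {
  // Create ~/.llms-sdk-chat on first use, then write pretty-printed JSON.
  fs.mkdirSync(path.dirname(file), { recursive: true });
  fs.writeFileSync(file, JSON.stringify(d, null, 2));
}

function loadDefaults(file: string = DEFAULTS_PATH): Defaults {
  // Missing or unreadable file simply means "no defaults saved yet".
  try {
    return JSON.parse(fs.readFileSync(file, "utf8"));
  } catch {
    return {};
  }
}
```

Clearing defaults would then amount to deleting the file, and every invocation without explicit flags would merge the loaded values into its options.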

Model Discovery

Explore available models with filtering:

# List all models
npx @llms-sdk/chat models

# Filter by provider
npx @llms-sdk/chat models --provider anthropic
npx @llms-sdk/chat models --provider openai

# Filter by capabilities
npx @llms-sdk/chat models --tools               # Function calling support
npx @llms-sdk/chat models --images              # Image input support
npx @llms-sdk/chat models --cheap               # Cost-effective options

# JSON output for scripting
npx @llms-sdk/chat models --json

One-Shot Messages

Send single messages without persistent context:

# With specific provider/model
npx @llms-sdk/chat anthropic -m claude-3-5-sonnet-20241022 "Explain quantum computing"
npx @llms-sdk/chat openai -m gpt-4o "Write a poem about TypeScript"

# Using saved defaults
npx @llms-sdk/chat "What's the capital of France?"

Image Analysis

Analyze images with multimodal models:

# Single image
npx @llms-sdk/chat -i screenshot.png "What's in this image?"
npx @llms-sdk/chat anthropic -m claude-3-5-sonnet-20241022 -i diagram.jpg "Explain this flowchart"

# Multiple images
npx @llms-sdk/chat --images before.jpg after.jpg "What changed between these images?"

Supported formats: .jpg, .jpeg, .png, .gif, .webp, .bmp, .tiff, .tif
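
A check against that list could look like the following sketch (the function name is illustrative; the package's actual validation code is not shown here):

```typescript
// Extension whitelist taken directly from the supported-formats list above.
const SUPPORTED = new Set([
  ".jpg", ".jpeg", ".png", ".gif", ".webp", ".bmp", ".tiff", ".tif",
]);

function isSupportedImage(file: string): boolean {
  // Case-insensitive match on the final extension; files without an
  // extension are rejected.
  const dot = file.lastIndexOf(".");
  return dot !== -1 && SUPPORTED.has(file.slice(dot).toLowerCase());
}
```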

Advanced Features

Thinking/Reasoning Output

Models that support internal reasoning display their thought process:

# Anthropic thinking mode
npx @llms-sdk/chat anthropic -m claude-3-5-sonnet-20241022 --thinkingEnabled "Solve this complex problem"

# OpenAI reasoning models (automatic)
npx @llms-sdk/chat openai -m o1-mini --reasoningEffort high "Analyze this data"

Thinking appears in gray text, separated from the final response.

Provider-Specific Options

Each provider exposes its unique capabilities:

# Anthropic options
npx @llms-sdk/chat anthropic --help    # Shows thinkingEnabled, betaFeatures, etc.

# OpenAI options
npx @llms-sdk/chat openai --help       # Shows reasoningEffort, organization, etc.

# Google options
npx @llms-sdk/chat google --help       # Shows projectId, includeThoughts, etc.

Architecture

This application demonstrates several key concepts:

Auto-Generated CLI

  • TypeScript interfaces automatically become command-line options
  • Type validation, help text, and enum choices generated from code
  • Perfect sync between TypeScript types and CLI arguments
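
Since TypeScript types are erased at compile time, a mapping like this is typically driven either by a codegen step over the interfaces or by a runtime schema mirroring them. A sketch of the runtime-schema variant (all names here are illustrative, not the package's actual internals):

```typescript
// Runtime description of one provider's options, mirroring a TS interface.
type OptionSpec = {
  type: "string" | "boolean" | "number";
  help: string;
  choices?: string[];
};

// Hypothetical schema standing in for Anthropic's generated options.
const anthropicOptions: Record<string, OptionSpec> = {
  model: { type: "string", help: "Model ID" },
  thinkingEnabled: { type: "boolean", help: "Show thinking output" },
};

function toCliFlags(schema: Record<string, OptionSpec>): string[] {
  // Booleans become bare flags; everything else takes a typed value.
  return Object.entries(schema).map(([name, spec]) =>
    spec.type === "boolean" ? `--${name}` : `--${name} <${spec.type}>`
  );
}

// toCliFlags(anthropicOptions) yields ["--model <string>", "--thinkingEnabled"]
```

Help text and enum choices would be emitted from the same schema, which is what keeps the CLI and the types from drifting apart.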

TUI Integration

  • Uses llms-sdk-terminal's differential rendering for smooth performance
  • Container-based component architecture
  • Proper focus management and keyboard handling
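
The idea behind differential rendering is to repaint only the lines that changed between frames instead of redrawing the whole screen. A minimal line-level sketch (llms-sdk-terminal's actual implementation is not shown here and likely differs):

```typescript
// Compare the previous frame to the next one, line by line, and report
// only the rows that need repainting.
function diffLines(
  prev: string[],
  next: string[]
): Array<{ row: number; text: string }> {
  const updates: Array<{ row: number; text: string }> = [];
  const max = Math.max(prev.length, next.length);
  for (let row = 0; row < max; row++) {
    if (prev[row] !== next[row]) {
      // A row removed in the next frame is repainted as an empty line.
      updates.push({ row, text: next[row] ?? "" });
    }
  }
  return updates;
}
```

A renderer would then move the cursor to each changed row with escape sequences and rewrite just that line, which is what keeps streaming output smooth.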

Provider Abstraction

  • Single unified interface across all LLM providers
  • Context serialization and conversation management
  • Streaming support with provider-specific optimizations
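
A unified provider interface reduces every backend to one common shape that the chat loop can call without caring which vendor is behind it. A simplified, synchronous sketch (the real llms-sdk API is asynchronous and streaming; all names below are assumptions):

```typescript
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

// Simplified: real providers are async and stream tokens over HTTP.
interface Provider {
  name: string;
  complete(messages: ChatMessage[], model: string): string;
}

// Stub provider that echoes the last message, standing in for a real backend.
const echoProvider: Provider = {
  name: "echo",
  complete(messages) {
    return messages[messages.length - 1].content;
  },
};

const registry = new Map<string, Provider>([[echoProvider.name, echoProvider]]);

function chatOnce(providerName: string, model: string, text: string): string {
  const provider = registry.get(providerName);
  if (!provider) throw new Error(`Unknown provider: ${providerName}`);
  return provider.complete([{ role: "user", content: text }], model);
}
```

Registering Anthropic, OpenAI, and Google implementations of the same interface is what lets `-p` switch backends without touching the rest of the application.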

Development

# Development mode
npm run dev

# Build
npm run build

# Type checking
npm run typecheck

# Test with simulation
npx tsx --no-deprecation src/index.ts chat --simulate-input "Hello world" "ENTER"

Testing commands, e.g. simulating a model switch:

npx tsx --no-deprecation src/index.ts chat --simulate-input "/" "model" "SPACE" "gpt-4o" "ENTER"

Special input keywords: "TAB", "ENTER", "SPACE", "ESC"
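
One plausible way those keywords map to input is as terminal byte sequences. This sketch is an assumption about the mechanism, not the package's documented behavior:

```typescript
// Assumed byte sequences for the special keywords; the real CLI may use a
// different encoding for --simulate-input.
const KEYS: Record<string, string> = {
  TAB: "\t",
  ENTER: "\r",
  SPACE: " ",
  ESC: "\x1b",
};

function encodeSimulatedInput(tokens: string[]): string {
  // Keywords become their control bytes; everything else is typed literally.
  return tokens.map((t) => KEYS[t] ?? t).join("");
}
```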

Examples

Quick Setup Workflow

# 1. Set your preferred provider and model
npx @llms-sdk/chat defaults anthropic -m claude-3-5-sonnet-20241022

# 2. Start using without repetitive options
npx @llms-sdk/chat "Hello!"
npx @llms-sdk/chat
npx @llms-sdk/chat -i image.png "Describe this"

Model Discovery Workflow

# Find cost-effective models with image support
npx @llms-sdk/chat models --images --cheap

# Get detailed Anthropic model information
npx @llms-sdk/chat models --provider anthropic

# Export model data for scripts
npx @llms-sdk/chat models --json > models.json

Advanced Chat Session

# Start with thinking enabled and custom limits
npx @llms-sdk/chat -p anthropic -m claude-3-5-sonnet-20241022 --thinkingEnabled --maxOutputTokens 2000

# The session will show:
# - Gray thinking text when the model reasons internally
# - Token usage per message
# - Running conversation cost
# - Smooth terminal interface with proper line handling

This application serves as both a practical chat tool and a demonstration of building sophisticated CLI applications with TypeScript, unified LLM interfaces, and modern terminal UIs.