# llms-sdk-chat

`@llms-sdk/chat` v2.2.0
Interactive chat application demonstrating llms-sdk and llms-sdk-terminal integration with auto-generated CLI options.
## Features
- Interactive Terminal UI: Built with llms-sdk-terminal differential rendering for smooth performance
- Multi-provider Support: Anthropic Claude, OpenAI, and Google Gemini through unified llms-sdk interface
- Auto-generated CLI: Command-line options automatically generated from TypeScript configuration interfaces
- Persistent Defaults: Save your preferred settings and use them automatically
- Image Support: Analyze images with multimodal models
- Streaming & Thinking: Real-time streaming with visual thinking output for supported models
- Context Management: Persistent conversation context across the session
## Quick Start
```bash
# Set API keys
export ANTHROPIC_API_KEY="sk-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."

# Start chatting (no installation required)
npx @llms-sdk/chat

# Or set your defaults first
npx @llms-sdk/chat defaults anthropic -m claude-3-5-sonnet-20241022
npx @llms-sdk/chat
```

## Commands
### Interactive Chat Mode
Start an interactive chat session with persistent context:
```bash
# Using saved defaults
npx @llms-sdk/chat

# With specific provider and model
npx @llms-sdk/chat -p anthropic -m claude-3-5-sonnet-20241022
npx @llms-sdk/chat -p openai -m gpt-4o
npx @llms-sdk/chat -p google -m gemini-1.5-pro
```

Features:
- Persistent conversation context
- Real-time streaming output
- Visual thinking display (gray text) for supported models
- Token usage and cost tracking
- Professional terminal interface with proper input handling
### Defaults System
Save and manage your preferred settings:
```bash
# Set defaults (saves to ~/.llms-sdk-chat/defaults.json)
npx @llms-sdk/chat defaults anthropic -m claude-3-5-sonnet-20241022 --thinkingEnabled
npx @llms-sdk/chat defaults openai -m gpt-4o --reasoningEffort medium

# Show current defaults
npx @llms-sdk/chat defaults --show

# Clear saved defaults
npx @llms-sdk/chat defaults --clear

# Use defaults automatically
npx @llms-sdk/chat                 # Interactive mode with defaults
npx @llms-sdk/chat "Hello world"   # One-shot with defaults
```

### Model Discovery
Explore available models with filtering:
```bash
# List all models
npx @llms-sdk/chat models

# Filter by provider
npx @llms-sdk/chat models --provider anthropic
npx @llms-sdk/chat models --provider openai

# Filter by capabilities
npx @llms-sdk/chat models --tools    # Function calling support
npx @llms-sdk/chat models --images   # Image input support
npx @llms-sdk/chat models --cheap    # Cost-effective options

# JSON output for scripting
npx @llms-sdk/chat models --json
```

### One-Shot Messages
Send single messages without persistent context:
```bash
# With specific provider/model
npx @llms-sdk/chat anthropic -m claude-3-5-sonnet-20241022 "Explain quantum computing"
npx @llms-sdk/chat openai -m gpt-4o "Write a poem about TypeScript"

# Using saved defaults
npx @llms-sdk/chat "What's the capital of France?"
```

### Image Analysis
Analyze images with multimodal models:
```bash
# Single image
npx @llms-sdk/chat -i screenshot.png "What's in this image?"
npx @llms-sdk/chat anthropic -m claude-3-5-sonnet-20241022 -i diagram.jpg "Explain this flowchart"

# Multiple images
npx @llms-sdk/chat --images before.jpg after.jpg "What changed between these images?"
```

Supported formats: `.jpg`, `.jpeg`, `.png`, `.gif`, `.webp`, `.bmp`, `.tiff`, `.tif`
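A client-side check against that extension list can be sketched as follows; the function name and exact validation logic are illustrative, not the package's actual code:

```typescript
// Sketch: validate image paths against the supported extensions
// listed above (isSupportedImage is a hypothetical helper name).
const SUPPORTED_IMAGE_EXTENSIONS = new Set([
  ".jpg", ".jpeg", ".png", ".gif", ".webp", ".bmp", ".tiff", ".tif",
]);

function isSupportedImage(path: string): boolean {
  const dot = path.lastIndexOf(".");
  if (dot === -1) return false;
  // Compare case-insensitively so "photo.PNG" is accepted too.
  return SUPPORTED_IMAGE_EXTENSIONS.has(path.slice(dot).toLowerCase());
}

console.log(isSupportedImage("screenshot.PNG")); // true
console.log(isSupportedImage("notes.txt"));      // false
```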
## Advanced Features
### Thinking/Reasoning Output
Models that support internal reasoning display their thought process:
```bash
# Anthropic thinking mode
npx @llms-sdk/chat anthropic -m claude-3-5-sonnet-20241022 --thinkingEnabled "Solve this complex problem"

# OpenAI reasoning models (automatic)
npx @llms-sdk/chat openai -m o1-mini --reasoningEffort high "Analyze this data"
```

Thinking appears in gray text, separated from the final response.
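Gray output like this is typically produced with standard ANSI escape codes; a minimal sketch of the idea (not llms-sdk-terminal's actual implementation):

```typescript
// Sketch: style thinking output in gray, distinct from the final response.
// ANSI SGR code 90 renders "bright black" (gray) on most terminals; 0 resets.
const GRAY = "\x1b[90m";
const RESET = "\x1b[0m";

function formatThinking(text: string): string {
  return `${GRAY}${text}${RESET}`;
}

process.stdout.write(formatThinking("Considering edge cases...") + "\n");
process.stdout.write("Final answer: 42\n");
```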
### Provider-Specific Options
Each provider exposes its unique capabilities:
```bash
# Anthropic options
npx @llms-sdk/chat anthropic --help   # Shows thinkingEnabled, betaFeatures, etc.

# OpenAI options
npx @llms-sdk/chat openai --help      # Shows reasoningEffort, organization, etc.

# Google options
npx @llms-sdk/chat google --help      # Shows projectId, includeThoughts, etc.
```

## Architecture
This application demonstrates several key concepts:
### Auto-Generated CLI
- TypeScript interfaces automatically become command-line options
- Type validation, help text, and enum choices generated from code
- Perfect sync between TypeScript types and CLI arguments
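Since TypeScript types are erased at runtime, generation in this style usually works from a typed descriptor (or generated metadata) that both defines the config shape and drives the CLI. A simplified sketch of the pattern, with illustrative names rather than the package's actual API:

```typescript
// Sketch: one typed descriptor drives both type checking and CLI generation.
type OptionSpec = {
  type: "string" | "number" | "boolean";
  description: string;
  choices?: string[]; // enum choices surface in --help and validation
};

// Hypothetical option descriptor for one provider.
const anthropicOptions: Record<string, OptionSpec> = {
  model: { type: "string", description: "Model ID" },
  thinkingEnabled: { type: "boolean", description: "Show thinking output" },
  reasoningEffort: {
    type: "string",
    description: "Reasoning effort",
    choices: ["low", "medium", "high"],
  },
};

// Generate --help text (and, in a real CLI, argument parsing) from the descriptor,
// so flags can never drift out of sync with the config definition.
function helpText(opts: Record<string, OptionSpec>): string {
  return Object.entries(opts)
    .map(([name, o]) => {
      const choices = o.choices ? ` [${o.choices.join("|")}]` : "";
      return `--${name} <${o.type}>${choices}  ${o.description}`;
    })
    .join("\n");
}

console.log(helpText(anthropicOptions));
```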
### TUI Integration
- Uses llms-sdk-terminal's differential rendering for smooth performance
- Container-based component architecture
- Proper focus management and keyboard handling
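Differential rendering in this style typically keeps the previously drawn frame and repaints only the rows that changed; a minimal sketch of the technique (not llms-sdk-terminal's actual implementation):

```typescript
// Sketch: diff two terminal frames and report which rows need repainting.
function diffLines(prev: string[], next: string[]): number[] {
  const changed: number[] = [];
  const rows = Math.max(prev.length, next.length);
  for (let row = 0; row < rows; row++) {
    if (prev[row] !== next[row]) changed.push(row);
  }
  return changed;
}

const frame1 = ["> hello", "assistant: Hi!", ""];
const frame2 = ["> hello", "assistant: Hi there!", ""];
// Only row 1 differs, so a renderer would move the cursor there
// and rewrite that single line instead of clearing the screen.
console.log(diffLines(frame1, frame2));
```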
### Provider Abstraction
- Single unified interface across all LLM providers
- Context serialization and conversation management
- Streaming support with provider-specific optimizations
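The unified-interface idea can be sketched as a common streaming contract that each provider adapter implements; the interface and names below are illustrative, not the actual llms-sdk API:

```typescript
// Sketch: one streaming contract, many provider adapters behind it.
interface ChatProvider {
  name: string;
  stream(prompt: string): AsyncIterable<string>;
}

// A stub adapter standing in for a real Anthropic/OpenAI/Google client.
function stubProvider(name: string, reply: string): ChatProvider {
  return {
    name,
    async *stream(_prompt: string) {
      // A real adapter would yield chunks from the provider's API here.
      for (const word of reply.split(" ")) yield word + " ";
    },
  };
}

// Calling code is identical regardless of which provider is behind it.
async function runOnce(provider: ChatProvider, prompt: string): Promise<string> {
  let out = "";
  for await (const chunk of provider.stream(prompt)) out += chunk;
  return out.trim();
}

runOnce(stubProvider("anthropic", "Hello from a stub"), "hi").then(console.log);
```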
## Development
```bash
# Development mode
npm run dev

# Build
npm run build

# Type checking
npm run typecheck

# Test with simulation
npx tsx --no-deprecation src/index.ts chat --simulate-input "Hello world" "ENTER"
```

Testing commands:

- Test model switching: `npx tsx --no-deprecation src/index.ts chat --simulate-input "/" "model" "SPACE" "gpt-4o" "ENTER"`
- Special input keywords: `"TAB"`, `"ENTER"`, `"SPACE"`, `"ESC"`
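Presumably the keywords are translated into raw key sequences before being fed to the input handler; a guessed sketch of that mapping (the real implementation lives inside the package and may differ):

```typescript
// Sketch (assumed mapping): translate simulate-input tokens to key sequences.
const KEYWORD_SEQUENCES: Record<string, string> = {
  TAB: "\t",
  ENTER: "\r",
  SPACE: " ",
  ESC: "\x1b",
};

function toKeySequence(token: string): string {
  // Unrecognized tokens are typed literally, character by character.
  return KEYWORD_SEQUENCES[token] ?? token;
}

console.log(JSON.stringify(["Hello world", "ENTER"].map(toKeySequence)));
```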
## Examples
### Quick Setup Workflow
```bash
# 1. Set your preferred provider and model
npx @llms-sdk/chat defaults anthropic -m claude-3-5-sonnet-20241022

# 2. Start using without repetitive options
npx @llms-sdk/chat "Hello!"
npx @llms-sdk/chat
npx @llms-sdk/chat -i image.png "Describe this"
```

### Model Discovery Workflow
```bash
# Find cost-effective models with image support
npx @llms-sdk/chat models --images --cheap

# Get detailed Anthropic model information
npx @llms-sdk/chat models --provider anthropic

# Export model data for scripts
npx @llms-sdk/chat models --json > models.json
```

### Advanced Chat Session
```bash
# Start with thinking enabled and custom limits
npx @llms-sdk/chat -p anthropic -m claude-3-5-sonnet-20241022 --thinkingEnabled --maxOutputTokens 2000

# The session will show:
# - Gray thinking text when the model reasons internally
# - Token usage per message
# - Running conversation cost
# - Smooth terminal interface with proper line handling
```

This application serves as both a practical chat tool and a demonstration of building sophisticated CLI applications with TypeScript, unified LLM interfaces, and modern terminal UIs.
