# @techsurf/chat-agent-sdk

v1.0.1

TechSurf 2025 Chat Agent Platform SDK - Build intelligent content-aware chatbots with Contentstack integration in minutes.
## 🚀 Quick Start

### Installation

```bash
npm install @techsurf/chat-agent-sdk
```

### Create a new chat agent project

```bash
npx create-chat-agent my-travel-bot --template travel
cd my-travel-bot
npm install
npm run dev
```

## 📋 Available Templates

- `travel` - Travel & Tourism chatbot (Italy tours, destinations)
- `ecommerce` - E-commerce chatbot (Products, recommendations)
- `docs` - Documentation chatbot (Help, guides)
- `custom` - Custom blank starter template
## 🎯 Features
- 🔌 Plug & Play: Get a working chatbot in under 10 minutes
- 🧠 Content-Aware: Intelligent integration with Contentstack CMS
- ⚡ Multi-LLM Support: Groq, OpenAI, Anthropic, Google models
- 📱 React-First: Modern hooks-based API with TypeScript
- 🌊 Real-time Streaming: Live response streaming with backpressure management
- 🔍 Semantic Search: Natural language to content mapping
- 🛡️ Error Resilient: Graceful degradation and fallback handling
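The streaming model can be pictured with a small sketch: chunks arrive incrementally and are appended to the message being rendered. This is illustrative only (`collectStream` and `fakeTokenStream` are hypothetical helpers, not SDK exports); `useChat` handles this internally when streaming is enabled.

```typescript
// Illustrative sketch of incremental streaming consumption.
// collectStream and fakeTokenStream are hypothetical, not SDK exports.
async function collectStream(
  chunks: AsyncIterable<string>,
  onChunk: (text: string) => void,
): Promise<string> {
  let full = '';
  for await (const chunk of chunks) {
    full += chunk;   // accumulate the complete response
    onChunk(chunk);  // e.g. append to the message bubble being rendered
  }
  return full;
}

// Stand-in for a token stream from an LLM provider
async function* fakeTokenStream(): AsyncGenerator<string> {
  yield 'Ciao! ';
  yield 'Here are ';
  yield 'three tours.';
}
```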
## 💻 Usage Examples

### Basic Chat Hook

```tsx
import { useChat } from '@techsurf/chat-agent-sdk';

function BasicChatbot() {
  const { messages, sendMessage, isLoading } = useChat({
    modelProvider: 'groq',
    apiEndpoint: '/api/chat'
  });

  return (
    <div>
      {messages.map(message => (
        <div key={message.id}>{message.content}</div>
      ))}
      <button onClick={() => sendMessage('Hello!')} disabled={isLoading}>
        Send Message
      </button>
    </div>
  );
}
```

### Content-Aware Travel Chatbot
```tsx
import { TravelChatbot } from '@techsurf/chat-agent-sdk';

function TravelApp() {
  return (
    <TravelChatbot
      contentstack={{
        apiKey: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY!,
        deliveryToken: process.env.NEXT_PUBLIC_CONTENTSTACK_DELIVERY_TOKEN!,
        environment: process.env.NEXT_PUBLIC_CONTENTSTACK_ENVIRONMENT!
      }}
      suggestedQueries={[
        "What tours are available for Italy?",
        "Show me romantic getaways in Tuscany"
      ]}
    />
  );
}
```

### Advanced Content-Aware Chat
```tsx
import { useContentChat } from '@techsurf/chat-agent-sdk';

function CustomChatbot() {
  const { messages, sendMessage, queryContent } = useContentChat({
    contentTypes: ['tours', 'destinations', 'packages'],
    fallbackBehavior: 'graceful',
    modelProvider: 'groq',
    contentstack: {
      apiKey: 'your-api-key',
      deliveryToken: 'your-delivery-token',
      environment: 'development'
    }
  });

  // Direct content queries
  const handleSpecialQuery = async () => {
    const tours = await queryContent('luxury tours in Rome', ['tours']);
    console.log('Found tours:', tours);
  };

  return (
    <div>
      {/* Your custom chat interface */}
    </div>
  );
}
```

## 🔧 Configuration
Environment Variables
# Contentstack Configuration
NEXT_PUBLIC_CONTENTSTACK_API_KEY=your_api_key
NEXT_PUBLIC_CONTENTSTACK_DELIVERY_TOKEN=your_delivery_token
NEXT_PUBLIC_CONTENTSTACK_ENVIRONMENT=development
# LLM Provider (choose one)
GROQ_API_KEY=gsk_your_groq_key
OPENAI_API_KEY=sk-your_openai_key
ANTHROPIC_API_KEY=sk-ant-your_anthropic_key
GOOGLE_API_KEY=your_google_key
# MCP Server (optional)
MCP_SERVER_URL=http://localhost:3001API Route Setup
Create /app/api/chat/route.ts:
```ts
import { NextRequest, NextResponse } from 'next/server';

export async function POST(request: NextRequest) {
  const { messages, options, contentstack, query } = await request.json();

  // Your chat logic with Contentstack integration
  // See templates for complete examples

  return NextResponse.json({
    content: "Your AI response here",
    sources: [], // Contentstack sources
    confidence: 0.9,
    processingTime: 150
  });
}
```

## 🎨 Components
### ChatInterface

Pre-built responsive chat interface with streaming support:

```tsx
import { ChatInterface } from '@techsurf/chat-agent-sdk';

<ChatInterface
  messages={messages}
  onSendMessage={sendMessage}
  isLoading={isLoading}
  error={error}
  placeholder="Ask me anything..."
  suggestedQueries={["Query 1", "Query 2"]}
/>
```

### TravelChatbot
Specialized travel chatbot with Italian tourism focus:

```tsx
import { TravelChatbot } from '@techsurf/chat-agent-sdk';

<TravelChatbot
  contentstack={config}
  title="My Travel Assistant"
  className="max-w-4xl"
/>
```

## 🔌 CLI Commands
```bash
# Create new project
npx create-chat-agent [name] --template [travel|ecommerce|docs|custom]

# List available templates
npx create-chat-agent templates

# Interactive setup
npx create-chat-agent
```

## 🌟 Key Differentiators
### Content Intelligence
- Semantic Understanding: Maps "What tours are available for Italy?" to actual Contentstack tour content
- Real-time Sync: Live content updates with webhook integration
- Multi-locale Support: Intelligent content fallbacks for international apps
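As a toy illustration of query-to-content routing (the keyword table and `routeQuery` below are invented for this sketch; the SDK's semantic search is considerably more capable than keyword matching):

```typescript
// Purely illustrative: a toy keyword scorer suggesting how a natural-language
// query might be routed to Contentstack content types.
const KEYWORDS: Record<string, string[]> = {
  tours: ['tour', 'tours', 'guide', 'excursion'],
  destinations: ['italy', 'rome', 'tuscany', 'destination'],
  packages: ['package', 'deal', 'getaway'],
};

function routeQuery(query: string): string[] {
  const words = query.toLowerCase().split(/\W+/);
  return Object.entries(KEYWORDS)
    .filter(([, kws]) => kws.some((k) => words.includes(k)))
    .map(([type]) => type);
}

console.log(routeQuery('What tours are available for Italy?'));
// → ['tours', 'destinations']
```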
### Developer Experience
- Zero-Config: Working chatbot in 30 seconds with sensible defaults
- TypeScript-First: Full type safety with generic content type support
- React 19 Ready: Concurrent features and Suspense integration
### Enterprise Features
- Multi-Provider: Switch between LLM providers based on query type
- Rate Limiting: Built-in provider rate limiting with graceful degradation
- Error Boundaries: Comprehensive error handling with recovery mechanisms
- Analytics Ready: Built-in performance monitoring and usage tracking
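The multi-provider behavior described above can be sketched as a simple fallback chain (a hypothetical helper, not an SDK export; the real provider abstraction and rate limiting are internal to the SDK):

```typescript
type LLMProvider = 'groq' | 'openai' | 'anthropic' | 'google';

// Sketch: try providers in order, falling through to the next on failure
// (e.g. a 429 rate-limit response). callProvider is a stand-in for the
// SDK's internal provider clients.
async function completeWithFallback(
  prompt: string,
  providers: LLMProvider[],
  callProvider: (p: LLMProvider, prompt: string) => Promise<string>,
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      return await callProvider(provider, prompt);
    } catch (err) {
      lastError = err; // degrade gracefully and try the next provider
    }
  }
  throw lastError ?? new Error('no providers configured');
}
```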
## 🏗️ Architecture

```
Chat Agent Platform
├── 🎯 LLM Model API
│   ├── Multi-provider abstraction (Groq, OpenAI, Anthropic, Google)
│   ├── Streaming response handling
│   └── Rate limiting & fallbacks
├── 📝 Contentstack Integration
│   ├── MCP (Model Context Protocol) client
│   ├── Semantic content mapping
│   └── Real-time content sync
└── ⚛️ React SDK
    ├── useChat hook (basic LLM chat)
    ├── useContentChat hook (content-aware)
    └── Pre-built components
```

## 🛠️ API Reference
### Hooks

#### `useChat(options: ChatOptions)`

Basic chat functionality with any LLM provider.

**Options:**

- `apiEndpoint?: string` - Chat API endpoint (default: `'/api/chat'`)
- `modelProvider?: LLMProvider` - LLM provider (default: `'groq'`)
- `streaming?: boolean` - Enable response streaming (default: `true`)
- `maxTokens?: number` - Maximum response tokens (default: `2000`)
- `temperature?: number` - Response creativity (default: `0.7`)
- `systemPrompt?: string` - System instructions

**Returns:**

- `messages: ChatMessage[]` - Chat history
- `sendMessage: (content: string) => Promise<void>` - Send message
- `isLoading: boolean` - Request in progress
- `error: Error | null` - Last error
- `clearMessages: () => void` - Clear chat history
- `cancelRequest: () => void` - Cancel ongoing request
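`cancelRequest` presumably aborts the in-flight request; a minimal standalone sketch of that pattern using `AbortController` (the helper names here are hypothetical, not SDK exports):

```typescript
// Sketch: a cancellable request helper, assuming the SDK wires
// cancelRequest to an AbortController internally.
function cancellableRequest<T>(run: (signal: AbortSignal) => Promise<T>) {
  const controller = new AbortController();
  return {
    promise: run(controller.signal),
    cancel: () => controller.abort(),
  };
}

// A fake "LLM call" that resolves after a delay unless aborted
function fakeLLMCall(signal: AbortSignal): Promise<string> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => resolve('full response'), 1000);
    signal.addEventListener('abort', () => {
      clearTimeout(timer);
      reject(new Error('cancelled'));
    });
  });
}
```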
#### `useContentChat(options: ContentChatOptions)`

Content-aware chat with Contentstack integration.

**Additional Options:**

- `contentTypes?: string[]` - Content types to search (default: `['tours', 'destinations', 'packages']`)
- `searchScope?: 'all' | 'published' | 'draft'` - Content scope (default: `'published'`)
- `fallbackBehavior?: 'graceful' | 'strict'` - Error handling (default: `'graceful'`)
- `contentstack: ContentstackConfig` - Contentstack configuration (required)

**Additional Returns:**

- `queryContent: (query: string, contentTypes?: string[]) => Promise<ContentEntry[]>` - Direct content queries
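A plausible reading of the two `fallbackBehavior` modes, as an illustrative sketch (the exact semantics live in the SDK; `queryWithFallback` is not an SDK export):

```typescript
type FallbackBehavior = 'graceful' | 'strict';

// Sketch: 'graceful' returns an empty result set when the content lookup
// fails, so the chat can still answer from the LLM alone; 'strict'
// surfaces the error to the caller.
async function queryWithFallback<T>(
  lookup: () => Promise<T[]>,
  behavior: FallbackBehavior,
): Promise<T[]> {
  try {
    return await lookup();
  } catch (err) {
    if (behavior === 'graceful') return [];
    throw err;
  }
}
```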
### Types

```ts
interface ChatMessage {
  id: string;
  content: string;
  role: 'user' | 'assistant' | 'system';
  timestamp: Date;
  metadata?: {
    sources?: ContentSource[];
    confidence?: number;
    processingTime?: number;
  };
}

interface ContentstackConfig {
  apiKey: string;
  deliveryToken: string;
  environment: string;
  region?: 'us' | 'eu';
  branch?: string;
}

type LLMProvider = 'groq' | 'openai' | 'anthropic' | 'google';
```

## 🤝 Contributing
Built for TechSurf 2025 - Contentstack's hackathon for developer innovation.

### Development

```bash
git clone https://github.com/techsurf25/chat-agent-sdk
cd chat-agent-sdk/packages/chat-agent-sdk
npm install
npm run dev
```

### Testing

```bash
npm test
```

## 📄 License

MIT License - Built for TechSurf 2025
## 🏆 TechSurf 2025
This SDK is part of our submission for TechSurf 2025, demonstrating:
- Platform Thinking: Complete developer ecosystem, not just a point solution
- Content Intelligence: Beyond keyword matching to semantic understanding
- Enterprise Ready: Multi-tenant architecture with monitoring and analytics
- Developer First: Superior DX as primary competitive advantage
**The "Aha!" Moment**: When developers ask "What tours are available for Italy?" and get real Contentstack content with intelligent recommendations - that's the magic of content-aware AI.

*Built by the TechSurf 2025 Team • Competition Details • Contentstack Platform*
