# AI Chatbot SDK
A reusable AI chatbot SDK for React and Next.js applications. Easy to integrate, highly customizable, and supports streaming responses.
## Installation

```bash
npm install @your-org/ai-chatbot-sdk
# or
yarn add @your-org/ai-chatbot-sdk
# or
pnpm add @your-org/ai-chatbot-sdk
```

## Quick Start
### React Application
```jsx
import { Chat } from '@your-org/ai-chatbot-sdk';
import '@your-org/ai-chatbot-sdk/dist/styles.css'; // Optional: import the bundled styles, if provided

function App() {
  return (
    <div style={{ height: '100vh' }}>
      <Chat apiUrl="http://localhost:8000" />
    </div>
  );
}
```

### Next.js Application
```jsx
'use client';

import { Chat } from '@your-org/ai-chatbot-sdk';

export default function ChatPage() {
  return (
    <div style={{ height: '100vh' }}>
      <Chat apiUrl={process.env.NEXT_PUBLIC_API_URL} />
    </div>
  );
}
```

## API Reference
### `<Chat />` Component
Main chat component with all features built-in.
#### Props
| Prop | Type | Default | Description |
|------|------|---------|-------------|
| apiUrl | string | 'http://localhost:8000' | Backend API URL |
| model | string | null | AI model to use (optional) |
| temperature | number | 0.7 | Sampling temperature (0-2) |
| maxTokens | number | 1000 | Maximum tokens to generate |
| initialMessage | string | "Hello! I'm your AI assistant..." | Initial assistant message |
| placeholder | string | "Type your message..." | Input placeholder text |
| showTimestamps | boolean | true | Whether to show message timestamps |
| streaming | boolean | true | Whether to use streaming responses |
| onMessage | function | undefined | Callback when a message is sent |
| onResponse | function | undefined | Callback when AI responds |
| onError | function | undefined | Callback on error |
| className | string | '' | Additional CSS classes |
| style | object | {} | Inline styles |
| initialMessages | array | null | Initial messages array (see the sketch below) |
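
For instance, `initialMessages` can seed the widget with an existing conversation. A minimal sketch, assuming messages use the same `{ id, role, content }` shape that appears elsewhere in this README:

```jsx
import { Chat } from '@your-org/ai-chatbot-sdk';

// Hypothetical saved conversation; the exact message shape is an assumption
// based on the { id, role, content } fields used in the other examples here.
const savedConversation = [
  { id: '1', role: 'user', content: 'What can you do?' },
  { id: '2', role: 'assistant', content: 'I can answer questions and help with tasks.' },
];

function ResumedChat() {
  return (
    <div style={{ height: '100vh' }}>
      <Chat apiUrl="http://localhost:8000" initialMessages={savedConversation} />
    </div>
  );
}
```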
#### Example
```jsx
<Chat
  apiUrl="http://localhost:8000"
  model="llama-3.1-8b-instant"
  temperature={0.8}
  maxTokens={2000}
  streaming={true}
  onMessage={(message) => console.log('Sent:', message)}
  onResponse={(response) => console.log('Received:', response)}
  onError={(error) => console.error('Error:', error)}
/>
```

### useChat Hook
Custom hook for building your own chat UI.
#### Parameters
```jsx
const {
  messages,
  isLoading,
  error,
  sendMessage,
  clearMessages,
  messagesEndRef,
} = useChat({
  apiUrl: 'http://localhost:8000',
  model: null,
  temperature: 0.7,
  maxTokens: 1000,
  initialMessage: "Hello! I'm your AI assistant...",
  streaming: true,
  initialMessages: null,
});
```

#### Returns
| Property | Type | Description |
|----------|------|-------------|
| messages | array | Array of message objects |
| isLoading | boolean | Whether a request is in progress |
| error | string \| null | Error message if any |
| sendMessage | function | Function to send a message |
| clearMessages | function | Function to clear all messages |
| messagesEndRef | ref | Ref for auto-scrolling (see the sketch below) |
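
`messagesEndRef` is meant for auto-scrolling; a minimal sketch, assuming it is an ordinary React ref that you attach to a sentinel element at the bottom of your message list:

```jsx
import { useChat } from '@your-org/ai-chatbot-sdk';

function ScrollingMessages() {
  const { messages, messagesEndRef } = useChat({ apiUrl: 'http://localhost:8000' });

  return (
    <div style={{ overflowY: 'auto', height: '400px' }}>
      {messages.map((msg) => (
        <div key={msg.id}>{msg.content}</div>
      ))}
      {/* Sentinel element the hook can scroll into view as new messages arrive */}
      <div ref={messagesEndRef} />
    </div>
  );
}
```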
#### Example
```jsx
import { useChat } from '@your-org/ai-chatbot-sdk';

function CustomChat() {
  const { messages, isLoading, sendMessage } = useChat({
    apiUrl: 'http://localhost:8000',
  });

  return (
    <div>
      {messages.map((msg) => (
        <div key={msg.id}>{msg.content}</div>
      ))}
      <button onClick={() => sendMessage('Hello!')} disabled={isLoading}>
        Send
      </button>
    </div>
  );
}
```

### API Services
You can also use the API services directly:
```js
import { sendChatMessage, streamChatMessage, getModels } from '@your-org/ai-chatbot-sdk';

// Send a message (non-streaming)
const response = await sendChatMessage(
  [{ role: 'user', content: 'Hello!' }],     // messages
  'llama-3.1-8b-instant',                    // model
  'http://localhost:8000',                   // API URL
  { temperature: 0.7, max_tokens: 1000 }     // options
);

// Stream a message
await streamChatMessage(
  [{ role: 'user', content: 'Hello!' }],     // messages
  (chunk) => console.log('Chunk:', chunk),   // called with each streamed chunk
  () => console.log('Complete'),             // called when the stream finishes
  (error) => console.error('Error:', error), // called on error
  'llama-3.1-8b-instant',                    // model
  'http://localhost:8000'                    // API URL
);

// Get available models
const models = await getModels('http://localhost:8000');
```

## Styling
The SDK uses Tailwind CSS classes. Make sure you have Tailwind CSS configured in your project, or import the styles if provided.
### With Tailwind CSS
If your project uses Tailwind CSS, the styles will work automatically.
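
Depending on how the SDK ships its components, Tailwind may also need to scan the package so its classes are not purged. A minimal sketch of a `tailwind.config.js` content entry; the exact package path is an assumption:

```js
// tailwind.config.js
module.exports = {
  content: [
    './src/**/*.{js,jsx,ts,tsx}',
    // Assumed path; adjust to wherever the SDK's compiled files actually live.
    './node_modules/@your-org/ai-chatbot-sdk/dist/**/*.{js,mjs}',
  ],
  theme: { extend: {} },
  plugins: [],
};
```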
### Without Tailwind CSS
You can customize the components by passing className and style props, or build your own UI using the useChat hook.
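
A minimal sketch of a fully custom UI on top of `useChat`, wiring up `sendMessage`, `isLoading`, `error`, and `clearMessages` as described in the Returns table above (markup and class names are placeholders):

```jsx
import { useState } from 'react';
import { useChat } from '@your-org/ai-chatbot-sdk';

function MyChat() {
  const [draft, setDraft] = useState('');
  const { messages, isLoading, error, sendMessage, clearMessages } = useChat({
    apiUrl: 'http://localhost:8000',
  });

  const handleSubmit = (e) => {
    e.preventDefault();
    if (!draft.trim()) return;
    sendMessage(draft);
    setDraft('');
  };

  return (
    <div className="my-chat">
      {messages.map((msg) => (
        <p key={msg.id}>
          <strong>{msg.role}:</strong> {msg.content}
        </p>
      ))}
      {error && <p role="alert">{error}</p>}
      <form onSubmit={handleSubmit}>
        <input
          value={draft}
          onChange={(e) => setDraft(e.target.value)}
          placeholder="Type your message..."
          disabled={isLoading}
        />
        <button type="submit" disabled={isLoading}>Send</button>
        <button type="button" onClick={clearMessages}>Clear</button>
      </form>
    </div>
  );
}
```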
## Examples
### Full Screen Chat
```jsx
<div style={{ height: '100vh', width: '100vw' }}>
  <Chat apiUrl="http://localhost:8000" />
</div>
```

### Embedded Chat
```jsx
<div style={{ height: '600px', width: '400px' }}>
  <Chat apiUrl="http://localhost:8000" />
</div>
```

### Custom Styled Chat
```jsx
<Chat
  apiUrl="http://localhost:8000"
  className="my-custom-chat"
  style={{ borderRadius: '12px' }}
/>
```

## Requirements
- React 18+
- Backend API compatible with the expected endpoints (a quick connectivity check is sketched below):
  - `POST /api/chat/message` - Non-streaming messages
  - `POST /api/chat/stream` - Streaming messages
  - `GET /api/chat/models` - Available models
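
To confirm your backend is reachable before wiring up the UI, you can hit one of these endpoints directly; a quick sketch (the response shape is not documented here, so it is just logged):

```js
// Quick connectivity check against the models endpoint listed above.
const res = await fetch('http://localhost:8000/api/chat/models');
console.log(res.status, await res.json());
```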
## License
MIT
