# @testifyltd/ui

v1.1.6

A beautiful and modern UI component library for AI chat interfaces.
## Installation

```bash
npm install @testifyltd/ui
```

## Usage
### Chat Component

The Chat component provides a floating chat interface that can be easily integrated into any React application.
```tsx
import Chat from '@testifyltd/ui/chat';
import '@testifyltd/ui/styles.css';
import { useState } from 'react';

export default function Page() {
  const [isChatOpen, setIsChatOpen] = useState(false);

  const handleChatClose = (open: boolean) => {
    setIsChatOpen(open);
  };

  return (
    <main>
      <button onClick={() => setIsChatOpen(true)}>Open Chat</button>
      <Chat
        apiKey="your-openai-api-key"
        organizationId="your-org-id"
        projectId="your-project-id"
        assistantId="your-assistant-id"
        theme="light"
        position="bottom-right"
        isOpen={isChatOpen}
        onClose={handleChatClose}
      />
    </main>
  );
}
```

### Props
| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `apiKey` | `string` | `undefined` | Your OpenAI API key |
| `organizationId` | `string` | `undefined` | Your OpenAI organization ID |
| `projectId` | `string` | `undefined` | Your OpenAI project ID |
| `assistantId` | `string` | `undefined` | Your OpenAI assistant ID |
| `theme` | `'light' \| 'dark'` | `'light'` | The theme of the chat interface |
| `position` | `'bottom-right' \| 'bottom-left' \| 'top-right' \| 'top-left'` | `'bottom-right'` | Position of the chat widget |
| `isOpen` | `boolean` | `false` | Controls the visibility of the chat panel |
| `onClose` | `(open: boolean) => void` | `undefined` | Callback function called when the chat panel is closed |
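As an illustration, a dark-themed widget anchored to the opposite corner could use prop values drawn from the table above (the object below is only a sketch of the prop shape, not a complete component):

```typescript
// Illustrative prop values taken from the props table (values are examples only).
const chatProps = {
  theme: 'dark' as const,           // 'light' | 'dark'
  position: 'bottom-left' as const, // one of the four corner positions
  isOpen: true,                     // panel starts visible
  onClose: (open: boolean) => {
    console.log('chat open state:', open);
  },
};
```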
## API Implementation

To use the chat component, you need to set up an API route in your Next.js application. Here's how to implement it:
```ts
// app/api/ai-assistant/route.ts
import { AssistantResponse } from 'ai';
import { NextResponse } from 'next/server';
import OpenAI from 'openai';

export const runtime = 'edge';

export async function POST(req: Request) {
  try {
    // Get configuration from headers, falling back to environment variables
    const apiKey = req.headers.get('X-OpenAI-Key') || process.env.OPENAI_API_KEY || '';
    const organizationId = req.headers.get('X-OpenAI-Organization') || process.env.OPENAI_ORGANIZATION_ID || '';
    const projectId = req.headers.get('X-OpenAI-Project') || process.env.OPENAI_PROJECT_ID || '';
    const assistantId = req.headers.get('X-OpenAI-Assistant') || process.env.OPENAI_ASSISTANT_ID || '';

    const openai = new OpenAI({
      apiKey,
      organization: organizationId,
      project: projectId,
    });

    const { threadId, message } = await req.json();

    // Create a new thread, or reuse the existing one
    const thread = threadId ? threadId : (await openai.beta.threads.create({})).id;

    // Add the user message to the thread
    const createdMessage = await openai.beta.threads.messages.create(thread, {
      role: 'user',
      content: message,
    });

    // Stream the assistant's response back to the client
    return AssistantResponse(
      { threadId: thread, messageId: createdMessage.id },
      async ({ forwardStream }) => {
        const runStream = await openai.beta.threads.runs.stream(thread, {
          assistant_id: assistantId,
        });
        const run = await forwardStream(runStream);
        return run;
      }
    );
  } catch (error) {
    console.error('Error in chat route:', error);
    return NextResponse.json({ error: 'Internal server error' }, { status: 500 });
  }
}
```
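The route above reads its configuration from the `X-OpenAI-*` request headers and expects a JSON body of `{ threadId, message }`. As a sketch of what a matching client request looks like (the helper name below is hypothetical, not part of the package):

```typescript
// Hypothetical helper: assembles a request matching what the route above expects.
// Only headers that are actually configured get attached.
interface AssistantConfig {
  apiKey?: string;
  organizationId?: string;
  projectId?: string;
  assistantId?: string;
}

function buildAssistantRequest(
  message: string,
  config: AssistantConfig,
  threadId?: string
): { method: string; headers: Record<string, string>; body: string } {
  const headers: Record<string, string> = { 'Content-Type': 'application/json' };
  if (config.apiKey) headers['X-OpenAI-Key'] = config.apiKey;
  if (config.organizationId) headers['X-OpenAI-Organization'] = config.organizationId;
  if (config.projectId) headers['X-OpenAI-Project'] = config.projectId;
  if (config.assistantId) headers['X-OpenAI-Assistant'] = config.assistantId;
  // The route creates a new thread when threadId is undefined.
  return { method: 'POST', headers, body: JSON.stringify({ threadId, message }) };
}

// e.g. fetch('/api/ai-assistant', buildAssistantRequest('Hello!', { apiKey: '...' }))
```

Omitting a value from the config falls through to the corresponding `OPENAI_*` environment variable on the server, which is the safer place to keep credentials.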
## Environment Variables

For security, it's recommended to use environment variables for sensitive information:

```env
OPENAI_API_KEY=your-api-key
OPENAI_ORGANIZATION_ID=your-org-id
OPENAI_PROJECT_ID=your-project-id
OPENAI_ASSISTANT_ID=your-assistant-id
```

## Features
- 🎨 Beautiful and modern UI
- 🌓 Light and dark themes
- 📱 Responsive design
- 🔄 Real-time streaming responses
- 💬 Persistent chat threads
- 📍 Flexible positioning
- 🔒 Secure API key handling
## License
MIT
