@revealbi/api
v1.0.5
@revealbi/api
A TypeScript/JavaScript client library for the Reveal SDK AI APIs. Provides a clean, type-safe interface for AI-powered insights, chat, and dashboard generation with real-time streaming support.
Features
- Type-Safe: Full TypeScript support with comprehensive type definitions
- AI-Powered: Insights, chat, and intelligent dashboard generation
- Real-time Streaming: SSE support with three consumption patterns (for-await, event listeners, aggregated result)
- Universal: Works in both browser and Node.js environments
- Tree-Shakeable: ESM modules for optimal bundle size
Installation
npm install @revealbi/api
Or use the CDN:
<script src="https://cdn.jsdelivr.net/npm/@revealbi/api/dist/index.umd.js"></script>
Quick Start
import { RevealSdkClient } from '@revealbi/api';

// Initialize once at app startup (e.g., in main.ts or App.tsx)
RevealSdkClient.initialize({
  hostUrl: 'https://your-api-server.com'
});

// Later, anywhere in your app:
const client = RevealSdkClient.getInstance();

// Get insights for a dashboard
const insight = await client.ai.insights.get({
  dashboardId: 'my-dashboard',
  type: 'summary',
});

console.log(insight.explanation);
API Structure
The client is organized by API namespaces:
- client.ai.insights - AI-powered dashboard and visualization insights
- client.ai.chat - Conversational AI for data exploration and dashboard generation
Initialization
import { RevealSdkClient } from '@revealbi/api';

// Initialize once at app startup
RevealSdkClient.initialize({
  hostUrl: 'https://your-server.com'
});

// Later, anywhere in your app:
const client = RevealSdkClient.getInstance();
UMD (Script Tag)
When loaded via CDN, all exports are available under the rv namespace:
<script src="https://cdn.jsdelivr.net/npm/@revealbi/api/dist/index.umd.js"></script>
<script>
  rv.RevealSdkClient.initialize({
    hostUrl: 'https://your-server.com'
  });
  const client = rv.RevealSdkClient.getInstance();
</script>
AI Insights API
Access via client.ai.insights
Generate AI-powered insights for dashboards and visualizations. Three insight types are available: 'summary', 'analysis', and 'forecast'.
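As an illustration only (the published typings are not assumed to export this alias), a small type guard can validate user-supplied strings against the three insight types before building a request:

```typescript
// Illustrative: mirrors the union used by InsightRequest.type; the library
// itself does not necessarily export an InsightType alias.
type InsightType = 'summary' | 'analysis' | 'forecast';

const INSIGHT_TYPES: readonly InsightType[] = ['summary', 'analysis', 'forecast'];

// Narrowing type guard: lets TypeScript treat a validated string as InsightType.
function isInsightType(value: string): value is InsightType {
  return (INSIGHT_TYPES as readonly string[]).includes(value);
}
```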
Non-Streaming (Default)
Returns Promise<InsightResponse>.
const client = RevealSdkClient.getInstance();

// Get summary for entire dashboard
const insight = await client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'summary',
});

console.log(insight.explanation);
Streaming
Add stream: true to the request to get an AIStream that yields events as they arrive.
Pattern 1: for-await (Full Control)
const stream = await client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'summary',
  stream: true,
});

for await (const event of stream) {
  switch (event.type) {
    case 'progress': console.log('Status:', event.message); break;
    case 'text': document.getElementById('output').textContent += event.content; break;
    case 'error': console.error('Error:', event.error); break;
  }
}
Pattern 2: Event Listeners (Simple UI Wiring)
const stream = await client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'summary',
  stream: true,
});

stream.on('progress', (message) => console.log('Status:', message));
stream.on('text', (content) => document.getElementById('output').textContent += content);
stream.on('error', (error) => console.error('Error:', error));

const result = await stream.finalResponse();
console.log('Complete:', result.explanation);
Pattern 3: Aggregated Result from Stream
const stream = await client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'summary',
  stream: true,
});

const result = await stream.finalResponse();
console.log(result.explanation);
Using Dashboard Objects
Pass a dashboard object directly instead of a dashboard ID:
const insight = await client.ai.insights.get({
  dashboard: revealView.dashboard, // RVDashboard object
  type: 'analysis',
});
Visualization-Level Insights
const insight = await client.ai.insights.get({
  dashboard: revealView.dashboard,
  visualizationId: 'sales-by-region-chart',
  type: 'summary',
});
Forecast
const insight = await client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'forecast',
  forecastPeriods: 12, // Default: 6
});
Specify LLM Model
const insight = await client.ai.insights.get({
  dashboardId: 'my-dashboard',
  type: 'summary',
  model: 'anthropic',
});
Request Parameters
interface InsightRequest {
  dashboard?: string | RVDashboard;          // Dashboard object or JSON string
  dashboardId?: string;                      // Dashboard ID
  visualizationId?: string;                  // Visualization ID for visualization-level insights
  type: 'summary' | 'analysis' | 'forecast'; // Required
  forecastPeriods?: number;                  // Forecast periods (default: 6)
  model?: string;                            // Override LLM model
  signal?: AbortSignal;                      // For request cancellation
  stream?: boolean;                          // Enable streaming (default: false)
}
Response Types
interface InsightResponse {
  explanation: string; // AI-generated explanation
}
When stream: true, the return type is AIStream<InsightResponse>:
| Method / Pattern | Description |
|---------|-------------|
| for await (const event of stream) | Iterate over events as they arrive |
| .on(event, handler) | Register event-specific listeners |
| .finalResponse() | Returns a promise that resolves with the complete InsightResponse |
| .abort() | Cancel the stream |
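To build intuition for how one object can expose all three consumption patterns at once, here is a hypothetical stand-in with the same surface (for-await, .on, .finalResponse(), .abort()). It is a sketch only, not the library's actual implementation, and it replays a fixed event list instead of an SSE connection:

```typescript
// Hypothetical stand-in for AIStream<T>; the real class ships in @revealbi/api.
type AIStreamEvent =
  | { type: 'progress'; message: string }
  | { type: 'text'; content: string }
  | { type: 'error'; error: string; details?: unknown };

class MockAIStream<T> {
  private listeners: Record<string, Array<(payload: any) => void>> = {};
  private aborted = false;

  constructor(private events: AIStreamEvent[], private result: T) {}

  // Pattern 1: iterate over raw events with for-await.
  async *[Symbol.asyncIterator]() {
    for (const event of this.events) {
      if (this.aborted) return;
      yield event;
    }
  }

  // Pattern 2: register event-specific listeners.
  on(type: AIStreamEvent['type'], handler: (payload: any) => void): this {
    (this.listeners[type] ?? (this.listeners[type] = [])).push(handler);
    return this;
  }

  // Pattern 3: drain the stream (notifying listeners) and resolve with the
  // aggregated final response.
  async finalResponse(): Promise<T> {
    for await (const event of this) {
      const payload =
        event.type === 'progress' ? event.message :
        event.type === 'text' ? event.content :
        event.error;
      for (const handler of this.listeners[event.type] ?? []) handler(payload);
    }
    return this.result;
  }

  abort(): void {
    this.aborted = true;
  }
}
```

Note that in this sketch finalResponse() also drives the listeners, which is why Pattern 2 awaits it after wiring handlers.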
Chat API
Access via client.ai.chat
Conversational AI interface for exploring data, generating dashboards, and modifying existing dashboards through natural language.
Non-Streaming (Default)
Returns Promise<ChatResponse>.
const response = await client.ai.chat.sendMessage({
  message: 'Show me sales trends for last quarter',
  datasourceId: 'my-datasource',
});

console.log(response.explanation);
if (response.dashboard) {
  loadDashboard(response.dashboard);
}
Streaming
Pattern 1: for-await
const stream = await client.ai.chat.sendMessage({
  message: 'Create a dashboard showing customer distribution by region',
  datasourceId: 'my-datasource',
  stream: true,
});

for await (const event of stream) {
  switch (event.type) {
    case 'progress': console.log('Status:', event.message); break;
    case 'text': document.getElementById('chat-message').textContent += event.content; break;
    case 'error': console.error('Error:', event.error); break;
  }
}
Pattern 2: Event Listeners
const stream = await client.ai.chat.sendMessage({
  message: 'Create a dashboard showing customer distribution by region',
  datasourceId: 'my-datasource',
  stream: true,
});

stream.on('progress', (message) => console.log('Status:', message));
stream.on('text', (content) => {
  document.getElementById('chat-message').textContent += content;
});
stream.on('error', (error) => console.error('Error:', error));

const result = await stream.finalResponse();
console.log('Complete:', result.explanation);
if (result.dashboard) {
  loadDashboard(result.dashboard);
}
Pattern 3: Aggregated Result
const stream = await client.ai.chat.sendMessage({
  message: 'Create a dashboard showing customer distribution by region',
  datasourceId: 'my-datasource',
  stream: true,
});

const result = await stream.finalResponse();
console.log(result.explanation);
if (result.dashboard) {
  loadDashboard(result.dashboard);
}
Dashboard Editing
Provide an existing dashboard for modification:
const response = await client.ai.chat.sendMessage({
  message: 'Add a date filter to this dashboard',
  datasourceId: 'my-datasource',
  dashboard: revealView.dashboard, // RVDashboard object or JSON string
});

if (response.dashboard) {
  loadDashboard(response.dashboard);
}
Reset Chat Context
await client.ai.chat.resetContext();
console.log('Conversation history cleared');
Request Parameters
interface ChatRequest {
  message: string;                  // User's natural language input (required)
  datasourceId?: string;            // Datasource identifier
  dashboard?: string | RVDashboard; // Dashboard JSON or RVDashboard object
  visualizationId?: string;         // Visualization ID for visualization-specific context
  intent?: string;                  // Intent for freeform LLM queries
  updateChatState?: boolean;        // Whether to update chat state
  model?: string;                   // Override LLM model
  signal?: AbortSignal;             // For request cancellation
  stream?: boolean;                 // Enable streaming (default: false)
}
Response Types
interface ChatResponse {
  explanation?: string; // AI-generated explanation
  dashboard?: string;   // Generated/modified dashboard JSON
  error?: string;       // Error message if request failed
}
Stream Events
Both insights and chat streaming use the same event types:
type AIStreamEvent =
  | { type: 'progress'; message: string }
  | { type: 'text'; content: string }
  | { type: 'error'; error: string; details?: unknown };
| Event Type | Description |
|------------|-------------|
| progress | Status messages during processing (e.g., "Creating a new dashboard") |
| text | Text fragments of the explanation as they are generated |
| error | Error information if processing fails |
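Because AIStreamEvent is a discriminated union on type, a switch statement narrows each case automatically. The describeEvent helper below is purely illustrative (it is not part of the library); the never check in the default branch makes the switch exhaustive, so a newly added event type becomes a compile-time error rather than a silently dropped case:

```typescript
// Same union as defined above; repeated here so the example is self-contained.
type AIStreamEvent =
  | { type: 'progress'; message: string }
  | { type: 'text'; content: string }
  | { type: 'error'; error: string; details?: unknown };

// Illustrative helper: maps each event to a display string.
function describeEvent(event: AIStreamEvent): string {
  switch (event.type) {
    case 'progress': return `status: ${event.message}`;
    case 'text': return event.content;
    case 'error': return `error: ${event.error}`;
    default: {
      const unreachable: never = event; // exhaustiveness check
      return unreachable;
    }
  }
}
```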
Common Patterns
Context Menu Integration
revealView.onMenuOpening = function (visualization, args) {
  if (args.menuLocation === $.ig.RVMenuLocation.Dashboard) {
    args.menuItems.push(new $.ig.RVMenuItem("Summary", null, async () => {
      const insight = await client.ai.insights.get({
        dashboard: revealView.dashboard,
        type: 'summary',
      });
      displayInsight(insight.explanation);
    }));
  }
  if (args.menuLocation === $.ig.RVMenuLocation.Visualization) {
    args.menuItems.push(new $.ig.RVMenuItem("Analyze This", null, async () => {
      const insight = await client.ai.insights.get({
        dashboard: revealView.dashboard,
        visualizationId: visualization.id,
        type: 'analysis',
      });
      displayInsight(insight.explanation);
    }));
  }
};
Building a Chat Interface with Streaming
async function sendChatMessage(userInput) {
  let currentMessage = '';

  const stream = await client.ai.chat.sendMessage({
    message: userInput,
    datasourceId: 'my-datasource',
    stream: true,
  });

  stream.on('progress', (message) => {
    showProgressIndicator(message);
  });
  stream.on('text', (content) => {
    currentMessage += content;
    updateStreamingUI(currentMessage);
  });
  stream.on('error', (error) => {
    showError(error);
  });

  const result = await stream.finalResponse();
  if (result.dashboard) {
    loadDashboard(result.dashboard);
  }
}
Streaming Display with Markdown
let buffer = '';

const stream = await client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'summary',
  stream: true,
});

stream.on('text', (content) => {
  buffer += content;
  document.getElementById('output').innerHTML = marked.parse(buffer);
});

const result = await stream.finalResponse();
Request Cancellation
const controller = new AbortController();

// Non-streaming
const promise = client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'summary',
  signal: controller.signal,
});

// Cancel after 5 seconds
setTimeout(() => controller.abort(), 5000);

// Streaming
const stream = await client.ai.chat.sendMessage({
  message: 'Analyze my data',
  datasourceId: 'my-datasource',
  stream: true,
  signal: controller.signal,
});

// Or abort the stream directly
stream.abort();
Error Handling
// Non-streaming
try {
  const insight = await client.ai.insights.get({
    dashboardId: 'sales-dashboard',
    type: 'summary',
  });
  displayInsight(insight.explanation);
} catch (error) {
  if (error.status) {
    console.error(`API Error ${error.status}:`, error.message);
  } else {
    console.error('Network error:', error.message);
  }
}

// Streaming
const stream = await client.ai.insights.get({
  dashboardId: 'sales-dashboard',
  type: 'summary',
  stream: true,
});

stream.on('text', (content) => appendToUI(content));
stream.on('error', (error) => {
  console.error('Insight generation failed:', error);
  showErrorMessage(error);
});

await stream.finalResponse();
Using with React
import { useState, useCallback } from 'react';
import { RevealSdkClient } from '@revealbi/api';

function DashboardInsights({ dashboardId }) {
  const [loading, setLoading] = useState(false);
  const [streamedText, setStreamedText] = useState('');
  const [error, setError] = useState(null);

  const fetchInsights = useCallback(async () => {
    setLoading(true);
    setError(null);
    setStreamedText('');
    try {
      const client = RevealSdkClient.getInstance();
      const stream = await client.ai.insights.get({
        dashboardId,
        type: 'summary',
        stream: true,
      });
      stream.on('text', (chunk) => {
        setStreamedText(prev => prev + chunk);
      });
      stream.on('progress', (message) => {
        console.log('Progress:', message);
      });
      await stream.finalResponse();
    } catch (err) {
      setError(err.message);
    } finally {
      setLoading(false);
    }
  }, [dashboardId]);

  return (
    <div>
      <button onClick={fetchInsights} disabled={loading}>
        {loading ? 'Loading...' : 'Get Insights'}
      </button>
      {error && <div className="error">{error}</div>}
      {streamedText && <div className="streaming">{streamedText}</div>}
    </div>
  );
}
Framework Integration
Angular
// main.ts
import { RevealSdkClient } from '@revealbi/api';

RevealSdkClient.initialize({
  hostUrl: 'https://your-server.com'
});

platformBrowserDynamic().bootstrapModule(AppModule);
React
// main.tsx
import { RevealSdkClient } from '@revealbi/api';

RevealSdkClient.initialize({
  hostUrl: 'https://your-server.com'
});

ReactDOM.createRoot(document.getElementById('root')!).render(<App />);
Vue
// main.ts
import { RevealSdkClient } from '@revealbi/api';

RevealSdkClient.initialize({
  hostUrl: 'https://your-server.com'
});

createApp(App).mount('#app');
API Reference
RevealSdkClient
| Method | Description |
|--------|-------------|
| RevealSdkClient.initialize({ hostUrl }) | Initialize the singleton instance |
| RevealSdkClient.getInstance() | Get the singleton instance |
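For readers unfamiliar with this shape, initialize/getInstance is a configured-singleton contract: getInstance fails until initialize has run, after which every caller shares one configured instance. A minimal sketch of that contract (not the library's actual implementation; the class and option names below are illustrative):

```typescript
// Hypothetical sketch of the initialize/getInstance contract.
interface ClientOptions {
  hostUrl: string;
}

class SketchClient {
  private static instance: SketchClient | undefined;

  // Private constructor: instances are only created via initialize().
  private constructor(public readonly hostUrl: string) {}

  static initialize(options: ClientOptions): void {
    SketchClient.instance = new SketchClient(options.hostUrl);
  }

  static getInstance(): SketchClient {
    if (!SketchClient.instance) {
      throw new Error('Call initialize() before getInstance()');
    }
    return SketchClient.instance;
  }
}
```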
AI Operations
Insights (client.ai.insights)
| Method | Returns | Description |
|--------|---------|-------------|
| get(request) | Promise<InsightResponse> | Get AI insight (non-streaming) |
| get({ ...request, stream: true }) | Promise<AIStream<InsightResponse>> | Get AI insight (streaming) |
Chat (client.ai.chat)
| Method | Returns | Description |
|--------|---------|-------------|
| sendMessage(request) | Promise<ChatResponse> | Send a chat message (non-streaming) |
| sendMessage({ ...request, stream: true }) | Promise<AIStream<ChatResponse>> | Send a chat message (streaming) |
| resetContext() | Promise<void> | Clear conversation history |
Browser Compatibility
- Chrome/Edge 90+
- Firefox 88+
- Safari 14+
Requires: ES2020+, Fetch API, AbortController, Promise/async-await, Server-Sent Events (SSE)
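In environments that might lack these features, a runtime check can fail fast instead of erroring mid-request. The helper below is illustrative, not part of the library; it checks for the globals named above and uses ReadableStream as a proxy for fetch-based streaming support:

```typescript
// Illustrative: reports which required globals are absent from a scope.
// Pass globalThis in real use.
function missingFeatures(scope: Record<string, unknown>): string[] {
  const required = ['fetch', 'AbortController', 'Promise', 'ReadableStream'];
  return required.filter((name) => typeof scope[name] === 'undefined');
}

// Example: warn early rather than failing on the first request.
const missing = missingFeatures(globalThis as unknown as Record<string, unknown>);
if (missing.length > 0) {
  console.warn(`Unsupported environment, missing: ${missing.join(', ')}`);
}
```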
License
See LICENSE file for details.
