@brander/sdk
v0.3.18
BranderUX SDK - Transform AI chat into interactive interfaces
@brander/sdk
Official React SDK for embedding BranderUX AI-powered UI widgets in your application.
CLOSED BETA - Beta Access Required
This package is currently in closed beta. A beta API token is required to use the widget. Without a valid token, you'll see "Access denied" errors.
To request beta access: Sign in at branderux.com and submit a request through the contact form.
Installation
npm install @brander/sdk
# or
yarn add @brander/sdk
Note: Installation is public, but usage requires a beta access token.
Quick Start (Streaming - Recommended)
The simplest way to integrate BranderUX is to stream responses through the sseStream adapter:
import Brander, { sseStream } from "@brander/sdk";
function App() {
return (
<Brander
betaKey="bux_your_token"
projectId="your_project_id"
onQueryStream={(params) => sseStream("/api/agent", { params })}
/>
);
}
This one-liner connects to your backend endpoint that returns Server-Sent Events (SSE) carrying AG-UI events.
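As a rough sketch of the wire format such an endpoint produces — assuming, as is conventional for SSE, that each event is serialized as one JSON object on a `data:` line followed by a blank line (the event payload fields here are illustrative; your agent backend defines the actual ones):

```typescript
// Sketch of an SSE frame carrying one JSON-encoded AG-UI event.
// Event type names follow the AGUIEventType enum documented below;
// the timestamp payload here is illustrative.
type WireEvent = { type: string; [key: string]: unknown };

function toSSEFrame(event: WireEvent): string {
  // SSE convention: a `data:` line per event, terminated by a blank line.
  return `data: ${JSON.stringify(event)}\n\n`;
}

const frame = toSSEFrame({ type: "RUN_STARTED", timestamp: 0 });
// A full response body is simply these frames concatenated in order.
```

A streaming response is then just these frames written sequentially with `Content-Type: text/event-stream`, as the backend examples below do.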
Stream Adapters
BranderUX provides adapters to convert AI provider streams to AG-UI events:
sseStream - For Backend Endpoints (Most Common)
When your backend returns AG-UI events as SSE:
import { sseStream } from "@brander/sdk";
<Brander
onQueryStream={(params) => sseStream("/api/agent", { params })}
/>
anthropicStream - For Anthropic SDK
When calling Anthropic directly from the client:
import { Brander, anthropicStream } from "@brander/sdk";
import Anthropic from "@anthropic-ai/sdk";
const anthropic = new Anthropic({ apiKey: "..." });
<Brander
onQueryStream={async function*(params) {
const stream = anthropic.messages.stream({
model: "claude-sonnet-4-20250514",
messages: params.messages,
tools: params.tools.anthropic,
});
yield* anthropicStream(stream);
}}
/>
openaiStream - For OpenAI SDK
import { Brander, openaiStream } from "@brander/sdk";
import OpenAI from "openai";
const openai = new OpenAI({ apiKey: "..." });
<Brander
onQueryStream={async function*(params) {
const stream = await openai.chat.completions.create({
model: "gpt-4o",
messages: params.messages,
tools: params.tools.openai,
stream: true,
});
yield* openaiStream(stream);
}}
/>
geminiStream - For Google Gemini SDK
import { Brander, geminiStream } from "@brander/sdk";
import { GoogleGenerativeAI } from "@google/generative-ai";
const genAI = new GoogleGenerativeAI("...");
<Brander
onQueryStream={async function*(params) {
const model = genAI.getGenerativeModel({
model: "gemini-2.5-flash",
tools: params.tools.gemini,
});
const result = await model.generateContentStream({
contents: params.messages.map(m => ({
role: m.role === "assistant" ? "model" : "user",
parts: [{ text: m.content }],
})),
});
yield* geminiStream(result.stream);
}}
/>
Agent Framework Examples
BranderUX works seamlessly with popular agent frameworks. Since these frameworks support AG-UI natively, integration is straightforward.
LangGraph (Most Popular)
# Backend: Python + LangGraph
from langgraph.prebuilt import create_react_agent
from langchain_anthropic import ChatAnthropic
from ag_ui_langgraph import AGUIAdapter
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
app = FastAPI()
model = ChatAnthropic(model="claude-sonnet-4-20250514")
agent = create_react_agent(model, tools=[...])
@app.post("/api/agent")
async def agent_endpoint(request: dict):
return StreamingResponse(
AGUIAdapter.stream(agent, request),
media_type="text/event-stream"
)
// Frontend: React + BranderUX
import Brander, { sseStream } from "@brander/sdk";
<Brander
betaKey="bux_your_token"
projectId="your_project_id"
onQueryStream={(params) => sseStream("/api/agent", { params })}
/>
CrewAI (Multi-Agent)
# Backend: Python + CrewAI
from crewai import Agent, Task, Crew
from crewai.agui import CrewAGUIAdapter
@app.post("/api/agent")
async def agent_endpoint(request: dict):
crew = Crew(agents=[...], tasks=[...])
return StreamingResponse(
CrewAGUIAdapter.stream(crew),
media_type="text/event-stream"
)
Google ADK
# Backend: Python + Google ADK
from google.adk import Agent
from google.adk.agui import ADKAGUIMiddleware
agent = Agent(model="gemini-2.5-flash", tools=[...])
@app.post("/api/agent")
async def agent_endpoint(request: dict):
return ADKAGUIMiddleware.handle(agent, request)
Pydantic AI
# Backend: Python + Pydantic AI
from pydantic_ai import Agent
from pydantic_ai.agui import AGUIAdapter
agent = Agent("anthropic:claude-sonnet-4-20250514")
@app.post("/api/agent")
async def agent_endpoint(request: dict):
return AGUIAdapter.dispatch_request(agent, request)
Mastra (TypeScript)
// Backend: Node.js + Mastra
import { Agent } from "@mastra/core";
import { anthropic } from "@ai-sdk/anthropic";
const agent = new Agent({
name: "assistant",
model: anthropic("claude-sonnet-4-20250514"),
tools: { /* your tools */ },
});
export async function POST(request: Request) {
const { messages, tools } = await request.json();
const stream = await agent.stream({ messages, tools });
return new Response(stream, {
headers: { "Content-Type": "text/event-stream" },
});
}
Components
Brander - Inline embeddable widget component
BranderChatWidget - Floating chat widget with trigger button
Props
Required Props
betaKey: string
Your BranderUX beta API token (format: bux_dp_xxxxx). Get this from your beta access email.
projectId: string
Your BranderUX project ID. Get this from your BranderUX dashboard.
AI Handler (ONE of the following)
onQueryStream: StreamingCallback (Recommended)
Streaming AI handler using AG-UI events. Provides progressive UI loading for better UX.
type StreamingCallback = (
params: CustomerAIParams
) => AsyncIterable<AGUIEvent> | ReadableStream<AGUIEvent>;
Use with stream adapters: sseStream, anthropicStream, openaiStream, geminiStream.
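Because the callback only needs to return an AsyncIterable of AG-UI events, a handler can also be hand-rolled as a plain async generator when no adapter fits. A minimal sketch — the messageId and delta field names are assumptions based on the AG-UI protocol, and the reply text is hard-coded rather than produced by a model:

```typescript
// Sketch of a hand-rolled streaming handler. Events are emitted in the
// lifecycle order documented under "AG-UI Event Types" below.
type AGUIEventLike = { type: string; [key: string]: unknown };

async function* handleQuery(params: {
  messages: Array<{ role: string; content: string }>;
}): AsyncIterable<AGUIEventLike> {
  const messageId = "msg_1"; // hypothetical message id
  yield { type: "RUN_STARTED", timestamp: Date.now() };
  yield { type: "TEXT_MESSAGE_START", messageId, timestamp: Date.now() };
  // A real handler would yield one TEXT_MESSAGE_CONTENT per model token/chunk.
  yield { type: "TEXT_MESSAGE_CONTENT", messageId, delta: "Hello!", timestamp: Date.now() };
  yield { type: "TEXT_MESSAGE_END", messageId, timestamp: Date.now() };
  yield { type: "RUN_FINISHED", timestamp: Date.now() };
}
```

Such a generator can be passed directly as onQueryStream, e.g. `onQueryStream={handleQuery}`.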
onQuery: OnQueryCallback (Alternative)
Non-streaming AI handler. Returns complete response at once.
type OnQueryCallback = (
params: CustomerAIParams
) => Promise<CustomerAIResponse>;
CustomerAIParams
interface CustomerAIParams {
system?: string; // System instructions (A2UI protocol in flexible mode)
messages: Array<{
role: "user" | "assistant";
content: string;
}>;
tools?: MultiProviderTools; // Tools in all 3 provider formats
max_tokens?: number;
}
interface MultiProviderTools {
anthropic: AnthropicTool[]; // { name, description, input_schema }
openai: OpenAITool[]; // { type: "function", function: { name, parameters } }
gemini: GeminiTool[]; // { name, description, parameters }
}
Optional Props
conversations?: Conversation[]
Initial conversations to load. Use with activeConversationId and onConversationsChange for persistence.
activeConversationId?: string
ID of the currently active conversation.
onConversationsChange?: (state: ConversationsState) => void
Callback when conversations change. Use to save conversation state.
variant?: "hybrid" | "classic" | "chat"
Display variant. Default: "chat"
"chat": Chat-style interface with inline messages"classic": Clean site-focused view with conversation sidebar"hybrid": Full playground experience with scroll-snapping
defaultSidebarOpen?: boolean
Default state of conversation sidebar in classic variant. Default: true
width?: string
Widget width. Default: "100%"
height?: string
Widget height. Default: "600px"
className?: string
CSS class name for the container.
style?: React.CSSProperties
Inline styles for the container.
Conversation Persistence
The widget automatically manages conversation history. Optionally persist across sessions:
import Brander, { sseStream } from "@brander/sdk";
import { useState, useEffect } from "react";
function App() {
const [conversations, setConversations] = useState([]);
const [activeId, setActiveId] = useState(null);
// Load from storage on mount
useEffect(() => {
const saved = localStorage.getItem("brander_conversations");
if (saved) {
const state = JSON.parse(saved);
setConversations(state.conversations);
setActiveId(state.activeConversationId);
}
}, []);
return (
<Brander
betaKey="bux_your_token"
projectId="your_project_id"
onQueryStream={(params) => sseStream("/api/agent", { params })}
conversations={conversations}
activeConversationId={activeId}
onConversationsChange={(state) => {
setConversations(state.conversations);
setActiveId(state.activeConversationId);
localStorage.setItem("brander_conversations", JSON.stringify(state));
}}
/>
);
}
BranderChatWidget Component
Floating chat widget with customizable trigger button.
import { BranderChatWidget, sseStream } from "@brander/sdk";
function App() {
return (
<div>
<h1>My Application</h1>
<BranderChatWidget
betaKey="bux_your_token"
projectId="your_project_id"
onQueryStream={(params) => sseStream("/api/agent", { params })}
position="bottom-right"
>
<button style={{
padding: '12px 24px',
borderRadius: '50px',
backgroundColor: '#0066FF',
color: 'white',
border: 'none',
}}>
Chat with AI
</button>
</BranderChatWidget>
</div>
);
}
BranderChatWidget Props
Same as Brander plus:
children - Custom trigger button/element (required)
position - Widget position: "bottom-right" | "bottom-left" | "top-right" | "top-left" (default: "bottom-right")
offset - Offset from edges: { top?: number, bottom?: number, left?: number, right?: number }
widgetSize - Widget dimensions: { width?: string, height?: string }
defaultOpen - Initially open state (default: false)
onOpen / onClose - Open/close callbacks
showBackdrop - Show backdrop overlay (default: true)
backdropOpacity - Backdrop opacity 0-1 (default: 0.2)
closeOnBackdropClick - Close on backdrop click (default: true)
zIndex - z-index for widget (default: 9999)
animationDuration - Animation duration in ms (default: 300)
TypeScript Support
Full TypeScript support included:
import Brander, {
// Components
BranderChatWidget,
// Types
BranderProps,
BranderChatWidgetProps,
CustomerAIParams,
CustomerAIResponse,
// Multi-provider tools
MultiProviderTools,
AnthropicTool,
OpenAITool,
GeminiTool,
// Conversations
Conversation,
ConversationsState,
QueryHistoryItem,
// AG-UI Streaming
AGUIEvent,
AGUIEventType,
StreamingCallback,
ToolCallStartEvent,
ToolCallArgsEvent,
ToolCallEndEvent,
RunErrorEvent,
// Stream adapters
sseStream,
anthropicStream,
anthropicToAGUI,
openaiStream,
openaiToAGUI,
geminiStream,
geminiToAGUI,
} from "@brander/sdk";AG-UI Event Types
enum AGUIEventType {
RUN_STARTED = "RUN_STARTED",
RUN_FINISHED = "RUN_FINISHED",
RUN_ERROR = "RUN_ERROR",
TEXT_MESSAGE_START = "TEXT_MESSAGE_START",
TEXT_MESSAGE_CONTENT = "TEXT_MESSAGE_CONTENT",
TEXT_MESSAGE_END = "TEXT_MESSAGE_END",
TOOL_CALL_START = "TOOL_CALL_START",
TOOL_CALL_ARGS = "TOOL_CALL_ARGS",
TOOL_CALL_END = "TOOL_CALL_END",
}
interface ToolCallStartEvent {
type: AGUIEventType.TOOL_CALL_START;
toolCallId: string;
toolCallName: string;
timestamp: number;
}
interface ToolCallArgsEvent {
type: AGUIEventType.TOOL_CALL_ARGS;
toolCallId: string;
delta: string; // Partial JSON
timestamp: number;
}
interface ToolCallEndEvent {
type: AGUIEventType.TOOL_CALL_END;
toolCallId: string;
timestamp: number;
}
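Since the delta field carries partial JSON, a consumer reassembles tool-call arguments by buffering deltas per toolCallId and parsing the buffer when TOOL_CALL_END arrives. A minimal sketch (the event literals and tool name are illustrative):

```typescript
// Sketch: reassemble streamed tool-call arguments from AG-UI events.
type ToolEvent =
  | { type: "TOOL_CALL_START"; toolCallId: string; toolCallName: string }
  | { type: "TOOL_CALL_ARGS"; toolCallId: string; delta: string }
  | { type: "TOOL_CALL_END"; toolCallId: string };

function collectToolArgs(events: ToolEvent[]): Record<string, unknown> {
  const buffers: Record<string, string> = {};
  const parsed: Record<string, unknown> = {};
  for (const e of events) {
    if (e.type === "TOOL_CALL_START") buffers[e.toolCallId] = "";
    else if (e.type === "TOOL_CALL_ARGS") buffers[e.toolCallId] += e.delta;
    else parsed[e.toolCallId] = JSON.parse(buffers[e.toolCallId]); // TOOL_CALL_END
  }
  return parsed;
}

const args = collectToolArgs([
  { type: "TOOL_CALL_START", toolCallId: "t1", toolCallName: "get_weather" },
  { type: "TOOL_CALL_ARGS", toolCallId: "t1", delta: '{"city":' },
  { type: "TOOL_CALL_ARGS", toolCallId: "t1", delta: '"Paris"}' },
  { type: "TOOL_CALL_END", toolCallId: "t1" },
]);
// args.t1 is now { city: "Paris" }
```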
type AGUIEvent =
| ToolCallStartEvent
| ToolCallArgsEvent
| ToolCallEndEvent
| RunErrorEvent
  | /* ... other events */;
Alternative: Non-Streaming Integration
For simple integrations that don't need streaming:
import Brander from "@brander/sdk";
function App() {
const handleQuery = async (params) => {
// BranderUX instructions are included in params.messages
const response = await fetch("/api/ai", {
method: "POST",
body: JSON.stringify(params),
});
return response.json();
};
return (
<Brander
betaKey="bux_your_token"
projectId="your_project_id"
onQuery={handleQuery}
/>
);
}
With Tools API
// Anthropic
const onQuery = async (params) => {
const response = await anthropic.messages.create({
model: "claude-sonnet-4-20250514",
messages: params.messages,
tools: params.tools.anthropic,
max_tokens: params.max_tokens || 4000,
});
return { content: response.content };
};
// OpenAI
const onQuery = async (params) => {
const response = await openai.chat.completions.create({
model: "gpt-4o",
messages: params.messages,
tools: params.tools.openai,
max_tokens: params.max_tokens || 4000,
});
const message = response.choices[0].message;
if (message.tool_calls) {
return {
content: message.tool_calls.map(tc => ({
type: "tool_use",
name: tc.function.name,
input: JSON.parse(tc.function.arguments),
})),
};
}
return { content: [{ type: "text", text: message.content }] };
};
// Gemini
const onQuery = async (params) => {
const model = genAI.getGenerativeModel({
model: "gemini-2.5-flash",
tools: [{ functionDeclarations: params.tools.gemini }],
});
// ... handle response
};
FAQ
Why do I see "Access denied"?
The @brander/sdk is currently in closed beta. You need a beta access token (starting with bux_dp_) to use it. Sign in at branderux.com and submit a request.
Which integration method should I use?
onQueryStream with sseStream - Recommended for most users. Best UX with progressive loading.
onQueryStream with provider adapters - For direct SDK usage without a backend.
onQuery - For simple integrations that don't need streaming.
How do I get beta access?
Sign in at branderux.com and submit a request through the contact form.
License
Private - Authorized users only
