@clad-ai/react
v0.1.7
React SDK to seamlessly inject ad-provided prompts into LLM workflows.
Overview
Clad provides a lightweight React SDK for seamless integration of dynamic monetization into LLM-driven chat workflows. It abstracts internal logic for tracking, injection, and display, offering a secure interface optimized for modern React applications — including full support for Next.js.
⚠️ This SDK is proprietary and intended for authorized Clad Labs clients only. Use or redistribution without permission is strictly prohibited.
Installation
npm install @clad-ai/react
Instantiating CladClient
Before calling any method, create an instance of CladClient with your API key:
import { CladClient } from '@clad-ai/react';
const cladClient = new CladClient({
apiKey: process.env.NEXT_PUBLIC_CLAD_API_KEY,
threshold: 3, // Optional: how many messages before calling API (default: 3)
filteredKeywords: ['starbucks', 'gambling', 'crypto', 'adult', 'politics', 'violence']
});
Parameters:
apiKey: string — API key provided by Clad. Contact [email protected] to get yours.
threshold: number — Optional. Number of messages before triggering an API call. Defaults to 3.
filteredKeywords: array — Optional. Keywords used to filter specific ads from being displayed.
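If several components need the client, one common pattern is to construct it once in a shared module and import that instance everywhere. This is a minimal sketch under that assumption; the file path and keyword list are illustrative, not SDK requirements.
// lib/clad.js (illustrative location, not required by the SDK)
import { CladClient } from '@clad-ai/react';
// Shared instance reused across the app (a common pattern, not an SDK requirement).
export const cladClient = new CladClient({
  apiKey: process.env.NEXT_PUBLIC_CLAD_API_KEY,
  threshold: 3,
  filteredKeywords: ['gambling', 'adult'], // illustrative filter list
});
Components can then import { cladClient } from '../lib/clad' instead of instantiating their own copy.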
Functions
getOrCreateUserId
Usage:
import { CladClient } from '@clad-ai/react';
const cladClient = new CladClient({
apiKey: process.env.NEXT_PUBLIC_CLAD_API_KEY,
threshold: 3,
});
const userId = cladClient.getOrCreateUserId();
Creates or retrieves a persistent UUID from localStorage for contextual tracking. Pass this into the getProcessedInput function as the userId; it is required.
Returns:
string — a UUIDv4 string
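Because the ID is persisted in localStorage, it can only be resolved in the browser. In a Next.js app, a sketch like the one below defers the call to a useEffect so it never runs during server rendering; the useCladUserId hook name is an assumption for illustration, not part of the SDK.
import { useEffect, useState } from 'react';
import { CladClient } from '@clad-ai/react';
const cladClient = new CladClient({ apiKey: process.env.NEXT_PUBLIC_CLAD_API_KEY });
// Hypothetical helper hook: resolves the persistent ID only after mount,
// because localStorage is unavailable during server-side rendering.
function useCladUserId() {
  const [userId, setUserId] = useState(null);
  useEffect(() => {
    setUserId(cladClient.getOrCreateUserId());
  }, []);
  return userId; // null until the component has mounted in the browser
}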
getProcessedInput
Usage:
import { CladClient } from '@clad-ai/react';
const cladClient = new CladClient({
apiKey: process.env.NEXT_PUBLIC_CLAD_API_KEY,
threshold: 3,
});
const userId = cladClient.getOrCreateUserId();
const response = await cladClient.getProcessedInput("I'm looking for shoes", userId, "true");
console.log(response.prompt); // Final prompt with or without ad
Parameters:
userInput: string — The user's chat input.
userId: string — Persistent user ID. Call getOrCreateUserId to obtain it; this is required.
discrete: 'true' | 'false' — Whether to mark ad content explicitly.
Returns:
{
prompt: string;
promptType: 'clean' | 'injected';
link: string;
discrete: 'true' | 'false';
adType?: string;
image_url?: string;
}
The function returns processed input only when engagement thresholds are met based on user interaction patterns. Otherwise, it returns the original input.
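A typical way to consume this result is to branch on promptType after the call: send response.prompt to your model either way, and keep the response around only when an ad was injected so the AdCard below can render it. The sendToLLM helper in this sketch is a placeholder for your own model call, not part of the SDK.
const response = await cladClient.getProcessedInput(userInput, userId, 'false');
// response.prompt is either the untouched input ('clean') or the input with sponsored context ('injected').
const completion = await sendToLLM(response.prompt); // placeholder for your own LLM request
if (response.promptType === 'injected') {
  setLatestResponse(response); // your own state setter, so <AdCard prompted={response} /> can render with the reply
}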
AdCard
Usage:
import { AdCard, CladClient } from '@clad-ai/react';
const cladClient = new CladClient({
apiKey: process.env.NEXT_PUBLIC_CLAD_API_KEY,
threshold: 3,
});
const userId = cladClient.getOrCreateUserId();
const response = await cladClient.getProcessedInput("I'm looking for shoes", userId, "true");
// Checks if the ad should be injected or not
{response.promptType === 'injected' && (
<div className="mt-4">
<AdCard
prompted={response}
className="my-2"
/>
</div>
)}
Props:
{
// map the getProcessedInput response directly to 'prompted' field
prompted: {
prompt: string;
promptType: 'clean' | 'injected';
link: string;
discrete: 'true' | 'false';
adType?: string;
image_url?: string;
};
className?: string; // customize UI to your needs
}
Custom component that renders an ad banner (image with CTA link). It only renders when promptType === 'injected' and a link exists.
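In a chat UI that renders a list of messages, one option is to store the getProcessedInput result alongside the assistant reply it influenced and render the AdCard under that message only. This is a sketch under that assumption; the cladResponse field on each message is an illustrative name, not something the SDK provides.
{messages.map((msg) => (
  <div key={msg.id} className="message">
    <p>{msg.text}</p>
    {/* msg.cladResponse is the stored getProcessedInput result for this message (illustrative field) */}
    {msg.cladResponse?.promptType === 'injected' && (
      <AdCard prompted={msg.cladResponse} className="my-2" />
    )}
  </div>
))}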
Full Example
This is a full example of our Clad SDK integration with a barebones 'chatbot' to remove distractions.
import React, { useState, useEffect } from 'react';
// 1. Clad Imports
import { AdCard, CladClient } from '@clad-ai/react';
export default function App() {
const [latestResponse, setLatestResponse] = useState(null); // store response globally
const [llmResponse, setLlmResponse] = useState("");
// 2. Initialize Clad Client
const cladClient = new CladClient({
apiKey: process.env.REACT_APP_CLAD_API_KEY,
threshold: 1,
});
// 3. Initialize User ID
const userId = cladClient.getOrCreateUserId();
useEffect(() => {
const run = async () => {
// 4. Call function to inject ad into inline text response
const response = await cladClient.getProcessedInput(
"Im looking for an e-bike reccommendation to get around the city", // This would be the text input from your client to your LLM
userId,
"true"
);
console.log("Response:", response);
console.log("Prompt:", response.prompt);
setLatestResponse(response);
const mockResponse = await callOpenAI(response.prompt); // mock call to LLM api; use `response` here, since `latestResponse` state has not updated yet
setLlmResponse(mockResponse);
};
run();
}, []);
return (
<div className="chat-container">
<div className="messages">
<p>{llmResponse}</p>
</div>
{/* 5. Conditionally render the AdCard if the response is injected */}
{latestResponse?.promptType === 'injected' && (
<div className="mt-4">
<AdCard prompted={latestResponse} className="my-2" />
</div>
)}
<div className="input-box">
<input
placeholder="Type your message..."
/>
<button>Send</button>
</div>
</div>
);
}
Support
For help, email us at [email protected]
This software is proprietary and confidential.
Unauthorized copying, distribution, or use of this software is strictly prohibited without express written permission from Clad Labs.
© 2025 Clad Labs. All rights reserved.
