@yashchavanweb/cms-chat-agent-sdk
v1.0.5
Chat Agent SDK for Developers
The Chat Agent SDK provides developers with a plug-and-play solution to embed domain-specific chat agents powered by Contentstack data.
Set Up the Chat Agent SDK
Step 1: Install the SDK
Using npm:

```bash
npm install @yashchavanweb/cms-chat-agent-sdk
```

Using Yarn:

```bash
yarn add @yashchavanweb/cms-chat-agent-sdk
```

Step 2: Add Tailwind CSS (via CDN)

The SDK uses Tailwind CSS for styling. Add this to the <head> of your index.html:

```html
<script src="https://cdn.jsdelivr.net/npm/@tailwindcss/browser@4"></script>
```

Step 3: Configure the Chat Agent
Import the required libraries:
```tsx
import {
  ChatAgent,
  ChatAgentProvider,
  darkChatConfig,
  lightChatConfig,
} from "@yashchavanweb/cms-chat-agent-sdk";
```
Wrap your application with the ChatAgentProvider:
```tsx
const App = () => {
  return (
    <ChatAgentProvider config={chatConfig}>
      {/* Child Components */}
    </ChatAgentProvider>
  );
};

export default App;
```
Configure and add the Chat Agent:
```tsx
const App = () => {
  const chatConfig = {
    ...darkChatConfig,
    apiKey: "your_api_key",
  };
  return (
    <ChatAgentProvider config={chatConfig}>
      <ChatAgent config={chatConfig} /> {/* Chat Agent Component */}
    </ChatAgentProvider>
  );
};

export default App;
```
Run your application:
```bash
npm run dev
```

You'll now see a Chat Agent on your website.
Use the API Key in your frontend
To get the API key, refer to this article: View Here
Add the key to your .env file:
```env
VITE_CHAT_AGENT_API_KEY=your_api_key
```

Update your App.tsx:
```tsx
const App = () => {
  const chatConfig = {
    ...darkChatConfig,
    apiKey: import.meta.env.VITE_CHAT_AGENT_API_KEY,
  };
  return (
    <ChatAgentProvider config={chatConfig}>
      <ChatAgent config={chatConfig} />
    </ChatAgentProvider>
  );
};

export default App;
```

Now test your Chat Agent.
Platform Architecture
Overview
- Frontend SDK → React-based chat interface
- Middleware → Validates API keys, ensuring secure access
- Backend Server → Processes validated requests
- Contentstack (MCP Server) → Content management and delivery
- LLM Services → OpenAI, Gemini, Groq, Hugging Face, etc.
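The layers above can be pictured as a simple request pipeline; the sketch below is only illustrative (the function names and shapes are assumptions, not the SDK's real internals):

```typescript
// Illustrative request flow: SDK -> middleware -> backend -> Contentstack -> LLM.
type ChatRequest = { apiKey: string; message: string };

// Middleware: validate the API key before anything else runs.
function validateApiKey(req: ChatRequest): boolean {
  return req.apiKey.length > 0;
}

// Stub standing in for the Contentstack (MCP Server) layer.
function fetchContent(query: string): string {
  return `content matching "${query}"`;
}

// Stub standing in for the LLM service layer.
function generateReply(message: string, context: string): string {
  return `reply to "${message}" using ${context}`;
}

// Backend: process only validated requests.
function handleRequest(req: ChatRequest): string {
  if (!validateApiKey(req)) {
    return "401: invalid API key";
  }
  const context = fetchContent(req.message);
  return generateReply(req.message, context);
}
```

A request with an empty key is stopped at the middleware layer and never reaches the content or LLM stubs.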
Technology Stack
Frontend
Backend & AI/ML
DevOps
Customization

The SDK ships with light and dark themes (lightChatConfig, darkChatConfig) and supports advanced customization. Below are examples of other customization options:
Examples
1. Dimensions
```tsx
const chatConfig = {
  ...lightChatConfig,
  width: "400px",
  height: "500px",
};
```

2. Borders

```tsx
const chatConfig = {
  ...lightChatConfig,
  borderRadius: "4rem",
};
```

3. Shadows

```tsx
const chatConfig = {
  ...lightChatConfig,
  boxShadow: "0 25px 50px 50px rgba(1, 1, 1, 1)",
};
```

4. Agent Metadata

```tsx
const chatConfig = {
  ...lightChatConfig,
  botName: "Yash Website Chat Agent",
  botAvatarUrl: "https://cdn-icons-png.flaticon.com/512/4944/4944377.png",
  userAvatarUrl: "https://shorturl.at/xh1PO",
};
```

Note: There are even more customization options, which you can check out in the detailed documentation.
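The individual options above can also be combined into one config. A minimal sketch using only the fields shown in this README (the lightChatConfig stand-in and all values are placeholders; in your app, import the real theme from the SDK):

```typescript
// Placeholder standing in for the SDK's lightChatConfig export.
const lightChatConfig = { theme: "light" };

// Combine dimensions, borders, shadows, and agent metadata in one object.
const chatConfig = {
  ...lightChatConfig,
  width: "400px",
  height: "500px",
  borderRadius: "1rem",
  boxShadow: "0 10px 25px rgba(0, 0, 0, 0.2)",
  botName: "Docs Chat Agent",
  botAvatarUrl: "https://example.com/bot.png",
  userAvatarUrl: "https://example.com/user.png",
};
```

Because each option is just a property on the config object, later keys override anything the spread theme sets.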
Model Toggle
To switch models, just add the provider and model to the chat config:

```tsx
const chatConfig = {
  ...lightChatConfig,
  borderRadius: "4rem",
  provider: "openai",
  model: "gpt-5",
};
```

Unique Features
- 🎙️ Voice input & output support
- 💾 Save chat agent responses
- ⚡ Choose between streaming or REST responses
- 🚦 Built-in rate limiting per user
- 🔀 Toggle between multiple providers & LLM models seamlessly
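The provider/model fields from the previous section are what drive the last feature. A sketch of toggling between two of the listed providers (the lightChatConfig stand-in and the Groq model name are placeholders):

```typescript
// Placeholder standing in for the SDK's lightChatConfig export.
const lightChatConfig = { theme: "light" };

// Build a config for whichever provider is currently selected.
function configFor(provider: "openai" | "groq", model: string) {
  return { ...lightChatConfig, provider, model };
}

const openaiConfig = configFor("openai", "gpt-5");
const groqConfig = configFor("groq", "llama-3.1-8b-instant");
```

Swapping providers is then just a matter of passing a different config to ChatAgentProvider.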
FAQ
Q: Do I need a backend?
Ans: No, the SDK handles it for you. You just need to configure your credentials and follow the necessary import steps.
Q: Can I use this with frameworks other than React?
Ans: Currently, the SDK is optimized for React and NextJS. Support for more frameworks is planned.
Q: How fast are responses?
Ans: With cache hits, responses are typically 7-8 seconds faster than fresh queries.
For vague questions, it may take up to 12-15 seconds.
