ContentStack Agent SDK - Create Domain-Specific AI Chatbots Without Any Effort
This SDK lets you add an AI chatbot using just a React hook. Install the package, set up some basic configuration, and use the hook to send user queries and receive AI-powered responses (the AI uses your Contentstack content to answer user queries).
Installation
```bash
npm install contentstack-agent-sdk
# or
yarn add contentstack-agent-sdk
```
Usage
```tsx
import { useAgent } from "contentstack-agent-sdk";

const { response, loading, error, askStream, isStreaming } = useAgent({
  baseUrl: 'http://localhost:8080/api/v1/ask', // Replace with your backend URL
  llmApiKey: process.env.GOOGLE_API_KEY!, // Replace with your LLM API key. Drop the "!" if you are not using TypeScript
  contentstackApiKey: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY!, // Replace with your Contentstack API key. Drop the "!" if you are not using TypeScript
  llmProvider: 'google', // or 'groq', depending on your LLM provider
});
```
Quickstart
```tsx
'use client';

import { useState } from "react";
import { useAgent } from "contentstack-agent-sdk";

export default function Chatbot() {
  const { response, loading, error, askStream, isStreaming } = useAgent({
    baseUrl: "http://localhost:8080/api/v1/ask",
    llmApiKey: process.env.GOOGLE_API_KEY!,
    contentstackApiKey: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY!,
    llmProvider: "google",
  });

  const [input, setInput] = useState("");

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim()) return;
    await askStream(input);
    setInput("");
  };

  return (
    <div style={{ maxWidth: 500, margin: "0 auto", padding: 20 }}>
      <h2>Chatbot</h2>

      {/* Messages */}
      <div style={{ border: "1px solid #ddd", padding: 10, minHeight: 200 }}>
        {loading && <p>Thinking...</p>}
        {error && <p style={{ color: "red" }}>{error.message}</p>}
        {response && (
          <p>
            <strong>Bot:</strong> {response}
            {isStreaming && <span style={{ opacity: 0.5 }}> ⌛</span>}
          </p>
        )}
      </div>

      {/* Input */}
      <form onSubmit={handleSubmit} style={{ marginTop: 10 }}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
          style={{ width: "80%", padding: 8 }}
        />
        <button type="submit" disabled={loading} style={{ padding: 8 }}>
          Send
        </button>
      </form>
    </div>
  );
}
```
API Reference
useAgent(config)
Hook that connects your frontend to the backend AI service.
Config options:
| Key | Type | Required | Description |
| -------------------- | -------- | -------- | ------------------------------------------------ |
| baseUrl | string | ✅ | Backend endpoint to forward queries. |
| llmApiKey | string | ✅ | API key for your chosen LLM provider. |
| contentstackApiKey | string | ✅ | Contentstack API key used to fetch your content. |
| llmProvider | string | ✅ | LLM provider ("google" or "groq"). |
Note: Currently only two LLM providers are supported: "google" and "groq".
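For example, a minimal sketch of the same hook configured for Groq instead of Google (the GROQ_API_KEY environment variable name is an assumption; use whichever variable holds your Groq key):
```tsx
import { useAgent } from "contentstack-agent-sdk";

// Same config keys as above, pointed at Groq.
// GROQ_API_KEY is a placeholder env var name, not required by the SDK.
const { response, loading, error, askStream, isStreaming } = useAgent({
  baseUrl: "http://localhost:8080/api/v1/ask",
  llmApiKey: process.env.GROQ_API_KEY!,
  contentstackApiKey: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY!,
  llmProvider: "groq",
});
```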
Returns:
| Variable | Type | Description |
| ------------- | ---------------------------------- | --------------------------------------- |
| response | string | AI-generated response. |
| loading | boolean | Whether the request is in progress. |
| error | Error \| null | Error object if the request fails. |
| askStream | (query: string) => Promise<void> | Function to send user input. |
| isStreaming | boolean | True if AI response is still streaming. |
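If you want to keep a running transcript instead of showing only the latest answer, you can combine these return values with local state. The sketch below uses only the fields listed above; the Message shape, the useChatHistory helper name, and the isStreaming-based completion check are illustrative assumptions, not part of the SDK.
```tsx
'use client';

import { useEffect, useState } from "react";
import { useAgent } from "contentstack-agent-sdk";

// Illustrative message shape -- not part of the SDK.
type Message = { role: "user" | "bot"; text: string };

export function useChatHistory(config: Parameters<typeof useAgent>[0]) {
  const { response, loading, error, askStream, isStreaming } = useAgent(config);
  const [messages, setMessages] = useState<Message[]>([]);

  // Record the user's message, then forward the query to the agent.
  const send = async (query: string) => {
    setMessages((prev) => [...prev, { role: "user", text: query }]);
    await askStream(query);
  };

  // Once a response has finished streaming, append it to the transcript.
  useEffect(() => {
    if (response && !isStreaming) {
      setMessages((prev) => [...prev, { role: "bot", text: response }]);
    }
  }, [response, isStreaming]);

  return { messages, send, loading, error, isStreaming };
}
```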
