@simpelconstructiontech/simpel-ai
v0.1.11
# Simpel SDK
A drop-in React chat component that renders AI responses as interactive UI — not plain text. Built on OpenUI Lang and shadcn/ui, the SDK turns LLM output into cards, charts, forms, tables, dialogs, and 45+ other components, streamed in real time.
## Quick Start

### 1. Environment variables
Create a `.env.local` in your app:

```bash
NEXT_PUBLIC_SIMPEL_BACKEND_URL=https://your-convex-site.convex.site
NEXT_PUBLIC_SIMPEL_API_KEY=your-api-key
```

### 2. Import the CSS
The SDK ships two CSS files — both are required:
```css
/* globals.css */
@import "tailwindcss";

/* Theme tokens (OKLCH colors, radii, dark mode) + component utility classes */
@import "@simpelconstructiontech/simpel-ai/styles.css";

/* OpenUI component styles (Shell, Composer, Charts, etc.) */
@import "@simpelconstructiontech/simpel-ai/components.css";
```

**Next.js:** add `transpilePackages` to your config:
```ts
// next.config.ts
const nextConfig = {
  transpilePackages: [
    "@simpelconstructiontech/simpel-ai",
    "@openuidev/react-ui",
    "@openuidev/react-headless",
    "@openuidev/react-lang",
  ],
};

export default nextConfig;
```

### 3. Add the component
```tsx
import { SimpelChat } from "@simpelconstructiontech/simpel-ai";

export default function App() {
  return (
    <SimpelChat
      agentName="My Assistant"
      description="Ask me anything"
      logoUrl="/logo.png"
      conversationStarters={[
        { displayText: "Dashboard", prompt: "Build me an analytics dashboard" },
        { displayText: "Help", prompt: "What can you do?" },
      ]}
    />
  );
}
```

That's it. The component renders with a header, thread management, conversation starters, and a composer.
## Exports
Everything is exported from the package entry point. The CSS must be imported separately.
```ts
import {
  // Chat UI
  SimpelChat,
  // Standalone renderers (for custom layouts)
  GenUIRenderer,
  MarkdownView,
  SimpelRenderer,
  // Imperative streaming
  generate,
  // React hooks
  useGenerate,
  useSkills,
  // Tools & Data Providers
  defineTool,
  defineDataProvider,
  // Skills CRUD
  fetchSkills, loadSkills, createSkill, updateSkill, deleteSkill, generateSkillContent,
  // Theme
  ThemeProvider, useTheme,
  // System prompt (for backend reference)
  OPENUI_SYSTEM_PROMPT,
  // Action types
  BuiltinActionType,
} from "@simpelconstructiontech/simpel-ai";
```

### CSS Exports
| Export path | Contents |
|---|---|
| `@simpelconstructiontech/simpel-ai/styles.css` | Tailwind v4 theme tokens (OKLCH colors, border-radius, dark mode), utility classes, base layer rules |
| `@simpelconstructiontech/simpel-ai/components.css` | OpenUI component styles (Shell, Composer, Messages, Charts, all rendered DSL components) |
## SimpelChat Props
| Prop | Type | Default | Description |
|------|------|---------|-------------|
| agentName | string | "Simpel Chat" | Display name in the header and welcome screen |
| description | string | "Ask me anything" | Short description shown below the agent name on the welcome screen |
| logoUrl | string | — | URL to the bot logo image (served from your app's public folder) |
| conversationStarters | ConversationStarter[] | [] | Suggested prompts shown on the welcome screen |
| helpConversationStarters | ConversationStarter[] | — | Starters shown in "help" mode (falls back to conversationStarters) |
| dataProviders | DataProvider[] | — | Data providers resolved on mount. Their results become persistent context for the entire thread (see Data Providers vs Tools) |
| skills | string[] | — | Skill names to activate for every message (must match control center names) |
| skillTags | string[] | — | Skill tags to activate for every message |
| defaultMode | "general" \| "help" | "general" | Initial chat mode. "general" uses /chat, "help" uses /rag-chat |
| showModeToggle | boolean | false | Shows a toggle to switch between general and help modes |
```ts
interface ConversationStarter {
  displayText: string; // Button label
  prompt: string;      // Message sent to the LLM
}
```

## Standalone Renderers
For custom layouts where you don't want the full chat UI, use the renderers directly.
### GenUIRenderer
Renders OpenUI Lang DSL into interactive shadcn/ui components:
```tsx
import { GenUIRenderer, BuiltinActionType } from "@simpelconstructiontech/simpel-ai";

<GenUIRenderer
  content={dslString}
  isStreaming={false}
  onAction={(event) => {
    if (event.type === BuiltinActionType.ContinueConversation) {
      // user clicked a follow-up
    }
  }}
/>
```

### MarkdownView
Renders markdown with GitHub-flavored markdown support:
```tsx
import { MarkdownView } from "@simpelconstructiontech/simpel-ai";

<MarkdownView content="# Hello **world**" />
```

### SimpelRenderer
Switches between GenUI and Markdown based on format:
```tsx
import { SimpelRenderer } from "@simpelconstructiontech/simpel-ai";

<SimpelRenderer content={text} format="genui" isStreaming={true} />
```

## Imperative Generation
Stream responses outside of the chat component — useful for headless use cases or custom UIs.
### generate()
Async generator that yields chunks as they stream in:
```ts
import { generate } from "@simpelconstructiontech/simpel-ai";

for await (const chunk of generate({
  prompt: "Build a dashboard",
  format: "genui",
  skills: ["Sales Coach"],
})) {
  console.log(chunk.content); // accumulated text
}
```

### useGenerate()
React hook wrapping generate() with state management:
```tsx
import { useGenerate, GenUIRenderer } from "@simpelconstructiontech/simpel-ai";

function MyComponent() {
  const { content, isStreaming, error, generate, cancel, reset } = useGenerate();
  return (
    <div>
      <button onClick={() => generate({ prompt: "Hello" })}>Generate</button>
      {isStreaming && <button onClick={cancel}>Cancel</button>}
      <GenUIRenderer content={content} isStreaming={isStreaming} />
    </div>
  );
}
```

### GenerateOptions
| Option | Type | Description |
|--------|------|-------------|
| prompt | string | Single user message (convenience) |
| messages | Array<{role, content}> | Full message history (overrides prompt) |
| format | "genui" \| "markdown" | Response format. Default: "genui" |
| threadId | string | Reuse an existing conversation thread |
| systemPrompt | string | Override the base system prompt on the backend |
| dataProviders | DataProvider[] | Data providers resolved before the first request. Results become persistent context (see Data Providers vs Tools) |
| dataContext | string | Pre-resolved data context string (for when you've already called resolveDataProviders yourself) |
| skills | string[] | Skill names to activate |
| skillTags | string[] | Skill tags to activate |
| customTools | CustomTool[] | Tool definitions for tool calling. Tools with execute handlers run automatically (see Data Providers vs Tools) |
| toolOptions | ToolExecutionOptions | Options for the tool execution loop (maxToolRounds, onToolCall, onToolResult) |
| responseSchema | Record<string, unknown> | JSON schema for structured output |
| segmentation | boolean | Enable YOLO segmentation for image analysis |
| image | string | Base64 data URI or URL of an image to include |
| signal | AbortSignal | Cancel the request |
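Since `generate()` is an async generator whose chunks carry the accumulated text (per `chunk.content` above), consuming it reduces to a `for await` loop. A self-contained sketch with a mocked stream — `mockGenerate` and `collectFinal` are illustrative names, not SDK exports:

```typescript
// Sketch: how a streaming async generator like generate() can be consumed.
// The real generate() streams from the backend instead of a local array.
type Chunk = { content: string };

async function* mockGenerate(pieces: string[]): AsyncGenerator<Chunk> {
  let accumulated = "";
  for (const piece of pieces) {
    accumulated += piece;            // chunk.content carries the accumulated text
    yield { content: accumulated };
  }
}

async function collectFinal(pieces: string[]): Promise<string> {
  let last = "";
  for await (const chunk of mockGenerate(pieces)) {
    last = chunk.content;            // same consumption pattern as with generate()
  }
  return last;
}
```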
## Skills
Skills are reusable prompt instructions managed via the control center. The SDK provides full CRUD access.
### useSkills() hook
```tsx
import { useSkills, generate } from "@simpelconstructiontech/simpel-ai";

function MyComponent() {
  const { skills, available, activeSkillNames, loading, create, update, remove, refresh } = useSkills();

  // Reference skills by camelCase name (no string typing):
  generate({ prompt: "...", skills: [available.salesCoach] });

  // Or pass all active skills:
  generate({ prompt: "...", skills: activeSkillNames });
}
```

### loadSkills() (imperative)
```ts
import { loadSkills } from "@simpelconstructiontech/simpel-ai";

const skills = await loadSkills();

skills.salesCoach     // "Sales Coach"
skills.all()          // ["Sales Coach", "Image Analyzer"]
skills.byTag("sales") // ["Sales Coach"]
```

## Data Providers vs Tools
The SDK has two ways to feed application-specific data to the AI: data providers and tools. They serve different purposes.
### Data Providers — static context, fetched once
A data provider runs automatically on mount, before the user sends any message. Its result becomes persistent background knowledge that the AI has for the entire thread. Every response, skill, and tool call is grounded in this data.
Use data providers for things the AI should always know: the current user, their projects, their role, their org settings.
```tsx
import { SimpelChat, defineDataProvider } from "@simpelconstructiontech/simpel-ai";

const userProvider = defineDataProvider({
  name: "currentUser",
  description: "The logged-in user's profile, role, and permissions",
  fetch: async () => {
    const res = await fetch("/api/me");
    return res.json();
  },
});

const projectsProvider = defineDataProvider({
  name: "activeProjects",
  description: "All projects this user has access to with their status",
  fetch: async () => {
    const res = await fetch("/api/projects");
    return res.json();
  },
});

<SimpelChat
  dataProviders={[userProvider, projectsProvider]}
  skills={["ProjectDashboard"]}
/>
```

All providers run in parallel on mount. The AI sees the resolved data as structured context on every turn:
```xml
<data name="currentUser" description="The logged-in user's profile, role, and permissions">
{"id": "u_123", "name": "Zach", "role": "admin", "org": "Acme"}
</data>
<data name="activeProjects" description="All projects this user has access to with their status">
[{"id": "p_1", "name": "Site Alpha", "status": "active"}, ...]
</data>
```

### Tools — dynamic lookups, called by the AI
A tool is a function the AI decides to call during the conversation, based on what the user asks. Tools are on-demand — the AI reasons "I need to look something up" and invokes the function. The result comes back and the AI continues.
Use tools for things that depend on the conversation: searching, creating records, looking up specific items.
```ts
import { defineTool } from "@simpelconstructiontech/simpel-ai";

const searchTool = defineTool({
  name: "searchDocuments",
  description: "Search project documents by keyword",
  parameters: {
    type: "object",
    properties: { query: { type: "string" } },
    required: ["query"],
  },
  execute: async (args) => {
    const res = await fetch(`/api/documents/search?q=${encodeURIComponent(args.query)}`);
    return JSON.stringify(await res.json());
  },
});
```

### Using them together
Data providers and tools complement each other. The data provider gives the AI background knowledge, and tools let it act on user requests within that context.
```tsx
<SimpelChat
  dataProviders={[userProvider, projectsProvider]} // AI knows WHO + WHAT
  customTools={[searchTool, createTaskTool]}       // AI can DO things when asked
  skills={["ProjectManager"]}                      // AI knows HOW to behave
/>
```

Example flow:
- Data providers resolve on mount — the AI now knows the user is "Zach", an admin on "Site Alpha"
- User asks: "Find safety docs for Site Alpha"
- The AI calls the `searchDocuments` tool with `{ query: "safety Site Alpha" }` — it already knows the project from context
- The tool returns results, and the AI renders them using the active skill's format
### Comparison
| | Data Provider | Tool |
|--|---------------|------|
| When it runs | Once, on mount (before any message) | On-demand, when the AI decides |
| Who triggers it | The SDK, automatically | The AI, during conversation |
| Result lifetime | Persistent — sent with every request in the thread | One-shot — used for the request that triggered it |
| Use case | Background knowledge (user, org, config, permissions) | Interactive actions (search, create, update, lookup) |
| Define with | defineDataProvider() | defineTool() |
| Pass via | dataProviders prop on SimpelChat / generate() | customTools prop on generate() |
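The data-provider column of this table can be sketched in plain TypeScript: fetch all providers in parallel and wrap each result in the `<data>` format shown earlier. `resolveProvidersSketch` is an illustrative helper, not the SDK's actual `resolveDataProviders`:

```typescript
// Sketch: resolve data providers in parallel and format the results into the
// <data name="..." description="...">...</data> context string the AI sees.
interface ProviderDef {
  name: string;
  description: string;
  fetch: () => Promise<unknown>;
}

async function resolveProvidersSketch(providers: ProviderDef[]): Promise<string> {
  // All providers run in parallel, as on SimpelChat mount
  const results = await Promise.all(providers.map((p) => p.fetch()));
  return providers
    .map(
      (p, i) =>
        `<data name="${p.name}" description="${p.description}">\n` +
        `${JSON.stringify(results[i])}\n</data>`,
    )
    .join("\n");
}
```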
### Data providers with generate()
Data providers also work with the imperative generate() function:
```ts
import { generate, defineDataProvider } from "@simpelconstructiontech/simpel-ai";

const userProvider = defineDataProvider({
  name: "currentUser",
  description: "Current user profile",
  fetch: async () => ({ name: "Zach", role: "admin" }),
});

for await (const chunk of generate({
  prompt: "Show me my dashboard",
  format: "genui",
  dataProviders: [userProvider],
  skills: ["Dashboard"],
})) {
  console.log(chunk.content);
}
```

## Shadow DOM / Web Components
Global @import rules don't pierce Shadow DOM boundaries. To embed the SDK inside a shadow root (for widgets, Web Components, or iframe-free isolation), load the pre-built CSS files directly:
### Option A: `<link>` elements
```ts
const shadow = host.attachShadow({ mode: "open" });

const link = document.createElement("link");
link.rel = "stylesheet";
link.href = new URL(
  "@simpelconstructiontech/simpel-ai/dist/styles.css",
  import.meta.url,
).href;
shadow.appendChild(link);

const linkComponents = document.createElement("link");
linkComponents.rel = "stylesheet";
linkComponents.href = new URL(
  "@simpelconstructiontech/simpel-ai/dist/components.css",
  import.meta.url,
).href;
shadow.appendChild(linkComponents);

const mount = document.createElement("div");
shadow.appendChild(mount);
// createRoot(mount).render(<SimpelChat ... />);
```

### Option B: Constructable Stylesheets (no FOUC)
```ts
const [stylesText, componentsText] = await Promise.all([
  fetch(new URL(
    "@simpelconstructiontech/simpel-ai/dist/styles.css",
    import.meta.url,
  )).then((r) => r.text()),
  fetch(new URL(
    "@simpelconstructiontech/simpel-ai/dist/components.css",
    import.meta.url,
  )).then((r) => r.text()),
]);

const sheet = new CSSStyleSheet();
sheet.replaceSync(stylesText + "\n" + componentsText);
shadow.adoptedStyleSheets = [sheet];
```

Note: When using Shadow DOM, set the `data-theme="dark"` attribute on the shadow root's inner container (not `<body>`) to enable dark mode. Or read `prefers-color-scheme` manually:

```ts
const isDark = matchMedia("(prefers-color-scheme: dark)").matches;
mount.setAttribute("data-theme", isDark ? "dark" : "light");
```
## Architecture
```
SimpelChat
  ThemeProvider ─── OS light/dark detection (sets data-theme attribute)
  OpenUIThemeProvider ─── OpenUI theme sync
  Shell.ShellStoreProvider ─── agent name + logo
  ChatProvider ─── message state, streaming, processMessage callback
  ThreadPersistence ─── auto-saves to localStorage after each response
  ChatInner
    ModeToggle ─── general/help mode switch (optional)
    ThreadDropdown ─── create, switch, delete threads
    CustomWelcomeScreen ─── logo, title, conversation starters (empty state)
    Shell.Messages ─── renders message list
      SimpelAssistantMessage ─── GenUIRenderer + SourceCards
    ScrollRow ─── quick-select conversation starters (non-empty state)
    Shell.Composer ─── text input
```

### Message Flow
- User types a message in the composer
- `ChatProvider` calls `processMessage`, which sends `POST /chat` (or `/rag-chat` in help mode) to the backend with `{ messages, threadId, format: "dsl", skills?, skillTags? }`
- The backend streams NDJSON lines in OpenAI `chat.completion.chunk` format
- `convexAdapter` parses the stream into headless events (`TEXT_MESSAGE_START`, `TEXT_MESSAGE_CONTENT`, `TEXT_MESSAGE_END`, `TOOL_CALL_*`)
- `GenUIRenderer` passes the DSL to the OpenUI `Renderer`, which maps it to shadcn/ui components in real time
- Hallucinated container names (Stack, VStack, Box, etc.) are silently rewritten to `Card` before parsing
- When the response finishes, `ThreadPersistence` saves the thread to localStorage
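The container-rewrite step above could be implemented along these lines — a sketch where the alias list and matching logic are assumptions, not the SDK's actual normalization in `genui-renderer.tsx`:

```typescript
// Sketch: rewrite hallucinated container names to Card before the DSL parses.
// The alias list and regex here are illustrative.
const CONTAINER_ALIASES = ["Stack", "VStack", "HStack", "Box"];

function normalizeContainers(dsl: string): string {
  // Only rewrite aliases that appear as component calls, e.g. "Stack("
  const pattern = new RegExp(`\\b(?:${CONTAINER_ALIASES.join("|")})\\(`, "g");
  return dsl.replace(pattern, "Card(");
}
```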
## Thread Management
- Threads persist in `localStorage` under the key `"simpel-threads"`
- Each thread stores: id, title, creation timestamp, full message history
- Titles are derived from the first user message (truncated to 60 chars)
- Users can create, switch, and delete threads from the dropdown
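The title rule can be sketched as a small helper — illustrative, not the SDK's `thread-store.ts` code, and the exact truncation style (hard cut vs. ellipsis) is an assumption:

```typescript
// Sketch: derive a thread title from the first user message,
// truncated to 60 characters.
function deriveThreadTitle(firstUserMessage: string, maxLength = 60): string {
  const trimmed = firstUserMessage.trim();
  return trimmed.length <= maxLength ? trimmed : trimmed.slice(0, maxLength);
}
```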
## Component Library
The SDK includes 45+ components rendered from the LLM's DSL output. You don't use them directly — the LLM generates them based on OPENUI_SYSTEM_PROMPT.
| Group | Components |
|-------|-----------|
| Content | Card, CardHeader, TextContent, MarkDownRenderer, Alert, Badge, Avatar, CodeBlock, Image, ImageBlock, AnnotatedImage, SegmentedImage, Progress, Separator |
| Tables | Table, Col |
| Charts (2D) | BarChart, LineChart, AreaChart, RadarChart, Series |
| Charts (1D) | PieChart, RadialChart, Slice |
| Charts (Scatter) | ScatterChart, ScatterSeries, Point |
| Forms | Form, FormControl, Label, Input, TextArea, Select, SelectItem, DatePicker, Slider, CheckBoxGroup, CheckBoxItem, RadioGroup, RadioItem, SwitchGroup, SwitchItem |
| Buttons | Button, Buttons |
| Follow-ups | FollowUpBlock, FollowUpItem |
| Layout | Tabs, TabItem, Accordion, AccordionItem, Carousel |
| Guides | StepBlock, Step |
| Data Display | TagBlock, Tag |
| Typography | Heading, Blockquote, InlineCode |
| Navigation | PaginationBlock |
| Overlays | DialogBlock, AlertDialogBlock, DrawerBlock |
| Calendar | CalendarBlock |
## Interactions
- Buttons trigger `ContinueConversation` (sends a follow-up message) or `OpenUrl` (opens a link)
- Forms validate inputs with rules (`required`, `email`, `min`, `maxLength`, `pattern`) and submit via conversation
- Follow-ups send pre-defined prompts as user messages
- Overlays (Dialog, Drawer, AlertDialog) contain nested component trees
- AnnotatedImage / SegmentedImage support interactive hover and click on detection results
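A validator for that rule set could look roughly like this — a sketch where the exact rule semantics (e.g. `min` as a numeric minimum) are assumptions, not the SDK's implementation:

```typescript
// Sketch: check a form value against the rule names listed above.
// Rule semantics here are assumptions.
interface Rules {
  required?: boolean;
  email?: boolean;
  min?: number; // assumed: numeric minimum
  maxLength?: number;
  pattern?: string;
}

function validate(value: string, rules: Rules): string[] {
  const errors: string[] = [];
  if (rules.required && value.trim() === "") errors.push("required");
  if (rules.email && !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value)) errors.push("email");
  if (rules.min !== undefined && Number(value) < rules.min) errors.push("min");
  if (rules.maxLength !== undefined && value.length > rules.maxLength) errors.push("maxLength");
  if (rules.pattern && !new RegExp(rules.pattern).test(value)) errors.push("pattern");
  return errors;
}
```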
## Backend Requirements
The SDK expects a backend at `NEXT_PUBLIC_SIMPEL_BACKEND_URL` that:

- Accepts POST to `/chat` (and optionally `/rag-chat` for help mode) with body:

  ```json
  {
    "messages": [{ "role": "user", "content": "..." }],
    "threadId": "optional-uuid",
    "format": "dsl",
    "skills": ["Skill Name"],
    "skillTags": ["tag"],
    "dataContext": "<data name=\"currentUser\" ...>...</data>"
  }
  ```

  The `dataContext` field is a pre-formatted string injected into the system prompt. It is sent with every request in the thread so the AI's responses stay grounded in the provider data.

- Uses `OPENUI_SYSTEM_PROMPT` as the system message — this teaches the model every available component, its props, and the DSL syntax.
- Streams NDJSON (newline-delimited JSON, not SSE). Each line is an OpenAI `chat.completion.chunk`:

  ```json
  {"choices":[{"delta":{"content":"Card([..."},"finish_reason":null}]}
  ```

- Authenticates via the `Authorization: Bearer <API_KEY>` header.
- Supports tool calls (optional) — the adapter handles `delta.tool_calls` and custom `{ type: "tool_call" }` events.
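The NDJSON contract can be exercised with a few lines of client code — a sketch of the kind of parsing an adapter like `convex-adapter.ts` performs, not its actual implementation:

```typescript
// Sketch: parse NDJSON lines of OpenAI chat.completion.chunk objects and
// accumulate the streamed delta content.
interface ChunkLine {
  choices: { delta: { content?: string }; finish_reason: string | null }[];
}

function accumulateNdjson(ndjson: string): string {
  let text = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const chunk = JSON.parse(line) as ChunkLine;
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}
```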
## Theming
The SDK auto-detects OS preference (light/dark) via `prefers-color-scheme` and sets a `data-theme` attribute on `<body>`. The CSS uses the OKLCH color space with shadcn/ui-compatible CSS variables.
If you need the theme provider separately:
```tsx
import { ThemeProvider, useTheme } from "@simpelconstructiontech/simpel-ai";

function MyApp() {
  return (
    <ThemeProvider>
      <MyContent />
    </ThemeProvider>
  );
}

function MyContent() {
  const theme = useTheme(); // "light" | "dark"
  return <p>Current theme: {theme}</p>;
}
```

## Project Structure
```
simpel-sdk/
├── src/
│   ├── index.tsx             # Public API — all exports
│   ├── system-prompt.ts      # Generates OPENUI_SYSTEM_PROMPT from component library
│   ├── app/
│   │   ├── globals.css       # Tailwind v4 + OKLCH theme variables
│   │   ├── layout.tsx        # Next.js root layout
│   │   └── page.tsx          # Demo page (FullScreen chat)
│   ├── components/
│   │   ├── simpel-chat.tsx     # Main chat component (threads, modes, welcome screen)
│   │   ├── genui-renderer.tsx  # Standalone GenUI DSL renderer
│   │   ├── markdown-view.tsx   # Standalone markdown renderer
│   │   ├── simpel-renderer.tsx # Format switcher (genui or markdown)
│   │   └── ui/                 # shadcn/ui primitives (accordion, card, table, etc.)
│   ├── hooks/
│   │   ├── use-system-theme.tsx # OS light/dark detection + ThemeProvider
│   │   ├── use-generate.ts      # React hook for imperative streaming
│   │   └── use-skills.ts        # React hook for skills CRUD
│   └── lib/
│       ├── generate.ts       # Imperative streaming generator function
│       ├── tools.ts          # Tool + DataProvider definitions, helpers, resolver
│       ├── skills.ts         # Skills API client (fetch, create, update, delete)
│       ├── convex-adapter.ts # NDJSON stream parser for Convex backend
│       ├── thread-store.ts   # localStorage thread persistence
│       └── shadcn-genui/     # OpenUI component library
│           ├── index.tsx     # Library definition, component groups, examples, rules
│           ├── unions.ts     # Zod unions for component child types
│           ├── action.ts     # Button action schemas
│           ├── helpers.ts    # Chart data transformers
│           ├── rules.ts      # Form validation schemas
│           └── components/   # One file per GenUI component (39 files)
│               ├── alert.tsx
│               ├── accordion.tsx
│               ├── annotated-image.tsx
│               ├── avatar.tsx
│               ├── badge.tsx
│               ├── button.tsx
│               ├── buttons.tsx
│               ├── calendar-block.tsx
│               ├── card-header.tsx
│               ├── carousel.tsx
│               ├── charts.tsx # All chart types (Bar, Line, Area, Pie, Radar, Radial, Scatter)
│               ├── checkbox-group.tsx
│               ├── code-block.tsx
│               ├── date-picker.tsx
│               ├── dialog-block.tsx
│               ├── drawer-block.tsx
│               ├── follow-up-block.tsx
│               ├── form.tsx
│               ├── form-control.tsx
│               ├── image.tsx
│               ├── input.tsx
│               ├── label.tsx
│               ├── markdown-renderer.tsx
│               ├── pagination-block.tsx
│               ├── progress.tsx
│               ├── radio-group.tsx
│               ├── segmented-image.tsx
│               ├── select.tsx
│               ├── separator.tsx
│               ├── slider.tsx
│               ├── step-block.tsx
│               ├── switch-group.tsx
│               ├── table.tsx
│               ├── tabs.tsx
│               ├── tag.tsx
│               ├── text-content.tsx
│               ├── textarea.tsx
│               └── typography.tsx
├── package.json
└── tsconfig.json
```

## Key Files Explained
| File | Purpose |
|------|---------|
| src/index.tsx | Single entry point — re-exports everything consumers need |
| src/components/simpel-chat.tsx | The main SimpelChat component. Composes ChatProvider, thread management, welcome screen, mode toggle, and message rendering |
| src/components/genui-renderer.tsx | Wraps the OpenUI Renderer with container alias normalization (rewrites hallucinated names like Stack to Card) and default action handling |
| src/lib/generate.ts | generate() async generator that streams from the backend. Resolves data providers, runs tool loops, and streams responses |
| src/lib/tools.ts | defineTool(), defineDataProvider(), resolveDataProviders() — type definitions and helpers for tools and data providers |
| src/lib/convex-adapter.ts | Parses the backend's NDJSON stream into headless chat events. Handles both text content and tool call streaming |
| src/lib/thread-store.ts | Simple localStorage wrapper for thread CRUD. Derives thread titles from the first user message |
| src/lib/skills.ts | Skills API client + loadSkills() which returns a proxy-based client for typed skill access by camelCase name |
| src/lib/shadcn-genui/index.tsx | Defines the full component library: component registrations, component groups, DSL examples, and generation rules |
| src/lib/shadcn-genui/unions.ts | Zod union types that define which components can be children of other components |
| src/system-prompt.ts | Generates OPENUI_SYSTEM_PROMPT from the component library spec — this is what the LLM needs to produce valid DSL |
