# @softheon/ai-assistant

v1.1.1 · Standalone AI assistant chat component for Angular applications
A floating AI chat assistant for Angular applications that can answer questions, navigate pages, highlight elements, and trigger custom actions in your UI.
The library provides the chat UI, API call logic, and built-in actions. You provide the system prompt, live app state, and any custom actions specific to your domain. The library knows nothing about your app — all domain knowledge is supplied through a clean interface, making it droppable into any Angular application.
## What it does
- Floating chat bubble — fixed-position, draggable, sits above your layout at z-index 9999
- Navigate — routes to any page in your app on the AI's instruction
- Highlight — draws a pulsing glow ring on any DOM element by CSS selector
- Click / ScrollTo — built-in actions for clicking elements and scrolling to them
- Custom actions — register handlers for any action type your app needs (open a modal, run a process, apply a filter, etc.)
- Proactive messages — push messages from your own services into the chat without the user typing anything
- Fully themeable — styled entirely via CSS custom properties; picks up your app's existing design tokens automatically
- Animated HAL avatar — a GSAP-powered character in the chat header that reacts to chat state, matches your accent colour, and flies to highlighted elements
## Installation

```bash
npm install @softheon/ai-assistant
```

### Peer dependencies and assets

The HAL avatar SVG must be served by your application at `/assets/hal/hal_bot.svg`. Add this to your `angular.json` build options:

```json
"assets": [
  {
    "glob": "**/*",
    "input": "node_modules/@softheon/ai-assistant/src/assets",
    "output": "/assets"
  }
]
```

The following must also be installed in your app:

```bash
npm install @ng-icons/core @ng-icons/phosphor-icons gsap
```

`@angular/common`, `@angular/core`, `@angular/router`, `@angular/forms`, and `rxjs` are expected to already be present.
## Quick Start

### 1. Create a context provider

This is the only required piece. It tells the library what your app is, what actions are available, and what's on screen right now.
```ts
// src/app/services/my-assistant-context.ts
import { Injectable } from '@angular/core';
import { Router } from '@angular/router';
import { AssistantContextProvider, AIAssistantConfig } from '@softheon/ai-assistant';

const SYSTEM_PROMPT = `You are an assistant for My App.
You help users navigate and find information.

## App Structure

### Dashboard (/dashboard)
- Shows summary and recent activity

### Orders (/orders)
- Lists all orders; each row has [data-order-id='VALUE']

## All Action Types

ALWAYS respond with raw JSON (no markdown, no code fences):
{ "emotion": "happy", "message": "...", "actions": [...] }

emotion must be exactly one of: success, happy, shock, evil
success — good news, task complete, nothing to do
happy — default; neutral or mildly positive
shock — large volume of work, overwhelming results
evil — errors, failures, bad news

navigate — { "type": "navigate", "route": "/orders", "label": "Go to Orders" }
highlight — { "type": "highlight", "selector": "[data-order-id='123']", "tooltip": "Order #123", "label": "Order #123" }

## Rules
- Only help with tasks in this app
- Never invent data values — only use IDs from the live context
- When returning multiple actions of the same type, ALWAYS include a descriptive "label" on each action using the entity name so the user can tell the buttons apart
- ALWAYS respond with raw JSON`;

@Injectable()
export class MyAssistantContext implements AssistantContextProvider {
  constructor(private readonly router: Router) {}

  getSystemPrompt(): string { return SYSTEM_PROMPT; }

  buildLiveContext(): string {
    return `\n## Live App State\nCurrent page: ${this.router.url}\n`;
  }

  getSuggestions(): string[] {
    return ['Take me to orders', 'What can you do?'];
  }

  getAIConfig(): AIAssistantConfig {
    return {
      provider: 'openai-compatible',
      baseUrl: 'https://your-api-host/api/v1/ai', // Armature backing API
    };
  }

  getWelcomeMessage(): string { return "Hi! How can I help?"; }

  getAppName(): string { return 'My App'; }
}
```

### 2. Register with your module
NgModule:

```ts
import { AiAssistantModule } from '@softheon/ai-assistant';
import { MyAssistantContext } from './services/my-assistant-context';

@NgModule({
  imports: [
    AiAssistantModule.forRoot({ contextProvider: MyAssistantContext }),
  ],
})
export class AppModule {}
```

Standalone app:

```ts
import { provideAiAssistant } from '@softheon/ai-assistant';
import { MyAssistantContext } from './services/my-assistant-context';

bootstrapApplication(AppComponent, {
  providers: [...provideAiAssistant({ contextProvider: MyAssistantContext })],
});
```

### 3. Add the component to your root layout

```html
<router-outlet></router-outlet>
<ai-assistant></ai-assistant>
```

That's it. The component is `position: fixed`, so it requires no layout changes.
## Secure API Proxy (Recommended for Production)
API keys must never be included in the Angular client — they would be visible in the browser bundle and network traffic. All requests go through a server-side proxy that injects the key.
A ready-to-use backing API is included in the Armature repository at AIAssistant/API/. It:
- Accepts requests from your Angular app (with CORS configured for your origins)
- Injects the configured API key before forwarding to the AI provider
- Enforces JWT bearer authentication
- Overrides the model with the server-configured value (clients cannot choose their own model)
- Streams the AI provider's response directly back
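The proxy's core transformation (ignore any client-supplied key, inject the server key, force the server's model) can be sketched as a pure function. This is illustrative TypeScript only, not the Armature implementation; the names `ServerConfig` and `buildUpstreamRequest` are hypothetical:

```typescript
// Illustrative sketch of a key-injecting proxy's request transform.
// NOT the Armature code — all names here are hypothetical.
interface ServerConfig {
  upstreamBaseUrl: string; // real AI provider endpoint
  apiKey: string;          // secret; lives only on the server
  model: string;           // server-enforced model
}

interface ChatRequest {
  model?: string; // whatever the client sent — always overridden below
  messages: { role: string; content: string }[];
  [key: string]: unknown;
}

function buildUpstreamRequest(client: ChatRequest, cfg: ServerConfig) {
  return {
    url: `${cfg.upstreamBaseUrl}/chat/completions`,
    headers: {
      'Content-Type': 'application/json',
      // key injected server-side; never present in the browser bundle
      Authorization: `Bearer ${cfg.apiKey}`,
    },
    // client payload forwarded, but the model is always the server's
    body: { ...client, model: cfg.model },
  };
}
```

A real proxy would additionally validate the caller's JWT and stream the upstream response body back unchanged.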
Client config when using the proxy:

```ts
getAIConfig(): AIAssistantConfig {
  return {
    provider: 'openai-compatible',
    baseUrl: 'https://your-api-host/api/v1/ai', // proxy base URL — library appends /chat/completions
    // apiKey omitted — the proxy handles it server-side
    model: '', // server overrides this; send empty string
  };
}
```

Local development config (proxy running at its default port):

```ts
baseUrl: 'https://localhost:58057/api/v1/ai'
```

CORS: Add your Angular app origin to `CorsOrigins` in `appsettings.local.json` on the API:

```json
{ "CorsOrigins": ["http://localhost:4200"] }
```

## Knowledge Base (Optional)
Configure server-side markdown files as authoritative knowledge base sources under `AIAssistant:KnowledgeBase:Sources` in `appsettings.json`:

```json
"AIAssistant": {
  "KnowledgeBase": {
    "Sources": [
      { "Path": "kb/faq.md", "Type": "qa", "Label": "FAQ" },
      { "Path": "kb/policy.md", "Type": "reference", "Label": "Policy" }
    ]
  }
}
```

`Path` is absolute or relative to the server's `ContentRootPath`. The Angular library fetches these automatically via `GET /api/v{n}/ai/knowledge-base` at init — no frontend configuration required. KB files are never served as public assets. KB-covered questions are answered regardless of topic restrictions in the system prompt — the formatted block includes response-hierarchy instructions that give KB content priority.
## Live Context
buildLiveContext() is called fresh before every message — it gives the AI its "eyes." Include the current route, the IDs of items currently on screen, and any relevant state. The more specific, the better.
```ts
buildLiveContext(): string {
  const lines = ['\n\n## Live App State\n'];
  lines.push(`Current page: ${this.router.url}`);
  if (this._orders.length) {
    lines.push(`\n### Orders on screen (${this._orders.length})`);
    this._orders.forEach(o =>
      lines.push(`  [data-order-id='${o.id}'] "${o.id}" — ${o.status}`)
    );
  }
  return lines.join('\n');
}
```

Subscribe to services in the constructor and cache results in private fields — buildLiveContext() is synchronous.

```ts
constructor(private readonly orders: OrderService) {
  this.orders.visible$.subscribe(o => this._orders = o);
}

private _orders: Order[] = [];
```

## Custom Actions
Register an AssistantActionHandler for anything beyond the four built-in actions (navigate, highlight, click, scrollTo):
```ts
import { Injectable } from '@angular/core';
import { AssistantActionHandler, AssistantAction } from '@softheon/ai-assistant';

@Injectable()
export class MyActionHandler implements AssistantActionHandler {
  readonly supportedTypes = ['openOrder', 'exportCsv'];

  constructor(private readonly orders: OrderService) {}

  execute(action: AssistantAction): boolean {
    switch (action['type']) {
      case 'openOrder':
        this.orders.select(action['orderId'] as string);
        return true;
      case 'exportCsv':
        this.orders.exportCsv();
        return true;
      default:
        return false;
    }
  }
}
```

Register alongside your context provider:

```ts
AiAssistantModule.forRoot({
  contextProvider: MyAssistantContext,
  actionHandlers: [MyActionHandler],
})
```

Document every custom action type in getSystemPrompt() — the AI will only use action types it has been told about:

```
openOrder — open an order's detail panel
{ "type": "openOrder", "orderId": "ORD-123", "label": "Open order" }
```

## Intercepting Messages (Tours & Local Commands)
Use interceptMessage() on your context provider to handle specific phrases without making an AI API call — great for guided tours, shortcuts, and easter eggs.
```ts
interceptMessage(text: string) {
  if (text.toLowerCase().includes('tour')) {
    return {
      handled: true as const,
      response: {
        content: "Sure! Let me walk you through the app.",
        actions: [{ type: 'startTour', tourId: 'main', label: 'Start tour' }],
      },
    };
  }
  return null; // proceed to AI as normal
}
```

Return null for anything you don't handle and it flows to the AI normally.
## data-* Attributes
For the AI to highlight or click a specific element (not just any element of a type), that element needs a data-* attribute, and its current value needs to appear in buildLiveContext().
Three things must all be present:

| HTML template | Live context output | System prompt |
|---------------|---------------------|---------------|
| `<div [attr.data-order-id]="order.id">` | `[data-order-id='ORD-1'] "ORD-1" — Pending` | `- Order row: [data-order-id='VALUE']` |

Common attributes to add:

```html
<!-- Repeating list items -->
<div *ngFor="let order of orders" [attr.data-order-id]="order.id">

<!-- Tabs -->
<button [attr.data-tab]="tab.id">{{ tab.label }}</button>

<!-- Filters -->
<button data-filter="pending">Pending</button>

<!-- Named action buttons -->
<button data-action="new-order">New Order</button>

<!-- Nav links -->
<a routerLink="/orders" data-nav="orders">Orders</a>
```

## Proactive Messages
Push messages into the chat from anywhere in your app — no user input needed:

```ts
import { Injectable, NgZone } from '@angular/core';
import { AiAssistantService } from '@softheon/ai-assistant';

@Injectable({ providedIn: 'root' })
export class NotificationService {
  constructor(
    private readonly assistant: AiAssistantService,
    private readonly zone: NgZone,
  ) {}

  notifyOrderShipped(orderId: string): void {
    // Use NgZone.run() when pushing from outside Angular's zone
    // (SSE callbacks, WebSockets, setTimeout, etc.)
    this.zone.run(() => {
      this.assistant.pushMessage({
        role: 'assistant',
        content: `Order ${orderId} has shipped!`,
        actions: [
          { type: 'navigate', route: '/orders', label: 'View Orders' },
          { type: 'highlight', selector: `[data-order-id='${orderId}']`, tooltip: 'This one' },
        ],
      });
    });
  }
}
```

## HAL Avatar
The library includes an animated GSAP-powered avatar (HAL) with two rendering modes:

- Full body — floats in the top-left corner of the open chat drawer, reacts to all chat states
- Chat bubble (`chat_bubble` mode) — head-only circle displayed as the floating chat button; clicking still opens the chat

### Avatar state machine
| Trigger | Avatar state |
|---------|-------------|
| User sends a message | thinking — squinted eyes, thought bubble with animated dots |
| msg.loading flips to false (content arrives) | talking for 5 s |
| After talking | Emotion from JSON response for 3 s |
| After emotion | defaultFlight — gentle hover, random eye blinks |
| Chat open, 5 min no sends | sleep |
| Chat closed | sleep |
| Element highlighted (avatarFlyToHighlight: true) | Avatar flies beside element, bothArmsUp during flight, rightArmUp on arrival |
| Highlight cleared | Avatar flies back, resumes defaultFlight |
### Avatar emotions from the AI response

Include "emotion" in every AI response by adding this to your system prompt:

```
ALWAYS respond with raw JSON (no markdown, no code fences):
{ "emotion": "happy", "message": "...", "actions": [...] }

emotion must be exactly one of:
success — good news, task complete, nothing to do
happy — default; neutral or mildly positive
shock — large volume of work, overwhelming results
evil — errors, failures, bad news
```

The library validates the value — any unrecognised string falls back to "happy". The AvatarEmotion type is exported from the package if you need it in TypeScript:

```ts
import { AvatarEmotion } from '@softheon/ai-assistant';
```

### Disabling avatar fly-to-highlight
By default the avatar flies to hover beside any highlighted element. Disable this per-app via getAIConfig():

```ts
getAIConfig(): AIAssistantConfig {
  return {
    provider: 'openai-compatible',
    baseUrl: '...',
    avatarFlyToHighlight: false, // avatar stays in corner; still raises arm as a reaction
  };
}
```

### Avatar colour

The avatar uses a hue-rotate approach anchored to the torso colour. Set accentColor in getAIConfig() and the entire avatar palette shifts to match that hue — no other configuration needed. Pass #1ab1b9 to get the original teal palette exactly.
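Conceptually, the shift is the hue difference between your accent colour and the base teal `#1ab1b9`. The sketch below illustrates that idea only; it is not the library's internal code, and `hexHue`/`hueRotation` are hypothetical names:

```typescript
// Illustrative: derive a hue-rotation angle from an accent colour,
// anchored to the base teal #1ab1b9. Not the library's actual code.
function hexHue(hex: string): number {
  const n = parseInt(hex.slice(1), 16);
  const r = (n >> 16) & 0xff, g = (n >> 8) & 0xff, b = n & 0xff;
  const max = Math.max(r, g, b), min = Math.min(r, g, b), d = max - min;
  if (d === 0) return 0; // grey — hue is undefined, treat as 0
  let h: number;
  if (max === r) h = ((g - b) / d) % 6;
  else if (max === g) h = (b - r) / d + 2;
  else h = (r - g) / d + 4;
  return (h * 60 + 360) % 360; // normalise to [0, 360)
}

// Degrees the base palette must rotate to land on the accent hue.
function hueRotation(accent: string): number {
  return Math.round((hexHue(accent) - hexHue('#1ab1b9') + 360) % 360);
}
```

Under this model, `hueRotation('#1ab1b9')` is 0 degrees, which is why passing the base teal reproduces the original palette exactly.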
## Theming

The simplest way to match your app's brand colour is via getAIConfig():

```ts
getAIConfig(): AIAssistantConfig {
  return {
    provider: 'openai-compatible',
    baseUrl: 'https://your-api-host/api/v1/ai',
    accentColor: '#6366f1', // bubble, send button, user messages, focus rings, highlight tooltip
  };
}
```

accentColor applies to: the chat bubble, send button, user message bubbles, focus rings, and the highlight tooltip label. The tooltip is rendered directly on the highlighted element (via a CSS ::after pseudo-element), so the library sets --ai-accent inline on the element when highlighting and removes it when the highlight clears.

Alternatively, if your app already defines --primary as a CSS custom property, the assistant picks it up automatically with no config change.

For full control over sizing or z-index, override CSS custom properties in your global stylesheet:

```css
ai-assistant {
  --ai-bubble-size: 3rem;
  --ai-drawer-width: 24rem;
  --ai-drawer-height: 36rem;
  --ai-z-index: 9998; /* Overlay is +1 */
}
```

If your app defines --background, --foreground, --border, --muted, --muted-foreground, or --input-background as CSS custom properties, the assistant picks those up automatically too.
## Configuration Reference

### AIAssistantConfig
| Property | Type | Default | Description |
|-------------------|----------|---------|-------------------------------------------------|
| provider | string | — | 'openai' or 'openai-compatible' |
| baseUrl | string | — | Backing API base URL (library appends /chat/completions) |
| model | string? | — | Model name. Omit when using the Armature backing API — the server enforces its own model. |
| maxTokens | number | 1200 | Max response length |
| temperature | number | 0.4 | Creativity (0 = consistent, 1 = varied) |
| maxHistoryDepth | number | 15 | Conversation turns kept in context |
| accentColor | string | — | Primary brand colour (hex, rgb, hsl). Falls back to the app's --primary CSS variable, then the built-in default. |
| phiScrubbing | object | on | PHI scrubbing config — see PHI scrubbing |
| debug | boolean | false | Log the full LLM request payload to the console before each call. Never enable in production. |
| avatarFlyToHighlight | boolean | true | When false, avatar stays anchored in the header and raises its arm instead of flying to highlighted elements. |
| showHighlightTooltips | boolean | false | Show a tooltip badge above highlighted elements. |
| resizable | boolean | false | Enable a drag handle at the top-left corner of the chat drawer for user-resizing. |
| showAccentBorder | boolean | true | Show an accent-colored glow ring around the open chat drawer. |
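For example, with maxHistoryDepth of 15, older turns simply fall out of the request context. A sketch of that trimming behaviour, illustrative only and not the library's actual implementation:

```typescript
// Illustrative sketch of conversation-history trimming under maxHistoryDepth.
// Not the library's actual code; Turn and trimHistory are hypothetical names.
interface Turn { role: 'user' | 'assistant'; content: string }

function trimHistory(history: Turn[], maxHistoryDepth: number): Turn[] {
  // Keep only the most recent N turns. The system prompt and live context
  // are rebuilt fresh for every call, so they are outside this window.
  return history.slice(-maxHistoryDepth);
}
```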
### AssistantContextProvider interface
| Method | Required | Description |
|-----------------------|----------|----------------------------------------------------------|
| getSystemPrompt() | yes | Static AI instructions (identity, pages, actions, rules) |
| buildLiveContext() | yes | Dynamic snapshot of current app state |
| getSuggestions() | yes | Suggested questions shown in the empty chat state |
| getAIConfig() | yes | AI model configuration |
| getWelcomeMessage() | no | Message shown when chat opens |
| getAppName() | no | App name shown in the drawer subtitle |
| interceptMessage() | no | Short-circuit the AI for known commands (tours, etc.) |
### AiAssistantService public methods

```ts
// Drawer
toggle() / open() / close()

// Chat
sendMessage(text: string): Promise<void>
pushMessage(msg): void
clearChat()
generateSuggestions(): Promise<string[]>

// Highlight
highlight(selector: string, tooltip: string): void
clearHighlight(): void

// Shift bubble when a side panel opens
setSettingsOpen(open: boolean): void
```

### Built-in action types
| Type | Required fields | What it does |
|-------------|-----------------|--------------------------------------------|
| navigate | route | Router.navigateByUrl(route) |
| highlight | selector | CSS glow ring, auto-clears after 6 seconds |
| click | selector | element.click() |
| scrollTo | selector | element.scrollIntoView() + highlight |
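The table above corresponds to a dispatch roughly like the following. This is a framework-free sketch with injected callbacks, not the library's internals (the real implementation uses Angular's Router and DOM services); `BuiltInDeps` and `runBuiltIn` are hypothetical names:

```typescript
// Illustrative dispatch for the four built-in action types.
// Dependencies are injected so the sketch stays framework-free.
interface BuiltInDeps {
  navigateByUrl: (route: string) => void;
  query: (selector: string) => { click(): void; scrollIntoView(): void } | null;
  highlight: (selector: string) => void;
}

function runBuiltIn(action: Record<string, string>, deps: BuiltInDeps): boolean {
  switch (action['type']) {
    case 'navigate':
      deps.navigateByUrl(action['route']);
      return true;
    case 'highlight':
      deps.highlight(action['selector']); // glow ring, auto-clears after 6 s
      return true;
    case 'click':
      deps.query(action['selector'])?.click();
      return true;
    case 'scrollTo':
      deps.query(action['selector'])?.scrollIntoView();
      deps.highlight(action['selector']); // scrollTo also highlights
      return true;
    default:
      return false; // unknown types are left to custom action handlers
  }
}
```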
### AvatarEmotion

```ts
type AvatarEmotion = 'success' | 'happy' | 'shock' | 'evil';
```

Returned as the emotion field in the AI's JSON response. Drives the HAL avatar animation before it switches to the talking preset. See HAL Avatar for full details.
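The documented fallback (any unrecognised emotion string becomes "happy") could be expressed as follows. This is an illustrative sketch, not the library's actual validation code; `coerceEmotion` is a hypothetical name:

```typescript
type AvatarEmotion = 'success' | 'happy' | 'shock' | 'evil';

// Illustrative coercion matching the documented fallback behaviour:
// any value outside the four known emotions becomes 'happy'.
function coerceEmotion(value: unknown): AvatarEmotion {
  const known: AvatarEmotion[] = ['success', 'happy', 'shock', 'evil'];
  return known.includes(value as AvatarEmotion) ? (value as AvatarEmotion) : 'happy';
}
```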
### ChatMessage
| Field | Type | Description |
|------------|------------------|--------------------------------------------------|
| id | string | Unique message ID |
| role | 'user' \| 'assistant' | Who sent the message |
| content | string | Message text |
| actions | AssistantAction[]? | Action buttons rendered below the message |
| loading | boolean? | When true, shows loading dots instead of content |
| emotion | AvatarEmotion? | Emotion signalled by the AI; drives avatar state |
