@rahul_vendure/ai-chat-plugin
v0.1.8
AI-powered shopping assistant plugin for Vendure e-commerce. Adds an LLM-driven chat endpoint to your Vendure server with product search, vector search, cart management, order tracking, and checkout assistance.
Features
- LLM Chat — GPT-4o-mini powered shopping assistant via Vercel AI SDK
- 19 Built-in Tools — search, vector search, collections, price filter, add-to-cart, cart view/adjust, order history/tracking, checkout (addresses, shipping, payment), and promotions/coupons
- Streaming — Real-time streaming responses via POST /ai-assistant/stream (compatible with useChat() from @ai-sdk/react)
- Non-streaming — JSON response via POST /ai-assistant/chat
- Vector Search — pgvector-powered semantic search over products and collections
- Auto-Embedding — Automatic embedding generation on product/variant/collection create/update/delete
- Bulk Embedding — Admin GraphQL mutation triggerEmbedAll to embed all existing catalog data
- Embedding Entities — Registers TypeORM entities for vector storage (requires pgvector and a migration)
Requirements
- Vendure >=3.0.0
- PostgreSQL with the pgvector extension
- OpenAI API key
Installation
```bash
npm install @rahul_vendure/ai-chat-plugin
```

Prerequisites
- Install pgvector — The pgvector extension must be installed in your PostgreSQL instance. Enable it by running:

  ```sql
  CREATE EXTENSION IF NOT EXISTS vector;
  ```

- Run migrations — The plugin registers the entities with Vendure, but you must generate and run a database migration to create the tables and vector columns:

  ```bash
  npx vendure migrate
  ```
Setup
1. Add the plugin to your Vendure config
```ts
// vendure-config.ts
import { VendureConfig } from '@vendure/core';
import { AiAssistantPlugin } from '@rahul_vendure/ai-chat-plugin';

export const config: VendureConfig = {
  apiOptions: {
    // CORS — allow your storefront origin so the browser can call /ai-assistant/stream directly
    cors: {
      origin: ['http://localhost:3001', 'https://my-storefront.com'],
    },
  },
  plugins: [
    // ... other plugins
    AiAssistantPlugin.init({
      openaiApiKey: process.env.OPENAI_API_KEY!,
      embeddingModel: 'text-embedding-3-small', // optional, this is the default
      assetBaseUrl: 'https://my-vendure.com/assets/', // optional, auto-detected in dev
    }),
  ],
};
```

2. Embed your catalog
After starting the server, run the triggerEmbedAll mutation from the Admin API (GraphiQL at /admin-api):
```graphql
mutation {
  triggerEmbedAll {
    jobId
    message
  }
}
```

This generates vector embeddings for all products, variants, and collections in the background. You only need to run this once — after that, embeddings are automatically kept in sync on create/update/delete.
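If you prefer to kick this off from a script rather than GraphiQL, the mutation is an ordinary GraphQL POST to the Admin API. A minimal sketch, assuming a bearer token for admin auth (the URL and auth scheme are placeholders; adapt them to your deployment):

```typescript
// Hypothetical script: trigger the bulk-embedding job against the Admin API.
const ADMIN_API = 'https://my-vendure.com/admin-api'; // placeholder URL

async function triggerEmbedAll(authToken: string): Promise<{ jobId: string; message: string }> {
  const res = await fetch(ADMIN_API, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Assumes bearer-token auth is enabled; a session cookie works too.
      Authorization: `Bearer ${authToken}`,
    },
    body: JSON.stringify({
      query: 'mutation { triggerEmbedAll { jobId message } }',
    }),
  });
  const { data } = await res.json();
  return data.triggerEmbedAll; // { jobId, message } per the mutation above
}
```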
3. Add the frontend
Use the companion package @rahul_vendure/ai-chat-react for a drop-in React chat UI:
```bash
npm install @rahul_vendure/ai-chat-react ai @ai-sdk/react
```

```tsx
<VendureAiChat vendureUrl="https://my-vendure.com" />
```

See the @rahul_vendure/ai-chat-react README for full docs.
Configuration Options
| Option | Type | Required | Default | Description |
|--------|------|----------|---------|-------------|
| openaiApiKey | string | Yes | — | Your OpenAI API key |
| embeddingModel | string | No | 'text-embedding-3-small' | OpenAI embedding model to use |
| assetBaseUrl | string | No | Auto-detected | Base URL for product images (e.g. https://my-vendure.com/assets/) |
API Endpoints
POST /ai-assistant/stream
Streaming endpoint compatible with the Vercel AI SDK useChat() hook. Returns a UI Message Stream response.
The @rahul_vendure/ai-chat-react package connects to this endpoint automatically — you don't need to call it manually.
Request:
```json
{
  "message": "Show me laptops under $1000",
  "history": [
    { "role": "user", "content": "Hi" },
    { "role": "assistant", "content": "Hello! How can I help?" }
  ]
}
```

Auth: Pass a Bearer token in the Authorization header for logged-in user features (cart, orders).
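If you are not using @ai-sdk/react, the endpoint can still be consumed with plain fetch. A minimal sketch that accumulates the raw stream text using the documented request shape (the server URL is a placeholder, and @ai-sdk/react normally parses the stream framing for you):

```typescript
// Hypothetical direct consumer of the streaming endpoint.
async function streamChat(message: string, token?: string): Promise<string> {
  const res = await fetch('https://my-vendure.com/ai-assistant/stream', { // placeholder URL
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Optional Bearer token enables logged-in features (cart, orders).
      ...(token ? { Authorization: `Bearer ${token}` } : {}),
    },
    body: JSON.stringify({ message, history: [] }),
  });
  // Read the response body chunk by chunk and accumulate it as text.
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}
```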
POST /ai-assistant/chat
Non-streaming endpoint. Returns the full response as JSON after all tool calls complete.
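For typed storefront code, the example response documented in this section suggests a shape along these lines (field types are inferred from the sample, not taken from the plugin's published typings):

```typescript
// Inferred response types for POST /ai-assistant/chat (an assumption, not official typings).
interface ChatProduct {
  id: string;
  name: string;
  slug: string;
  price: number; // minor units, e.g. cents
  image: string;
  variantId: string;
}

interface ChatResponse {
  message: string;
  products: ChatProduct[];
  collections: unknown[];
  addToCartAction: unknown; // null when the assistant took no cart action
  activeOrder: unknown; // null when there is no active order
}

// Example usage with the documented sample response:
const sample: ChatResponse = {
  message: 'Here are some laptops I found!',
  products: [
    { id: '1', name: 'Gaming Laptop', slug: 'gaming-laptop', price: 89900, image: 'https://example.com/laptop.jpg', variantId: '42' },
  ],
  collections: [],
  addToCartAction: null,
  activeOrder: null,
};
```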
Response:
```json
{
  "message": "Here are some laptops I found!",
  "products": [
    { "id": "1", "name": "Gaming Laptop", "slug": "gaming-laptop", "price": 89900, "image": "https://...", "variantId": "42" }
  ],
  "collections": [],
  "addToCartAction": null,
  "activeOrder": null
}
```

CORS
The React frontend connects directly to the Vendure server from the browser (no proxy needed). You must configure CORS to allow your storefront origin:
```ts
export const config: VendureConfig = {
  apiOptions: {
    cors: {
      origin: ['https://my-storefront.com'],
    },
  },
};
```

In development, Vendure allows all origins by default, so no CORS config is needed for localhost.
Tools
The AI assistant has 19 built-in tools:
| Tool | Description |
|------|-------------|
| searchProducts | Keyword search using Vendure's fulltext search index |
| vectorSearch | Semantic/intent-based search using pgvector embeddings |
| getCollections | List product collections/categories |
| filterByPrice | Filter products by price range |
| addToCart | Add a product variant to the customer's cart |
| getActiveOrder | Get the current cart with all line items |
| adjustOrderLine | Change quantity or remove items from cart |
| getOrderHistory | List past orders for the customer |
| trackOrder | Get order status and tracking information |
| getAvailableCountries | List countries for shipping/billing |
| getEligibleShippingMethods | Get shipping options for active order |
| getEligiblePaymentMethods | Get available payment methods for active order |
| setShippingAddress | Set shipping address on active order |
| setBillingAddress | Set billing address on active order |
| setShippingMethod | Set shipping method on active order |
| getAvailablePromotions | List active promotions and deals (automatic + coupon-based) |
| applyCouponCode | Apply a coupon/promo code to the active order |
| removeCouponCode | Remove a coupon code from the active order |
| getCartDiscounts | View cart discounts and how close the customer is to qualifying for promotions |
Entities
The plugin adds three entities to your database:
- product_embedding — Vector embeddings for products
- variant_embedding — Vector embeddings for product variants
- collection_embedding — Vector embeddings for collections
Admin Dashboard AI Chat
The plugin also includes a built-in admin chat endpoint at POST /admin-ai-chat/chat with admin-specific tools:
| Tool | Description |
|------|-------------|
| searchProducts | Keyword search with admin details (stock levels, enabled status, variant count) |
| vectorSearch | Semantic search for products by intent |
| getProductDetails | Full product details with all variants, SKUs, stock |
| searchOrders | Search orders by code, customer, or state |
| searchCustomers | Search customers by name/email with order stats |
| getCollections | List/search collections with product counts |
To add the admin chat UI to your Vendure dashboard, install the companion package:
```bash
npm install @rahul_vendure/ai-chat-dashboard
```

See the @rahul_vendure/ai-chat-dashboard README for setup instructions.
Exported Services
For advanced use cases, you can inject these services in your own plugins:
```ts
import {
  AiChatService,              // Storefront chat service
  AdminAiChatService,         // Admin dashboard chat service
  EmbeddingService,
  ProductEmbeddingService,
  VariantEmbeddingService,
  CollectionEmbeddingService,
} from '@rahul_vendure/ai-chat-plugin';
```

License
AGPL-3.0 — see LICENSE for details.
