# Meilisearch Pro for Medusa v2

`@byte5digital/meilisearch-pro` v0.0.8
A fully customizable, production-ready Meilisearch plugin for Medusa v2 that provides enterprise-grade search capabilities with advanced configuration options.
## Why Choose Meilisearch Pro?

### Universal Entity Support
Any Medusa entity can easily be configured and made searchable - from products and collections to orders, customers, and custom entities. No more limitations on what you can search.
### Superior to Other Meilisearch Plugins

- **Flexible Index Configuration**: Support for multiple indexes with custom settings per entity
- **Geo-location Support**: Built-in geospatial search capabilities
- **Custom Transformers**: Transform data before indexing with custom functions
- **Workflow Integration**: Seamless integration with Medusa's workflow system
### Key Benefits

- **Universal Entity Support**: Make any Medusa entity searchable with simple configuration
- **Zero Configuration**: Works out of the box with sensible defaults
- **Highly Customizable**: Configure every aspect of your search experience
- **Performance Optimized**: Efficient indexing and search operations
- **Production Tested**: Built and tested in real-world e-commerce environments
## Installation

```bash
yarn add @byte5digital/meilisearch-pro
```

## Configuration

### Basic Setup
```ts
// medusa-config.ts
import { MeilisearchProPluginOptions } from '@byte5digital/meilisearch-pro'

const plugins: MeilisearchProPluginOptions[] = [
  {
    resolve: '@byte5digital/meilisearch-pro',
    options: {
      config: {
        host: 'http://localhost:7700',
        apiKey: 'your-meilisearch-api-key',
      },
      settings: [
        {
          indexName: 'products',
          module: 'product',
          enabled: true,
          indexSettings: {
            searchableAttributes: ['title', 'description', 'tags'],
            filterableAttributes: ['category_id', 'price', 'status'],
            sortableAttributes: ['price', 'created_at'],
          },
        },
      ],
    },
  },
]

export default {
  plugins,
}
```

### Automatic Sync Scheduling
Control how and when your Meilisearch indexes are automatically synced:
```ts
// Default: sync every hour
{
  config: { /* ... */ },
  autoSync: "0 * * * *", // default value (optional to specify)
  settings: [ /* ... */ ],
}

// Custom schedule: daily at midnight
{
  config: { /* ... */ },
  autoSync: "0 0 * * *",
  settings: [ /* ... */ ],
}

// Disable automatic sync (manual trigger only)
{
  config: { /* ... */ },
  autoSync: false,
  settings: [ /* ... */ ],
}
```

### Cron Schedule Format
The `autoSync` option accepts standard cron expressions:

```
minute hour day month weekday
(0-59) (0-23) (1-31) (1-12) (0-7)
```

Common examples:

- `"0 * * * *"` - every hour (default)
- `"0 0 * * *"` - daily at midnight
- `"0 */6 * * *"` - every 6 hours
- `"0 0 * * 0"` - weekly on Sunday at midnight
- `"0 0 1 * *"` - monthly on the 1st at midnight
- `false` - disable automatic sync
**Important Notes:**

- When `autoSync: false`, the job remains manually triggerable via workflows
- Changes to `autoSync` require a server restart to take effect
- Invalid cron expressions will be caught during validation with helpful error messages
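The plugin performs this validation for you, so you normally never write it yourself. As a rough, hand-rolled sketch of what checking a five-field expression involves (the `isValidCron` helper below is hypothetical and not the plugin's API):

```typescript
// Hypothetical sketch of five-field cron validation. The plugin's real
// validator may differ (e.g. it may also accept named months or @-shortcuts).
const FIELD_RANGES: Array<[number, number]> = [
  [0, 59], // minute
  [0, 23], // hour
  [1, 31], // day of month
  [1, 12], // month
  [0, 7],  // weekday (0 and 7 both mean Sunday)
]

function isValidCron(expr: string): boolean {
  const fields = expr.trim().split(/\s+/)
  if (fields.length !== 5) return false
  return fields.every((field, i) => {
    const [min, max] = FIELD_RANGES[i]
    // Accept comma-separated lists of wildcards, values, ranges and steps,
    // e.g. "0", "1-5", "*", "*/6"
    return field.split(',').every((part) => {
      const match = /^(\*|(\d+)(-(\d+))?)(\/(\d+))?$/.exec(part)
      if (!match) return false
      const [, base, start, , end, , step] = match
      if (step !== undefined && Number(step) === 0) return false
      if (base === '*') return true
      const lo = Number(start)
      const hi = end !== undefined ? Number(end) : lo
      return lo >= min && hi <= max && lo <= hi
    })
  })
}
```

With this sketch, `isValidCron('0 */6 * * *')` accepts the step syntax from the examples above, while `isValidCron('99 * * * *')` is rejected because 99 is outside the minute range.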
### Advanced Configuration
```ts
// Advanced configuration with geo-location and custom transformers
const advancedConfig: MeilisearchProPluginOptions = {
  resolve: '@byte5digital/meilisearch-pro',
  options: {
    config: {
      host: 'https://your-meilisearch-instance.com',
      apiKey: 'your-api-key',
    },
    deleteUnusedIndexes: true,
    disableSearchRoute: true,
    autoSync: "0 0 * * *", // daily sync at midnight
    settings: [
      {
        indexName: 'products',
        module: 'product',
        enabled: true,
        fields: ['id', 'title', 'description', 'price', 'category_id'],
        indexSettings: {
          searchableAttributes: ['title', 'description', 'tags'],
          filterableAttributes: ['category_id', 'price', 'status'],
          sortableAttributes: ['price', 'created_at'],
        },
        useGeoLocation: {
          latitude: 'metadata.latitude',
          longitude: 'metadata.longitude',
        },
        batchSize: 50000,
        enableBatching: true,
        batchDelayMs: 0,
        transformer: (product) => ({
          ...product,
          searchableText: `${product.title} ${product.description}`,
        }),
      },
    ],
  },
}
```

## Features
### Universal Entity Support
Make any Medusa entity searchable with simple configuration:
```ts
settings: [
  {
    indexName: 'products',
    module: 'product',
    // Product search configuration
  },
  {
    indexName: 'orders',
    module: 'order',
    // Order search configuration
  },
  {
    indexName: 'customers',
    module: 'customer',
    // Customer search configuration
  },
  {
    indexName: 'collections',
    module: 'product_collection',
    // Collection search configuration
  },
  {
    indexName: 'custom_entities',
    module: 'your_custom_module',
    // Custom entity search configuration
  },
]
```

### Geo-location Search
Enable location-based search for your products:
```ts
useGeoLocation: {
  latitude: 'metadata.latitude',
  longitude: 'metadata.longitude',
}
```

### Custom Data Transformers
Transform your data before indexing:
```ts
transformer: (product) => ({
  ...product,
  searchableText: `${product.title} ${product.description}`,
  priceRange: product.price < 50 ? 'budget' : 'premium',
})
```

## Automatic `medusa_id` Field
All documents indexed in Meilisearch automatically include a `medusa_id` field that preserves the original Medusa entity ID. This is crucial for:

- **Orphan Detection**: Ensures accurate cleanup of deleted entities during sync operations
- **Custom ID Strategies**: Allows transformers to modify the document `id` (primary key) while maintaining a reference to the original entity
- **Data Integrity**: Provides a reliable link back to the source entity in Medusa
### How It Works
The `medusa_id` field is automatically added **after** your transformer runs, ensuring it always exists and cannot be accidentally removed or overwritten. The processing order is:

1. The original document is fetched from Medusa (contains `id`)
2. Geo-location processing runs (if enabled)
3. Your transformer runs (can modify any field, including `id`)
4. `medusa_id` is automatically set to the original `id` value
This means you can safely modify or completely change the `id` field in your transformer without losing the reference to the original Medusa entity.
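The documented order can be sketched as follows; `prepareDocument` and `applyGeoLocation` are illustrative names for this README, not the plugin's internals:

```typescript
// Illustrative sketch of the documented processing order, not the
// plugin's actual code.
type Doc = { id: string; medusa_id?: string; [key: string]: unknown }

function prepareDocument(
  original: Doc,
  transformer?: (doc: Doc) => Doc,
  applyGeoLocation?: (doc: Doc) => Doc
): Doc {
  const medusaId = original.id // 1. capture the original Medusa id first
  let doc = applyGeoLocation ? applyGeoLocation(original) : original // 2. geo step (if enabled)
  doc = transformer ? transformer(doc) : doc // 3. user transformer (may change `id`)
  return { ...doc, medusa_id: medusaId }     // 4. medusa_id is set last and always wins
}
```

Even a transformer that replaces `id` with a composite value still yields a document whose `medusa_id` equals the original entity `id`.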
### Example: Custom Composite IDs
```ts
// Your transformer can freely change the document ID
transformer: (product) => ({
  ...product,
  id: `prod_${product.id}_${product.variant_id}`, // custom composite ID
  // medusa_id will automatically be set to the original product.id
  // You don't need to (and cannot) set medusa_id yourself
})
```

### Important Notes
- ⚠️ **Do not set `medusa_id` in your transformer**: it will be overwritten automatically
- ⚠️ The `medusa_id` value is always the original entity `id` from Medusa, captured before any transformations
- ✅ Feel free to modify the `id` field in your transformer; `medusa_id` preserves the original value
- ✅ **Automatically added to `displayedAttributes`**: if you specify `displayedAttributes` in your index settings, `medusa_id` will be included automatically
### Automatic Display Attributes
When you configure `displayedAttributes` in your index settings, `medusa_id` is automatically added to ensure it's always available for orphan detection and data integrity:
```ts
indexSettings: {
  // You specify these attributes
  displayedAttributes: ['id', 'title', 'description', 'price'],
  // medusa_id is automatically added behind the scenes
  // Actual result: ['id', 'title', 'description', 'price', 'medusa_id']
}
```

If you don't specify `displayedAttributes`, all fields are displayed by default (Meilisearch's default behavior), including `medusa_id`.
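A minimal sketch of that merge rule (the `withMedusaId` helper is hypothetical, shown only to make the behavior concrete):

```typescript
// Hypothetical helper illustrating how medusa_id could be appended to a
// user-supplied displayedAttributes list. Not the plugin's actual code.
function withMedusaId(displayedAttributes?: string[]): string[] | undefined {
  // undefined keeps Meilisearch's default: all fields are displayed
  if (displayedAttributes === undefined) return undefined
  return displayedAttributes.includes('medusa_id')
    ? displayedAttributes
    : [...displayedAttributes, 'medusa_id']
}
```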
### Filtering by Medusa ID

Add `medusa_id` to your `filterableAttributes` if you need to filter or search by the original Medusa entity ID:
```ts
indexSettings: {
  filterableAttributes: ['medusa_id', 'category_id', 'status'],
  displayedAttributes: ['id', 'title', 'price'], // medusa_id automatically added
}
```

## Batch Processing Mode
Configure how entities are synced from your database to Meilisearch with three interconnected settings:
```ts
enableBatching: true, // Default: true
batchSize: 5000,      // Default: 100000
batchDelayMs: 0,      // Default: 0
```

### How These Settings Work Together
**`batchSize`** controls the batch size for database queries:

- With batching enabled (`enableBatching: true`): determines how many entities to fetch per batch during pagination
- With batching disabled (`enableBatching: false`): this setting is ignored; all entities are fetched in one query
- Lower values reduce memory usage; higher values can improve performance
- Example: `batchSize: 1000` fetches 1000 entities per batch
**`enableBatching`** enables or disables batch processing mode:

- `true` (default): processes entities in batches of `batchSize`, looping through all data with pagination
- `false`: fetches ALL entities in a single database query (ignores `batchSize`)
**`batchDelayMs`** adds a delay between batches (only applies when `enableBatching: true`):

- Adds a pause between processing each batch
- Useful for rate limiting and reducing load on the database and Meilisearch
- Example: `batchDelayMs: 100` adds a 100 ms delay between batches
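Taken together, the three settings describe a pagination loop roughly like this sketch (`fetchPage` and `indexBatch` are stand-ins for the plugin's database query and Meilisearch push, not its real API):

```typescript
// Illustrative batching loop, not the plugin's actual implementation.
type FetchPage<T> = (offset: number, limit: number) => Promise<T[]>

async function syncInBatches<T extends { id: string }>(
  fetchPage: FetchPage<T>,
  indexBatch: (docs: T[]) => Promise<void>,
  { batchSize = 100000, batchDelayMs = 0 } = {}
): Promise<string[]> {
  const seenIds: string[] = [] // tracked across all batches for orphan cleanup
  for (let offset = 0; ; offset += batchSize) {
    const batch = await fetchPage(offset, batchSize)
    if (batch.length === 0) break
    await indexBatch(batch)
    batch.forEach((d) => seenIds.push(d.id))
    if (batch.length < batchSize) break // short page means we reached the end
    if (batchDelayMs > 0) await new Promise((r) => setTimeout(r, batchDelayMs))
  }
  return seenIds // orphans can only be deleted once every batch has been seen
}
```

Because the full ID list exists only after the loop finishes, orphan deletion has to happen at the end, which matches the batching behavior described below.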
### Batching Mode vs Non-Batching Mode

**Batching Mode (`enableBatching: true`):**
- ✅ Processes entities in batches of `batchSize`
- ✅ Automatically loops through ALL entities using pagination
- ✅ Tracks entity IDs across all batches
- ✅ Deletes orphaned entities only after all batches complete
- ✅ Memory-efficient for large datasets
- ✅ Supports rate limiting via `batchDelayMs`
- ✅ Best for: datasets > 10k entities, production environments
**Non-Batching Mode (`enableBatching: false`):**

- ⚡ Fetches ALL entities in a single database query
- ⚡ Faster for small datasets (< 10k entities)
- ⚠️ Higher memory usage
- ⚠️ No pagination or rate limiting
- ⚠️ `batchSize` is ignored
- ⚠️ Best for: small datasets only (< 10k entities)
### Configuration Examples
**Small dataset (< 10k entities) - fast sync:**

```ts
{
  enableBatching: false, // fetch everything at once
  // batchSize is ignored when batching is disabled
}
```

**Medium dataset (10k-100k) - memory efficient:**

```ts
{
  enableBatching: true,
  batchSize: 5000, // process 5000 entities per batch
  batchDelayMs: 0, // no delay between batches
}
```

**Large dataset (100k-1M) - optimized batching:**

```ts
{
  enableBatching: true,
  batchSize: 10000, // larger batches for better performance
  batchDelayMs: 0,
}
```

**Rate-limited environment - protect your infrastructure:**

```ts
{
  enableBatching: true,
  batchSize: 1000,   // smaller batches
  batchDelayMs: 100, // 100 ms pause between batches
}
```

### Recommended Settings by Dataset Size
| Dataset Size | enableBatching | batchSize | batchDelayMs | Use Case |
|--------------|----------------|------------|--------------|---------------------------|
| < 10k | false | (ignored) | 0 | Small datasets, fast sync |
| 10k - 100k | true | 1000-5000 | 0 | Medium datasets |
| 100k - 1M | true | 5000-10000 | 0 | Large datasets |
| > 1M | true | 10000 | 0-100 | Very large datasets |
| Rate-limited | true | 500-1000 | 100-200 | Constrained environments |
### Important Notes

⚠️ **Configuration placement**: these settings must be at the same level as `indexSettings`, not inside it:
```ts
{
  indexName: 'products',
  module: 'product',

  // ✅ Correct: at the root level of the setting
  enableBatching: true,
  batchSize: 5000,
  batchDelayMs: 100,

  // Meilisearch-specific settings
  indexSettings: {
    searchableAttributes: ['title', 'description'],
    filterableAttributes: ['category_id', 'price'],
    // ❌ Don't put enableBatching, batchSize, or batchDelayMs here
  },
}
```

## Workflow Integration
Sync your indexes using Medusa's workflow system:
```ts
import { syncMeilisearchProIndexesWorkflow } from '@byte5digital/meilisearch-pro'

// Trigger the sync workflow
await syncMeilisearchProIndexesWorkflow({
  indexes: [
    { module: 'product', indexName: 'products' },
  ],
})
```

## Development
### Prerequisites
- Node.js >= 20
- Medusa v2
- Meilisearch instance
### Local Development
For instructions on local development and testing, please refer to the Medusa plugin development documentation.
## About byte5
We're a development company based in Frankfurt, Germany — remote-friendly, open-minded, and tech-driven. Our team brings deep expertise in Node.js, MedusaJS, Laravel, Umbraco, and decentralized tech like IOTA. We collaborate with clients who care about clean code, scalable solutions, and long-term maintainability.
We contribute to open source, run Laravel DACH Meetups, and support developer communities across the DACH region. Our expertise in e-commerce platforms makes us the perfect partner for building robust, scalable solutions.
If you love building smart solutions with real impact — we should talk.
## Support
Built with 🩵 by byte5
