pim-import

TypeScript/Node.js library to bulk-import content from a PIM (Product Information Management) system into Contentful CMS, with integrations for Amazon S3, Algolia search, and Imgix image management.

Services

| Service    | Purpose                                      |
| ---------- | -------------------------------------------- |
| PIM        | Data source (products, catalogs, dictionary) |
| Contentful | CMS destination (CMA + CDA)                  |
| Amazon S3  | Asset and JSON snapshot storage              |
| Algolia    | Search index synchronization                 |
| Imgix      | Image asset management                       |


Installation

yarn add pim-import

Configuration

Each service can be initialized programmatically or via environment variables (prefixed with FPI_).

PIM

import { initPim } from "pim-import";

initPim({
  baseURL: process.env.FPI_PIM_BASE_URL, // required
  username: process.env.FPI_PIM_USERNAME,
  password: process.env.FPI_PIM_PASSWORD,
  timeout: 300, // seconds, default 300
  sslverify: true, // default true
});

Contentful

import { initContentful } from "pim-import";

initContentful({
  accessToken: process.env.FPI_CTF_CMA_ACCESS_TOKEN, // required
  spaceId: process.env.FPI_CTF_SPACE_ID, // required
  environmentId: process.env.FPI_CTF_ENVIRONMENT, // required
});

Amazon S3

import { initS3 } from "pim-import";

initS3({
  accessKeyId: process.env.FPI_AWS_ACCESS_KEY, // required
  secretAccessKey: process.env.FPI_AWS_SECRET_ACCESS_KEY, // required
  region: process.env.FPI_AWS_REGION, // required
  bucket: process.env.FPI_AWS_S3_BUCKET, // required
});

Environment variables reference

# PIM
FPI_PIM_BASE_URL=
FPI_PIM_USERNAME=
FPI_PIM_PASSWORD=
FPI_PIM_TIMEOUT=300
FPI_PIM_SSL_VERIFY=true

# Contentful
FPI_CTF_CMA_ACCESS_TOKEN=
FPI_CTF_SPACE_ID=
FPI_CTF_ENVIRONMENT=

# Amazon S3
FPI_AWS_ACCESS_KEY=
FPI_AWS_SECRET_ACCESS_KEY=
FPI_AWS_REGION=
FPI_AWS_S3_BUCKET=

# Algolia
FPI_ALGOLIA_APP_ID=
FPI_ALGOLIA_API_KEY=
FPI_ALGOLIA_INDEX_NAME_SUFFIX=
FPI_ALGOLIA_INDEX_PRODUCTS_NAME=

# PDF generation (AWS Lambda)
FPI_AWS_GEN_PDF_LAMBDA_URL=
FPI_TRIGGER_GEN_PDF_TASK_ID=
FPI_TRIGGER_SECRET_KEY=

# Notifications
FPI_TEAMS_NOTIFICATIONS_URL=   # Microsoft Teams incoming webhook

# Monitoring
SENTRY_DSN=
SENTRY_TRACE_SAMPLE_RATE=
LOGDNA_APP_NAME=
LOGDNA_KEY=

Import flow

A complete import follows these steps in order:

  1. Dictionary
  2. Taxonomies (categories, families, sub-families)
  3. Products
  4. Product relationships
  5. Audit
  6. Algolia reindex

1. Dictionary

Imports product field translations and icons from the /dictionary PIM endpoint.

Import field translations

Saves localized labels for all product fields to Contentful.

Required: FPI_PIM_BASE_URL, FPI_CTF_CMA_ACCESS_TOKEN, FPI_CTF_SPACE_ID, FPI_CTF_ENVIRONMENT

import { importDictionaryFields } from "pim-import";

await importDictionaryFields();

Import icons to S3

Uploads product field icons retrieved from PIM to Amazon S3.

Required: FPI_PIM_BASE_URL, FPI_AWS_*

import { importDictionaryIcons } from "pim-import";

await importDictionaryIcons();

Import product line taxonomy

import {
  importDictionaryProductLine,
  importDictionaryProductSubLine,
} from "pim-import";

await importDictionaryProductLine();
await importDictionaryProductSubLine();

Import all dictionary data at once

import { importDictionaryData } from "pim-import";

await importDictionaryData();

2. Taxonomies

Imports the catalog hierarchy from the /catalogs PIM endpoint.

Each catalog operation writes the following Contentful content types:

  • topicCategory — category with name, code, catalog, families
  • topicSubFamily — sub-family with name, code, catalog
  • pageContent + page — CMS pages for each category and sub-family

Import categories

import { importCategories } from "pim-import";

const result = await importCategories(
  "ARCHITECTURAL", // AvailableCatalogs: 'ARCHITECTURAL' | 'OUTDOOR' | 'DECORATIVE'
  0, // offset (default 0)
  -1, // limit, -1 = all (default -1)
  "", // optional S3 path to a pre-fetched JSON snapshot
);
// result: { completed: boolean, offset, limit, total, s3FilePath }

Import families and sub-families

import { importFamilies, importSubFamilies } from "pim-import";

await importFamilies("ARCHITECTURAL", 0, -1);
await importSubFamilies("ARCHITECTURAL", 0, -1);

Import models and sub-models

import { importModels, importSubModels } from "pim-import";

await importModels("ARCHITECTURAL", 0, -1);
await importSubModels("ARCHITECTURAL", 0, -1);

3. Products

Import by last modification date

Fetches products from the /latest-products endpoint and creates/updates Contentful entries.

Each product creates:

  • topicProduct — product with name, code, status, categories, sub-families, product line, PIM details
  • pageContent — page content wrapper
  • page — public product page with slug

import { importLatestProducts } from "pim-import";

const result = await importLatestProducts(
  "ARCHITECTURAL", // catalog
  "20240101T00:00:00", // lastModified — ISO date used as lower bound
  0, // page (default 0)
  50, // size per page (default 50)
  "20240201T00:00:00", // optional lastModifiedTo upper bound
);
// result: { completed, page, size, totalPages, nextPage }

Import a single product by code

import { importProductByCode } from "pim-import";

const entry = await importProductByCode("F3030031", "ARCHITECTURAL");

Reimport products flagged by audit

import { reimportAuditProducts } from "pim-import";

await reimportAuditProducts();

4. Product relationships

Set relationships (color variants, accessories)

import { setProductsRelationships, setProductRelationships } from "pim-import";

// All products in a catalog
await setProductsRelationships("ARCHITECTURAL");

// Single product
await setProductRelationships("F3030031");

Remove relationships

import {
  removeProductFromColorVariantsByProductLine,
  removeAllProductModelProductRelations,
} from "pim-import";

await removeProductFromColorVariantsByProductLine();
await removeAllProductModelProductRelations();

Populate destination fields

import { populateDestinations } from "pim-import";

await populateDestinations();

5. Audit

Validates product data integrity across Contentful entries.

import { audit } from "pim-import";

const results = await audit("ARCHITECTURAL");
// results: AuditResults — per-product validation status and missing fields

6. Algolia reindex

All reindex functions accept an optional catalog filter. Individual variants accept a code string.

Products

import {
  reindexProducts,
  reindexProduct,
  removeProductObject,
} from "pim-import";

await reindexProducts("ARCHITECTURAL");
await reindexProduct("F3030031");
await removeProductObject("F3030031");

Families and sub-families

import {
  reindexFamilies,
  reindexFamily,
  removeFamilyObject,
  reindexSubFamilies,
  reindexSubFamily,
  removeSubFamilyObject,
} from "pim-import";

await reindexFamilies("ARCHITECTURAL");
await reindexSubFamilies("ARCHITECTURAL");

Models and sub-models

import {
  reindexModels,
  reindexModel,
  removeModelObject,
  reindexSubModels,
  reindexSubModel,
  removeSubModelObject,
} from "pim-import";

await reindexModels();
await reindexSubModels();

Editorial content

import {
  reindexDownloads,
  reindexInspirations,
  reindexProjects,
  reindexStories,
  reindexPressReleases,
  reindexPressReviews,
  reindexPosts,
} from "pim-import";

await reindexDownloads();
await reindexInspirations();
await reindexProjects();
await reindexStories();
await reindexPressReleases();
await reindexPressReviews();
await reindexPosts();

Clone index settings

import { cloneIndexSettings } from "pim-import";

await cloneIndexSettings("products_en", "products_en_staging");

Amazon S3 utilities

import { uploadS3, saveJsonToS3, getFileFromS3, savePDFToS3 } from "pim-import";

// Upload a remote file by URL
const { Location } = await uploadS3(
  "https://example.com/image.jpg",
  "image.jpg",
  "assets/",
);

// Save a JSON object
const s3Path = await saveJsonToS3({ key: "value" }, "data.json", "snapshots/");

// Retrieve a file (returns content as string)
const content = await getFileFromS3("snapshots/data.json");

// Save a PDF buffer
await savePDFToS3(pdfBuffer, "product.pdf", "pdf/");

PDF generation

Generates a technical specification PDF via AWS Lambda and returns a URL.

import { generateTechSpecPdf, generatePDFByUrl } from "pim-import";

// From a product code
const url = await generateTechSpecPdf("F3030031");

// From an arbitrary URL
const buffer = await generatePDFByUrl("https://example.com/page", "output.pdf");

Downloads

Imports download resources from a CSV file.

import { importDownloads } from "pim-import";

await importDownloads("/path/to/downloads.csv");

SEO Import

Imports SEO metadata (meta title and meta description) from a CSV file stored on S3 into Contentful topicSeo entries.

The function parses the slug column to extract the locale and the actual slug value. The locale is the first path segment; the slug is the last segment. For example, de/global/bespoke/ means locale=de and slug=bespoke, so it searches for a page entry with fields.slug.de = "bespoke".
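
The parsing rule above can be sketched as a small helper. This is a hypothetical illustration of the documented behavior, not the library's internal implementation:

```typescript
// Hypothetical sketch of the slug parsing described above:
// locale = first path segment, slug = last non-empty segment.
function parseSlugPath(path: string): { locale: string; slug: string } | null {
  const segments = path.split("/").filter((s) => s.length > 0);
  if (segments.length < 2) return null; // malformed: needs at least locale + slug
  return { locale: segments[0], slug: segments[segments.length - 1] };
}

// parseSlugPath("de/global/bespoke/")
//   → { locale: "de", slug: "bespoke" }
// parseSlugPath("/en-US/us/projects/project-name/")
//   → { locale: "en-US", slug: "project-name" }
```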

CSV format

| Column          | Description                                                                       |
| --------------- | --------------------------------------------------------------------------------- |
| slug            | Localized slug path (e.g., de/global/bespoke/ or en-US/us/projects/project-name/) |
| metaTitle       | Meta title for the detected locale                                                |
| metaDescription | Meta description for the detected locale                                          |

Example CSV

slug,metaTitle,metaDescription
de/global/bespoke/,Bespoke - German,Die Meta-Beschreibung für die Bespoke-Seite
/en-US/us/projects/projects-bodegas-faustino/,Bodegas Faustino,The meta description for Bodegas Faustino

See src/pim/methods/seo-example.csv for a complete example.

Usage

import { importSeoFromCsv } from "pim-import";

const result = await importSeoFromCsv(
  "path/to/seo-data.csv", // S3 path (required)
  0, // offset (default 0)
  50, // limit, -1 = all (default 50)
);
// result: {
//   offset: number,
//   limit: number,
//   completed: boolean,
//   total: number,
//   processed: number,
//   created: number,
//   updated: number,
//   skipped: number,
//   errors: string[]
// }

Behavior

  • Slug parsing: Extracts locale (first segment) and slug (last segment) from the path
  • Slug fallback: If a page does not have the slug translated for the requested locale (i.e., fields.slug.{locale} does not exist), falls back to searching with fields.slug.en
  • Invalid format: Logs a warning and skips rows with malformed slug paths
  • Unknown locale: Logs a warning and skips rows with unsupported locales
  • Page not found: Logs a warning and skips the row
  • topicSeo exists: Updates only metaTitle and metaDescription for the detected locale; other locales and fields remain unchanged
  • topicSeo not found: Creates a new topicSeo entry, populates metaTitle/metaDescription for the detected locale, and associates it to the page's seo field
  • Empty CSV value: Retains the existing value in Contentful (never overwrites with empty)
  • Published automatically: New and updated topicSeo entries are published immediately
  • Errors: Logged but processing continues for other rows
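
The slug-fallback rule can be illustrated with a small helper that builds the lookup queries in order. The names here are illustrative only, not part of the library's API:

```typescript
// Hypothetical illustration of the slug-fallback lookup order: query the
// detected locale's slug field first, then retry with "en" if it differs.
type SlugQuery = Record<string, string>;

function slugQueries(locale: string, slug: string): SlugQuery[] {
  const queries: SlugQuery[] = [{ [`fields.slug.${locale}`]: slug }];
  if (locale !== "en") {
    queries.push({ "fields.slug.en": slug }); // fallback when the locale has no slug
  }
  return queries;
}

// slugQueries("de", "bespoke")
//   → [{ "fields.slug.de": "bespoke" }, { "fields.slug.en": "bespoke" }]
```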

Contentful utilities

import {
  initBaseEntries,
  getEntryByID,
  getEntries,
  getTopicPage,
  getAllProductEntriesByCatalog,
  publishAllProductDrafts,
  deleteEntries,
  deletePages,
  migrateEntryFields,
  checkTopicDraftAndPagePublished,
} from "pim-import";

// Publish all draft products
await publishAllProductDrafts();

// Get all product entries for a catalog
const entries = await getAllProductEntriesByCatalog("ARCHITECTURAL");

// Migrate fields between content types
await migrateEntryFields("topicProduct", { oldField: "newField" });

// Check publish status
const isReady = await checkTopicDraftAndPagePublished("topic-id-123");

Designers

import { importDesigners, importDesigner } from "pim-import";

await importDesigners();
const entry = await importDesigner("designer-code");

Auto-descriptions

import {
  setProductsAutodescription,
  getProductAutodescription,
  setProductAutodescriptionByTopicId,
} from "pim-import";

await setProductsAutodescription();
const text = await getProductAutodescription("F3030031");
await setProductAutodescriptionByTopicId("topic-id-123");

Logging

import {
  log,
  setLogId,
  setLogPath,
  setLogFilename,
  getLogFolder,
} from "pim-import";

setLogPath("/var/log/pim-import");
setLogFilename("import-run");
setLogId("session-001");

log("Import started", "INFO");
// Levels: VERBOSE | DEBUG | INFO | WARN | ERROR | SILLY | HTTP

Notifications & CI

import { notify, netlifyBuild } from "pim-import";

// Send a message to a Microsoft Teams channel (via FPI_TEAMS_NOTIFICATIONS_URL)
await notify("Import completed successfully", true);

// Trigger a Netlify build
await netlifyBuild("site-id", "build-hook-id");

Available catalogs

type AvailableCatalogs = "ARCHITECTURAL" | "OUTDOOR" | "DECORATIVE";

License

MIT — atoms.studio