
crash-ai

v1.0.8 · Published by Ai By Fauzi · Downloads: 723

Crash-AI: Agent RAG

Crash-AI is a library for building RAG (Retrieval-Augmented Generation) bot agents. It simplifies the integration of OpenAI, PostgreSQL with pgvector, LangChain, and website scraping for automated bot workflows.


Features

  • Web Ingestor: Scrape data from business websites directly into the vector database.
  • Document Ingestor: Extract data from business documents directly into the vector database.
  • Smart RAG: Retrieve relevant information from a knowledge database.
  • Multi-Agent Routing: Intelligently route user queries to the appropriate agent.
  • Metadata Mapping: Automatically map product images based on keywords in the content.
  • Model Context Protocol: Create your own MCP (Model Context Protocol) client that can connect to multiple servers exposing tools for execution.
  • Agent: Create your own agent that can execute MCP tools independently.
  • History Cache: Automatically manage conversation history using Redis.

Diagram

Bot Workflow

Install

Requires Node.js v22 or later.

npm i crash-ai

Embedding Web Content

import "dotenv/config";
import { ingestWeb } from "crash-ai";
import pg from "../Config/PgDB.js";

const ApiKey = process.env.OpenKey;
// Master mapping of image labels and keywords
const { rows: masters } = await pg.query(
  "SELECT label, keywords FROM cs_rag.agent_image",
);
await ingestWeb(pg, ApiKey, masters, {
  TagSelector: "p, h1, h2, h3, h4", // CSS selector for the HTML elements to extract
  Url: "https://", // URL to be ingested
  ChunkSize: 500, // Chunk size in characters
  chunkOverlap: 100, // Overlap between consecutive chunks
  tableName: "cs_rag.agent_knowledge", // Destination table for the embeddings
});
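The TagSelector option decides which HTML elements feed the embedding pipeline. As a rough illustration of what that selection means conceptually (this is not the library's scraper, which presumably uses a real HTML parser; `extractByTags` is a name I made up, and the regex approach is only for demonstration):

```javascript
// Illustrative only: pull the text content of selected tags from an HTML string.
function extractByTags(html, tags) {
  // Match <tag ...>content</tag> for any of the listed tags (naive, demo-only).
  const pattern = new RegExp(`<(${tags.join("|")})[^>]*>([\\s\\S]*?)</\\1>`, "gi");
  const texts = [];
  let match;
  while ((match = pattern.exec(html)) !== null) {
    texts.push(match[2].trim());
  }
  return texts;
}

const html = "<h1>Hotel Sunrise</h1><p>Check-in starts at 2 PM.</p><div>ignored</div>";
console.log(extractByTags(html, ["p", "h1", "h2"]));
// → [ 'Hotel Sunrise', 'Check-in starts at 2 PM.' ]
```

Only the matched elements' text would then be chunked and embedded; everything else on the page is ignored.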

Embedding Documents

import "dotenv/config";
import { ingestDoc } from "crash-ai";
import pg from "../Config/PgDB.js";

const ApiKey = process.env.OpenKey;
// Master mapping of image labels and keywords
const { rows: masters } = await pg.query(
  "SELECT label, keywords FROM cs_rag.agent_image",
);
await ingestDoc(pg, ApiKey, masters, {
  separators: ["***"], // Separators to split on; alternatives: "###", "\n\n", "\r\n", "\n", " "
  File: "doc.pdf / .txt", // Document to be ingested (.pdf or .txt)
  ChunkSize: 500, // Chunk size in characters
  chunkOverlap: 100, // Overlap between consecutive chunks
  tableName: "cs_rag.agent_knowledge", // Destination table for the embeddings
});

Create Agent

import { CrashAI } from "crash-ai";
export class SupportAgent extends CrashAI.CreateAgent {
  async handle(question, history) {
    const { Sequence, Passthrough } = CrashAI.Runnables;
    const { StringParser, PromptTemplateAi } = CrashAI.Utils;
    const prompt = PromptTemplateAi.fromTemplate(`
CONVERSATION HISTORY:
{chat_history}

CONTEXT (Hotel Internal Data):
{context}

IMAGE DATA:
Image Key: {img_key}

QUESTION: {question}

ANSWER RULES:

ANSWER:`);

    const retrieverInstance = await this.retriever;
    const docs = await retrieverInstance.invoke(question);
    const keywords = ["foto", "picture", "see", "type"]; // Keywords that trigger sending an image
    const isAskingImage = keywords.some((word) =>
      question.toLowerCase().includes(word),
    );
    const docWithImage = docs.find((d) => d.metadata && d.metadata.img_key);
    const detectedImgKey = isAskingImage
      ? docWithImage?.metadata?.img_key || null
      : null;
    const contextText = docs.map((d) => d.pageContent).join("\n\n");

    const chatChain = Sequence.from([
      {
        context: () => contextText,
        question: new Passthrough(),
        chat_history: () => history,
        img_key: () => detectedImgKey,
      },
      prompt,
      this.model,
      new StringParser(),
    ]);
    const response = await chatChain.invoke(question);
    const imageMatch = response.match(/\[SHOW_IMAGE:(.*?)\]/);
    const finalAnswer = response.replace(/\[SHOW_IMAGE:.*?\]/, "").trim();
    // Standard return shape for an agent class
    return {
      answer: finalAnswer,
      imgKey: imageMatch ? imageMatch[1] : detectedImgKey || null,
    };
  }
}
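The handle method above combines two small pieces of string logic: keyword-based detection of image intent, and extraction of an optional [SHOW_IMAGE:...] marker from the model's response. Isolated here for clarity (the helper names are mine, not part of the library's API):

```javascript
// Hypothetical helpers mirroring the string logic inside SupportAgent.handle.

// True when the question contains any of the image-trigger keywords.
function isAskingForImage(question, keywords = ["foto", "picture", "see", "type"]) {
  const q = question.toLowerCase();
  return keywords.some((word) => q.includes(word));
}

// Strip an optional [SHOW_IMAGE:key] marker from the response and return both parts.
function extractImageMarker(response) {
  const match = response.match(/\[SHOW_IMAGE:(.*?)\]/);
  return {
    answer: response.replace(/\[SHOW_IMAGE:.*?\]/, "").trim(),
    imgKey: match ? match[1] : null,
  };
}

console.log(isAskingForImage("Can I see a picture of the deluxe room?")); // → true
console.log(extractImageMarker("Here is the room. [SHOW_IMAGE:deluxe_room]"));
// → { answer: 'Here is the room.', imgKey: 'deluxe_room' }
```

If the model never emits a marker, the agent falls back to the img_key found in the retrieved documents' metadata, as the class above shows.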

Create a Model Context Protocol and Agents

See https://docs.langchain.com/oss/javascript/langchain/mcp for more detailed configuration information.

import { CrashAI, CreatedProtocol, CreatedAgent } from "crash-ai";
const ProtocolMysql = await new CreatedProtocol(
  "mysql_server", // Server name
  "npx", // Command to execute
  [
    "-y",
    "@berthojoris/mcp-mysql-server",
    "mysql://user:[email protected]:3307/databasename",
    "list,read,utility,create,update,ddl",
  ],
  ["list_tables", "describe_table", "execute_query", "query"],
).use();

const AgentTransaction = new CreatedAgent(
  "Agent Booking", // Agent name
  "gpt-3.5-turbo", // Model
  ProtocolMysql, // Tools
  OpenKey, // API key
);

const ResponseAgent = await AgentTransaction.run(
  question,
  history,
  "intent",
  `
  YOUR AGENT PROMPT
  `,
);

console.log("Transaction Agent Full Response:", ResponseAgent);

console.log(
  "Transaction Agent Response:",
  ResponseAgent.messages[ResponseAgent.messages.length - 1].content,
);

// Standard return shape for an agent class (use this inside an agent's handle method)
return {
  answer: ResponseAgent.messages[ResponseAgent.messages.length - 1].content,
  imgKey: null,
};
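Since ResponseAgent.messages is an array of message objects, getting the final reply is just last-element access. A tiny helper (the name is mine, and the `{ messages: [...] }` shape is taken from the example above) that also guards against an empty or missing list:

```javascript
// Hypothetical helper: content of the last message in an agent response object.
function lastMessageContent(response) {
  const messages = (response && response.messages) || [];
  if (messages.length === 0) return null; // no messages yet
  return messages[messages.length - 1].content;
}

console.log(lastMessageContent({ messages: [{ content: "Hi" }, { content: "Booked!" }] }));
// → "Booked!"
console.log(lastMessageContent({ messages: [] })); // → null
```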

Example

import { CrashAI, HistoryStore } from "crash-ai";
import { SupportAgent } from "../Utils/SupportAgent.js";
import { ReservationAgent } from "../Utils/ReservationAgent.js";

// Initialize the Redis-backed HistoryStore
await HistoryStore.initConnection("redis://127.0.0.1:6379");

const BotAi = new CrashAI({
  ModelName: "gpt-4o-mini",
  ApiKey: process.env.OpenKey,
  Temperature: 0,
  tablename: "cs_rag.agent_knowledge",
  pool: "Pool Database", // PostgreSQL connection pool instance
  intentData: [
    "transaction",
    "information",
    "complaint",
    "cancellation",
    "question",
    "other",
  ],
  agentMapping: {
    transaction: ReservationAgent,
    information: SupportAgent,
    question: SupportAgent,
    other: SupportAgent,
  },
});

const historyStore = new HistoryStore(id_redis, "Jhondoe");
let convHistory = await historyStore.getMessages();
const history = convHistory.slice(-6);
console.log("Conversation History:", convHistory);
const question = "Where is the location?";
const { answer, imgKey } = await BotAi.GenerateAnswer(question, history);
if (answer) {
  await HistoryStore.addMessage(id_redis, "Jhondoe", question, "User");
  await HistoryStore.addMessage(id_redis, "Jhondoe", answer, "Ai");
  console.log(answer); // Answer
  if (imgKey) {
    const result = await getImageBylabel(imgKey); // Look up the image by label in cs_rag.agent_image
    if (result) {
      console.log(result.image_url); // Image URL
    }
  }
}

Database Setup

This library requires PostgreSQL with the pgvector extension. Run the following SQL commands before starting:


CREATE EXTENSION IF NOT EXISTS vector;


CREATE SCHEMA IF NOT EXISTS cs_rag;


CREATE TABLE cs_rag.agent_knowledge (
    id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
    content TEXT,
    metadata jsonb,
    embedding vector(1536)
);


CREATE TABLE cs_rag.agent_image (
    id SERIAL PRIMARY KEY,
    label VARCHAR(255) UNIQUE,
    keywords TEXT[],
    image_url TEXT
);

Additional Notes

Important: You don't strictly need to learn how LangChain works internally, since this tool wraps it for you, but a basic understanding of the LangChain lifecycle helps. The library uses Redis for conversation history and PostgreSQL (pgvector) as the vector store, so familiarity with basic Redis and PostgreSQL setup is recommended. Thank you.