langchain-gigachat
This library provides a GigaChat integration for LangChain.js.
Installation
npm install --save langchain-gigachat

Quickstart
Follow these simple steps to get up and running quickly.
Installation
To install the package, use the following command:
npm install --save langchain-gigachat

Initialization
To initialize the chat model:
import { GigaChat } from "langchain-gigachat";
import { Agent } from 'node:https';

// Disable TLS certificate verification for the GigaChat API endpoint.
const httpsAgent = new Agent({
  rejectUnauthorized: false,
});

const giga = new GigaChat({
  credentials: 'YOUR_AUTHORIZATION_KEY',
  model: 'GigaChat-Max',
  httpsAgent,
});
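In a real application you will typically not hard-code the authorization key. A minimal sketch that reads it from an environment variable instead (the GIGACHAT_CREDENTIALS variable name is just a convention chosen for this example, not something the library requires):

import { GigaChat } from "langchain-gigachat";
import { Agent } from 'node:https';

// Assumption: you export GIGACHAT_CREDENTIALS yourself before starting the process.
const credentials = process.env.GIGACHAT_CREDENTIALS;
if (!credentials) {
  throw new Error("GIGACHAT_CREDENTIALS is not set");
}

const giga = new GigaChat({
  credentials,
  model: 'GigaChat-Max',
  httpsAgent: new Agent({ rejectUnauthorized: false }),
});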
Usage

Use the GigaChat object to generate responses:
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const messages = [
  new SystemMessage("Translate the following messages to Portuguese"),
  new HumanMessage("Hello, world!"),
];

const resp = await giga.invoke(messages);
console.log(resp.content);
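Chat models in LangChain.js also expose the standard stream method. Assuming the GigaChat integration supports token streaming for the model you use, a minimal sketch looks like this:

// Print the answer chunk by chunk as it is generated.
const stream = await giga.stream(messages);
for await (const chunk of stream) {
  process.stdout.write(String(chunk.content));
}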
Use the GigaChatEmbeddings object to create embeddings:

import { GigaChatEmbeddings } from "langchain-gigachat";
import { Agent } from 'node:https';

// Disable TLS certificate verification for the GigaChat API endpoint.
const httpsAgent = new Agent({
  rejectUnauthorized: false,
});

async function main() {
  const embeddings = new GigaChatEmbeddings({
    credentials: 'YOUR_AUTHORIZATION_KEY',
    httpsAgent,
  });
  // Embed a list of documents and print the resulting vectors.
  console.log(await embeddings.embedDocuments(["Словасловаслова"]));
}
main();

Now you can use the GigaChat object with LangChain.js's standard primitives to build LLM applications.
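For example, the chat model composes with prompt templates and output parsers from @langchain/core into a runnable chain. A minimal sketch, reusing the giga instance created above:

import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

// Prompt -> GigaChat -> plain string.
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "Translate the following message to {language}"],
  ["human", "{text}"],
]);
const chain = prompt.pipe(giga).pipe(new StringOutputParser());

const answer = await chain.invoke({ language: "Portuguese", text: "Hello, world!" });
console.log(answer);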
