langchain-memory-util
v1.0.0
# LangChain Memory Utility
A reusable LangChain memory utility with support for local file and S3 storage, session-based memory management, and automatic conversation history tracking.
## Features
- Flexible Storage: Store memory in local files or AWS S3
- Session-based Memory: Maintain separate memory for each session
- LangChain Integration: Compatible with LangChain's memory interface
- Automatic History: Automatically store and retrieve conversation history
- Easy to Use: Simple API for creating and managing memory
## Installation

```bash
npm install langchain-memory-util
```

## Quick Start
### Local File Storage

```js
const { MemoryStore, SessionMemory } = require('langchain-memory-util');

// Create a local file-based memory store
const store = new MemoryStore({
  type: 'local',
  basePath: './memory'
});

// Create session memory
const sessionId = 'user123';
const memory = new SessionMemory({ sessionId, store });

// Save conversation context
await memory.saveContext('Hello!', 'Hi there!');
await memory.saveContext('How are you?', 'I am doing well, thank you!');

// Retrieve history
const history = await memory.getHistory();
console.log(history);
```

### S3 Storage
```js
const { MemoryStore, SessionMemory } = require('langchain-memory-util');

// Create an S3-based memory store
const store = new MemoryStore({
  type: 's3',
  bucket: 'my-memory-bucket',
  s3Config: {
    region: 'us-east-1',
    credentials: {
      accessKeyId: 'YOUR_ACCESS_KEY',
      secretAccessKey: 'YOUR_SECRET_KEY'
    }
  }
});

const memory = new SessionMemory({ sessionId: 'user123', store });
```

Note: in production, load AWS credentials from the environment or an IAM role rather than hardcoding them.

### With LangChain LLM
```js
const { MemoryStore, SessionMemory } = require('langchain-memory-util');
const { ChatOpenAI } = require('@langchain/openai');
const { ConversationChain } = require('langchain/chains');

// Set up memory
const store = new MemoryStore({ type: 'local', basePath: './memory' });
const memory = new SessionMemory({ sessionId: 'chat1', store });

// Create an LLM with memory
const llm = new ChatOpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY,
  temperature: 0.7
});
const chain = new ConversationChain({ llm, memory });

// Conversations are stored automatically
const response = await chain.call({ input: 'Hello, how are you?' });
console.log(response.response);

// Retrieve the full conversation history
const history = await memory.getHistory();
console.log('Full conversation:', history);
```

## API Reference
### MemoryStore
#### Constructor

```js
new MemoryStore(config)
```

Config options:
- `type` (string): `'local'` or `'s3'` (default: `'local'`)
- `basePath` (string): directory path for local storage (default: `'./memory'`)
- `bucket` (string): S3 bucket name (required for the S3 type)
- `s3Config` (object): AWS S3 configuration object (required for the S3 type)
#### Methods

- `save(sessionId, data)`: save data for a session
- `load(sessionId)`: load data for a session
### SessionMemory
#### Constructor

```js
new SessionMemory({ sessionId, store })
```

Parameters:
- `sessionId` (string): unique identifier for the session
- `store` (MemoryStore): memory store instance
#### Methods

- `saveContext(input, output)`: save a conversation turn
- `loadMemoryVariables()`: load memory variables (LangChain compatible)
- `getHistory()`: retrieve the full conversation history
- `init()`: initialize memory (called automatically)
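To illustrate what "LangChain compatible" means for `loadMemoryVariables()`, here is a hedged sketch of how stored turns could be formatted into the `history` string that a `ConversationChain` prompt injects. The turn shape and the `history` key are assumptions for this example; the library's internal representation may differ.

```javascript
// Hypothetical: conversation turns as a SessionMemory might store them.
const turns = [
  { input: 'Hello!', output: 'Hi there!' },
  { input: 'How are you?', output: 'I am doing well, thank you!' }
];

// LangChain's memory contract: loadMemoryVariables() resolves to an object
// whose keys (commonly `history`) are substituted into the prompt template.
async function loadMemoryVariables() {
  const history = turns
    .map((t) => `Human: ${t.input}\nAI: ${t.output}`)
    .join('\n');
  return { history };
}

loadMemoryVariables().then((vars) => console.log(vars.history));
```

Returning a plain `{ history }` object is what lets a memory implementation drop into a `ConversationChain` without the chain knowing where the turns are stored.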
## Examples

Run the included examples:

```bash
# Local storage example
npm test

# LLM integration example
npm run example
```

## License
MIT
