@verida/personalagentkit-langchain
v0.1.0
LangChain Toolkit extension of Verida PersonalAgentKit
Verida PersonalAgentKit
PersonalAgentKit is a framework for easily enabling AI agents to access private user data. It is designed to be framework-agnostic.
Table of Contents
Getting Started
Prerequisites:
Installation
yarn add @verida/personalagentkit

Usage
Create a PersonalAgentKit instance.
Configuration:
- Set the VERIDA_API_KEY environment variable [REQUIRED]
- Set the VERIDA_API_ENDPOINT environment variable [OPTIONAL]
import { PersonalAgentKit, PersonalAgentKitOptions } from "@verida/personalagentkit";
// Initialize AgentKit
const personalAgentkit = await PersonalAgentKit.from(<PersonalAgentKitOptions>{
veridaApiKey: process.env.VERIDA_API_KEY,
veridaApiEndpoint: process.env.VERIDA_API_ENDPOINT || undefined,
});

Use the agent's actions with a framework extension. For example, using LangChain + OpenAI.
Configuration:
- Set the OPENAI_API_KEY environment variable to your LLM API key [REQUIRED]
- Set the OPENAI_MODEL environment variable [OPTIONAL]
- Set the OPENAI_BASE_URL environment variable [OPTIONAL]
yarn add @langchain/core @langchain/langgraph @langchain/openai

import { ChatOpenAI } from "@langchain/openai";
import { MemorySaver } from "@langchain/langgraph";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
// Initialize LLM
const llm = new ChatOpenAI({
model: process.env.OPENAI_MODEL ? process.env.OPENAI_MODEL : undefined,
apiKey: process.env.OPENAI_API_KEY,
configuration: {
baseURL: process.env.OPENAI_BASE_URL ? process.env.OPENAI_BASE_URL : undefined,
},
});
// Store buffered conversation history in memory
const memory = new MemorySaver();
const agentConfig = {
configurable: { thread_id: "PersonalAgentKit Chatbot Example" },
};
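// The `tools` passed to createReactAgent below come from the LangChain
// extension of PersonalAgentKit (@verida/personalagentkit-langchain, the
// package documented on this page). The lines below are an assumed usage
// sketch: the helper name `getLangChainTools` and its signature may differ,
// so check the package exports.
import { getLangChainTools } from "@verida/personalagentkit-langchain";

// Wrap the PersonalAgentKit actions as LangChain-compatible tools
const tools = await getLangChainTools(personalAgentkit);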
// Create React Agent using the LLM and Verida PersonalAgentKit tools
const agent = createReactAgent({
llm,
tools,
checkpointSaver: memory,
messageModifier: `
You are a helpful agent that has access to the user's data via the Verida PersonalAgentKit. You are empowered to query user data to provide personalized responses and learn more about the user. If someone asks you to do something you can't do with your currently available tools, you must say so. Be concise and helpful with your responses. Refrain from restating your tools' descriptions unless it is explicitly requested.
`,
});

Make an LLM request with access to user data.
import { HumanMessage } from "@langchain/core/messages";
const userInput = "Summarize and prioritize my last 24 hours of messages";
const stream = await agent.stream({ messages: [new HumanMessage(userInput)] }, agentConfig);
for await (const chunk of stream) {
if ("agent" in chunk) {
console.log(chunk.agent.messages[0].content);
}
}
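The react agent also streams the result of each tool invocation as "tools" chunks alongside the "agent" chunks. A sketch of a variant of the loop above that logs those as well, assuming the default LangGraph chunk shape:

for await (const chunk of stream) {
  if ("agent" in chunk) {
    // Messages generated by the LLM
    console.log(chunk.agent.messages[0].content);
  } else if ("tools" in chunk) {
    // Output returned by the PersonalAgentKit tools
    console.log(chunk.tools.messages[0].content);
  }
}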