@aigne/example-memory
v0.10.69
A demonstration of using AIGNE Framework with memory.
Memory Example
This example demonstrates how to create a chatbot with memory capabilities using the AIGNE Framework and AIGNE CLI. The example utilizes the AIGNE File System (AFS) with memory modules to provide persistence across chat sessions.
AIGNE File System (AFS) is a virtual file system abstraction that provides AI agents with unified access to various storage backends, including conversation history and user profiles. For comprehensive documentation, see AFS Documentation.
Prerequisites
- Node.js (>=20.0) and npm installed on your machine
- An OpenAI API key for interacting with OpenAI's services
- pnpm and git, if running the example from source code (see Installation below)
Quick Start (No Installation Required)
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key
# Run the chatbot with memory
npx -y @aigne/example-memory --input 'I like blue color'
npx -y @aigne/example-memory --input 'What is my favorite color?'
# Run in interactive chat mode
npx -y @aigne/example-memory --chat

Installation
Clone the Repository
git clone https://github.com/AIGNE-io/aigne-framework

Install Dependencies
cd aigne-framework/examples/memory
pnpm install

Setup Environment Variables
Set up your OpenAI API key in the .env.local file:
OPENAI_API_KEY="" # Set your OpenAI API key here

Using Different Models
You can use different AI models by setting the MODEL environment variable along with the corresponding API key. The framework supports multiple providers:
- OpenAI: MODEL="openai:gpt-4.1" with OPENAI_API_KEY
- Anthropic: MODEL="anthropic:claude-3-7-sonnet-latest" with ANTHROPIC_API_KEY
- Google Gemini: MODEL="gemini:gemini-2.0-flash" with GEMINI_API_KEY
- AWS Bedrock: MODEL="bedrock:us.amazon.nova-premier-v1:0" with AWS credentials
- DeepSeek: MODEL="deepseek:deepseek-chat" with DEEPSEEK_API_KEY
- OpenRouter: MODEL="openrouter:openai/gpt-4o" with OPEN_ROUTER_API_KEY
- xAI: MODEL="xai:grok-2-latest" with XAI_API_KEY
- Ollama: MODEL="ollama:llama3.2" with OLLAMA_DEFAULT_BASE_URL
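For example, switching this example from OpenAI to Anthropic only requires setting the two variables named in the list above (the key value below is a placeholder):

```shell
# Select the provider and model, and supply the matching API key.
export MODEL="anthropic:claude-3-7-sonnet-latest"
export ANTHROPIC_API_KEY=YOUR_ANTHROPIC_API_KEY

# Then run the example exactly as before:
# npx -y @aigne/example-memory --input 'I like blue color'
```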
For detailed configuration examples, please refer to the .env.local.example file in this directory.
Run the Example
pnpm start

How Memory Works
The memory functionality in this example is implemented with the history and UserProfileMemory modules, which are part of the AIGNE Framework's AIGNE File System (AFS). Here is a brief overview of how memory is recorded and retrieved during conversations.
Recording Conversations
- When the user sends a message and receives a response from the AI model, the conversation pair (user message and AI response) is stored in the history module of the AFS.
- The UserProfileMemory module extracts relevant user profile information by analyzing the conversation history and stores it in the user_profile module of the AFS.
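The two recording steps above can be sketched as plain data flow. Note that this is a minimal illustrative sketch, not the AIGNE API: only the module names ("history", "user_profile") come from the text, and the real UserProfileMemory uses an LLM to extract profile facts rather than the keyword check shown here.

```typescript
// Message shape used by this sketch (illustrative, not the AIGNE types).
type Message = { role: "user" | "agent"; text: string };

// Stand-in for the AFS "history" module: stores conversation pairs.
class HistoryModule {
  entries: Message[] = [];
  record(userMessage: string, aiResponse: string) {
    // Store the conversation pair: user message and AI response.
    this.entries.push({ role: "user", text: userMessage });
    this.entries.push({ role: "agent", text: aiResponse });
  }
}

// Stand-in for the "user_profile" module: holds extracted profile facts.
class UserProfileModule {
  facts: string[] = [];
  extractFrom(history: HistoryModule) {
    // The real module analyzes the history with an LLM; this sketch
    // just keeps user statements that look like stated preferences.
    for (const m of history.entries) {
      if (m.role === "user" && m.text.includes("like")) this.facts.push(m.text);
    }
  }
}

const history = new HistoryModule();
const profile = new UserProfileModule();
history.record("I'm Bob and I like blue color", "Nice to meet you, Bob!");
profile.extractFrom(history);
console.log(profile.facts); // the user's preference is captured as a profile fact
```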
Retrieving Conversations
- Load the user profile memory and inject it into the system prompt so the agent knows about the user's profile:
You are a friendly chatbot
<related-memories>
- |
  name:
    - name: Bob
  interests:
    - content: likes blue color
</related-memories>

- Inject the conversation history into the chat messages:
[
{
"role": "system",
"content": "You are a friendly chatbot ..." // UserProfileMemory injected here
},
// Followed by nearby conversation history
{
"role": "user",
"content": [
{
"type": "text",
"text": "I'm Bob and I like blue color"
}
]
},
{
"role": "agent",
"content": [
{
"type": "text",
"text": "Nice to meet you, Bob! Blue is a great color.\n\nHow can I help you today?"
}
]
},
// Here is the last user message
{
"role": "user",
"content": [
{
"type": "text",
"text": "What is my favorite color?"
}
]
}
]

- The AI model processes the injected messages and generates a response based on the user's profile and conversation history.
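The assembly of that request can be sketched as a small helper following the JSON shape above. buildMessages is a hypothetical name for illustration only; the AIGNE Framework performs this injection internally.

```typescript
// Content and message shapes mirroring the JSON example above.
type ContentPart = { type: "text"; text: string };
type ChatMessage = { role: "system" | "user" | "agent"; content: string | ContentPart[] };

// Illustrative helper (not an AIGNE API): combine the system prompt,
// the rendered user-profile memories, the nearby conversation history,
// and the last user message into one message list.
function buildMessages(
  systemPrompt: string,
  relatedMemories: string, // rendered output of UserProfileMemory
  history: ChatMessage[], // nearby entries from the history module
  userInput: string, // the last user message
): ChatMessage[] {
  const system = `${systemPrompt}\n<related-memories>\n${relatedMemories}\n</related-memories>`;
  return [
    { role: "system", content: system },
    ...history,
    { role: "user", content: [{ type: "text", text: userInput }] },
  ];
}

const messages = buildMessages(
  "You are a friendly chatbot",
  "- name: Bob\n- likes blue color",
  [], // empty history for brevity
  "What is my favorite color?",
);
console.log(messages.length); // 2: system prompt plus the new user message
```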
AI Response:
You mentioned earlier that you like the color blue