Memory Example
This example demonstrates how to create a chatbot with long-term memory using the AIGNE Framework. The chatbot can remember user information across conversations and recall details shared in previous interactions.
The example uses two powerful AFS (Agentic File System) modules:
- AFSHistory: Automatically records all conversations for context
- UserProfileMemory: Intelligently extracts and stores user information (name, interests, location, etc.)
Agentic File System (AFS) is a virtual file system abstraction that provides AI agents with unified access to various storage backends. For comprehensive documentation, see AFS Documentation.
Prerequisites
- Node.js (>=20.0) and npm installed on your machine
- An OpenAI API key for interacting with OpenAI's services
- Optional dependencies (if running the example from source code):
Quick Start (No Installation Required)
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key
# First conversation - introduce yourself
npx -y @aigne/example-afs-memory --input "I'm Bob, and I like blue color"
# Response: Nice to meet you, Bob — I've saved that your favorite color is blue...
# Second conversation - the bot remembers you!
npx -y @aigne/example-afs-memory --input "Tell me all info about me you known"
# Response: Here's what I currently have stored about you:
# * Name: Bob
# * Interests / favorite color: blue
# Run in interactive chat mode
npx -y @aigne/example-afs-memory --chat
Connect to an AI Model
For example, running npx -y @aigne/example-afs-memory --input "I'm Bob, and I like blue color" requires an AI model. If this is your first run, you'll be prompted to connect one.

- Connect via the official AIGNE Hub
Choose the first option and your browser will open the official AIGNE Hub page. Follow the prompts to complete the connection. If you're a new user, the system automatically grants 400,000 tokens for you to use.

- Connect via a self-hosted AIGNE Hub
Choose the second option, enter the URL of your self-hosted AIGNE Hub, and follow the prompts to complete the connection. If you need to set up your own instance, install and deploy it from the Blocklet Store.

- Connect via a third-party model provider
Using OpenAI as an example, you can configure the provider's API key via an environment variable:
export OPENAI_API_KEY="" # Set your OpenAI API key here
For more details on third-party model configuration (e.g., OpenAI, DeepSeek, Google Gemini), see .env.local.example.
After configuration, run the example again.
Debugging
The aigne observe command starts a local web server to monitor and analyze agent execution data. It provides a user-friendly interface to inspect traces, view detailed call information, and understand your agent’s behavior during runtime. This tool is essential for debugging, performance tuning, and gaining insight into how your agent processes information and interacts with tools and models.
Start the observation server.
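For example, from the example directory (this assumes the aigne CLI is available on your PATH):

```bash
aigne observe
```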

View a list of recent executions.

Installation
Clone the Repository
git clone https://github.com/AIGNE-io/aigne-framework
Install Dependencies
cd aigne-framework/examples/afs-memory
pnpm install
Setup Environment Variables
Set up your OpenAI API key in the .env.local file:
OPENAI_API_KEY="" # Set your OpenAI API key here
Using Different Models
You can use different AI models by setting the MODEL environment variable along with the corresponding API key. The framework supports multiple providers:
- OpenAI: MODEL="openai:gpt-4.1" with OPENAI_API_KEY
- Anthropic: MODEL="anthropic:claude-3-7-sonnet-latest" with ANTHROPIC_API_KEY
- Google Gemini: MODEL="gemini:gemini-2.0-flash" with GEMINI_API_KEY
- AWS Bedrock: MODEL="bedrock:us.amazon.nova-premier-v1:0" with AWS credentials
- DeepSeek: MODEL="deepseek:deepseek-chat" with DEEPSEEK_API_KEY
- OpenRouter: MODEL="openrouter:openai/gpt-4o" with OPEN_ROUTER_API_KEY
- xAI: MODEL="xai:grok-2-latest" with XAI_API_KEY
- Ollama: MODEL="ollama:llama3.2" with OLLAMA_DEFAULT_BASE_URL
For detailed configuration examples, please refer to the .env.local.example file in this directory.
Run the Example
# Run in interactive mode
pnpm start
# Run with a single message
pnpm start --input "I'm Bob, and I like blue color"
Example Conversation Flow
# First conversation - introduce yourself
$ pnpm start --input "I'm Bob, and I like blue color"
Response: Nice to meet you, Bob — I've saved that your favorite color is blue.
If you'd like, I can remember more about you (location, hobbies, projects, etc.)...
# Second conversation - the bot remembers you!
$ pnpm start --input "Tell me all info about me you known"
Response: Here's what I currently have stored about you:
* Name: Bob
* Interests / favorite color: blue
Would you like to add or update anything (location, hobbies, projects, family, etc.)?
How Memory Works
This example uses two complementary memory modules that work together to provide intelligent, personalized conversations:
1. AFSHistory Module - Conversation Context
Purpose: Records every conversation turn to provide recent context.
How it works:
- Automatically saves each user message and AI response pair
- Stores conversations with timestamps and unique IDs
- Enables the AI to reference recent conversations
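As a rough sketch of the points above, a saved turn can be pictured as a record like the following; the field names are hypothetical, not the actual AFSHistory schema:

```typescript
// Hypothetical shape of one saved conversation turn (illustrative only;
// the real schema stored by AFSHistory may differ).
interface HistoryEntry {
  id: string;        // unique ID for this turn
  createdAt: string; // ISO timestamp
  input: { role: "user"; content: string };
  output: { role: "assistant"; content: string };
}

const turn: HistoryEntry = {
  id: "turn-0001",
  createdAt: new Date().toISOString(),
  input: { role: "user", content: "I'm Bob, and I like blue color" },
  output: { role: "assistant", content: "Nice to meet you, Bob..." },
};
```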
Example:
# First conversation
$ pnpm start --input "I'm Bob, and I like blue color"
Response: Nice to meet you, Bob — I've saved that your favorite color is blue...
# The conversation is automatically saved to history.sqlite3
2. UserProfileMemory Module - User Information Extraction
Purpose: Intelligently extracts and stores structured user information from conversations.
How it works:
- Listens to conversation history events
- Analyzes each conversation using AI to identify user information
- Extracts relevant details (name, interests, location, family, projects, etc.)
- Stores in a structured JSON profile in user_profile.sqlite3
- Updates incrementally using JSON Patch operations
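The JSON Patch detail in the last point can be illustrated with a small, self-contained sketch; the RFC 6902-style "add" operations below are only an assumption about what the module's incremental updates look like:

```typescript
// Illustrative only: the actual patch operations UserProfileMemory emits may differ.
type Profile = Record<string, unknown[]>;

interface AddOp {
  op: "add";
  path: string;   // e.g. "/interests/-" appends to the "interests" array
  value: unknown;
}

// Minimal applier for "add" operations of the form "/<key>/-"
function applyAddOps(profile: Profile, ops: AddOp[]): Profile {
  const next: Profile = structuredClone(profile);
  for (const { path, value } of ops) {
    const key = path.split("/")[1];
    next[key] = [...(next[key] ?? []), value];
  }
  return next;
}

// Profile after Bob's first conversation
const profile: Profile = {
  name: [{ name: "Bob" }],
  interests: [{ content: "blue color" }],
};

// A later conversation mentions a new hobby; only the delta is stored
const updated = applyAddOps(profile, [
  { op: "add", path: "/interests/-", value: { content: "hiking" } },
]);

console.log(updated.interests);
// [ { content: 'blue color' }, { content: 'hiking' } ]
```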
What it remembers:
- Name and personal details
- Location (country, city, address)
- Interests and hobbies
- Family members and relationships
- Projects and work
- Languages spoken
- Birthday and other personal info
Example:
# After Bob introduces himself, the profile is automatically created:
{
  "name": [{ "name": "Bob" }],
  "interests": [{ "content": "blue color" }]
}
# In the next conversation, the bot can recall this information:
$ pnpm start --input "Tell me all info about me you known"
Response: Here's what I currently have stored about you:
* Name: Bob
* Interests / favorite color: blue
3. Memory Injection - How the AI Uses Memory
When you send a message, the system automatically:
Step 1: Inject User Profile into System Prompt
You are a friendly chatbot
<related-memories>
User Profile Memory: This contains structured information about the user...
- name:
    - name: Bob
  interests:
    - content: blue color
</related-memories>
Step 2: Add Recent Conversation History
[
  {
    "role": "system",
    "content": "You are a friendly chatbot... [profile injected here]"
  },
  {
    "role": "user",
    "content": "I'm Bob and I like blue color"
  },
  {
    "role": "assistant",
    "content": "Nice to meet you, Bob..."
  },
  {
    "role": "user",
    "content": "Tell me all info about me you known"
  }
]
Step 3: AI Generates Personalized Response
The AI can now:
- Address the user by name
- Reference their interests
- Provide personalized recommendations
- Maintain context across sessions
Key Design Benefits
- Automatic: No manual profile management needed
- Intelligent: AI determines what's important to remember
- Incremental: Profile updates gradually over time
- Persistent: Memory survives across multiple conversations
- Structured: Information is organized in a consistent format
- Privacy-Aware: All data stored locally in SQLite
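Because everything is stored in local SQLite files, you can inspect the raw data yourself; the commands below assume the databases were created in the example's working directory and that the sqlite3 CLI is installed:

```bash
sqlite3 history.sqlite3 ".tables"
sqlite3 user_profile.sqlite3 ".tables"
```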
Related Examples
- AFS LocalFS Example - File system access with AI agents
- AFS MCP Server Example - Mount MCP servers as AFS modules
Related Packages
- @aigne/afs - AFS core package
- @aigne/afs-user-profile-memory - User profile memory module
TypeScript Support
This package includes full TypeScript type definitions.
