Memory Architecture
The memory system is modeled on Stanford's "Generative Agents" research, implementing a layered memory model (observations, reflections, and plans) that enables realistic long-term character development.
🧠 Memory Types
Observations
- What: Direct experiences and perceptions
- Examples: "I spilled coffee", "Emma smiled at me", "It's raining outside"
- Importance: Usually 1-5 for mundane events, 8-10 for significant experiences
- Purpose: Raw building blocks of character experience
Reflections
- What: Higher-level insights generated from observation patterns
- Examples: "I have romantic feelings for Emma", "I'm naturally shy in social situations"
- Importance: Usually 6-10 (insights are more valuable than raw observations)
- Purpose: Character self-understanding and behavioral consistency
Plans
- What: Future intentions and goals
- Examples: "I want to ask Emma about her art", "I should finish my thesis chapter"
- Importance: 3-10 depending on goal significance
- Purpose: Drive future behavior and maintain character consistency
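The three memory types above can be represented uniformly, differing only in their type tag and typical importance range. A minimal sketch using the section's own examples (the dict shape and importance values are illustrative assumptions):

```python
# Each memory type shares one shape; only memory_type and typical importance differ.
observation = {
    "memory_type": "observation",
    "description": "Emma smiled at me",
    "importance": 4,   # mundane events usually score 1-5
}
reflection = {
    "memory_type": "reflection",
    "description": "I have romantic feelings for Emma",
    "importance": 8,   # insights usually score 6-10
}
plan = {
    "memory_type": "plan",
    "description": "I want to ask Emma about her art",
    "importance": 6,   # 3-10 depending on goal significance
}
```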
🔍 Memory Retrieval
Smart Retrieval Algorithm
Memories are scored using three factors:
- Recency: Recent memories are more accessible
  recency = 0.995 ** hours_since_last_accessed
- Importance: Significant events stay memorable longer
  importance = memory.importance_score / 10.0
- Relevance: Contextually similar memories surface together
  relevance = cosine_similarity(query_embedding, memory_embedding)
Final Score:
score = recency + importance + relevance
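The scoring formulas above can be combined into a single function. This is a sketch, not the project's actual implementation; it takes the raw memory fields as arguments and implements cosine similarity with the standard library:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieval_score(importance_score, memory_embedding, query_embedding,
                    hours_since_last_accessed):
    recency = 0.995 ** hours_since_last_accessed   # decays per hour since last access
    importance = importance_score / 10.0           # normalize the 1-10 rating
    relevance = cosine_similarity(query_embedding, memory_embedding)
    return recency + importance + relevance
```

Because the three terms are simply summed, a highly relevant but old memory can still outrank a fresh but unimportant one.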
🎯 Automatic Reflection Generation
When the accumulated importance of recent memories exceeds a threshold (150):
- Analyze Recent Experiences: Get the last 20 observations
- Generate Insights: Use LLM to identify patterns and higher-level understanding
- Create Reflections: Store insights as new reflection memories
- Link Evidence: Connect reflections to supporting observations
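The four steps above can be sketched as a single trigger function. This is an assumed shape, not the real code: `llm_generate_insights` stands in for the actual LLM call, memories are plain dicts, and the threshold and window sizes come from the text:

```python
def maybe_reflect(memory_stream, recent_importance_sum, llm_generate_insights,
                  threshold=150, window=20):
    """If recent importance exceeds the threshold, turn observation
    patterns into reflection memories linked to their evidence."""
    if recent_importance_sum < threshold:
        return []
    # 1. Analyze recent experiences: take the last `window` observations.
    observations = [m for m in memory_stream
                    if m["memory_type"] == "observation"][-window:]
    # 2. Generate insights: LLM identifies higher-level patterns.
    reflections = []
    for insight in llm_generate_insights(observations):
        # 3. Create reflections and 4. link them to supporting evidence.
        reflections.append({
            "memory_type": "reflection",
            "description": insight,
            "related_memories": [m["description"] for m in observations],
        })
    memory_stream.extend(reflections)
    return reflections
```

Filtering for observations only keeps earlier reflections from being re-summarized as evidence.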
💾 Memory Storage
Each memory contains:
- description: Natural language content
- creation_time: When the memory was formed
- last_accessed: When it was last retrieved (affects recency)
- importance_score: 1-10 significance rating
- embedding: Vector representation for similarity matching
- memory_type: observation/reflection/plan
- related_memories: Links to supporting evidence
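As a Python record, the fields listed above might look like this (a sketch; the exact types, e.g. a list of floats for the embedding, are assumptions):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Memory:
    description: str                 # natural language content
    creation_time: datetime          # when the memory was formed
    last_accessed: datetime          # when last retrieved; drives recency decay
    importance_score: int            # 1-10 significance rating
    embedding: list                  # vector for similarity matching
    memory_type: str = "observation" # observation / reflection / plan
    related_memories: list = field(default_factory=list)  # supporting evidence
```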
🔄 Memory Lifecycle
- Creation: New experience becomes observation
- Scoring: LLM rates importance 1-10
- Storage: Added to memory stream with embedding
- Retrieval: Accessed during relevant conversations
- Reflection: Patterns trigger insight generation
- Evolution: Older memories naturally fade unless repeatedly accessed
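The fade in the Evolution step falls directly out of the recency formula: a memory that is never re-accessed decays exponentially, while re-accessing it resets `last_accessed` and restores its score. A quick illustration:

```python
def recency(hours_since_last_accessed):
    # Same decay as in the retrieval section: 0.995 per hour.
    return 0.995 ** hours_since_last_accessed

week = 7 * 24
untouched = recency(week)   # a memory left alone for a week fades well below half strength
refreshed = recency(1)      # re-accessing it an hour ago restores it to near full strength
```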
This creates realistic, human-like memory behavior where important experiences remain accessible while mundane details naturally fade over time.