-
Notifications
You must be signed in to change notification settings - Fork 20
Description
More recent memories are more relevant in most cases you want to use some sort of temporal re-ranker that effectively implements memory decay during the memory search retrieval phase. other options to explore include evicting outlier memories.
- implement a re-ranker component in neo4j-agent-memory that allows for reranking across different dimensions and is pluggable and extendable
- implement an exponential time decay function based re-ranker as the default memory reranker that prioritises more recent memories.
For example, some discussion on this topic here:
TIL: Memory decay actually makes retrieval BETTER, not worse
Was digging into cognitive science papers for our memory system and found something counterintuitive:Forgetting is a feature, not a bug.
Humans forget ~70% of new info within 24 hours (Ebbinghaus curve). Sounds bad. But heres the twist: this decay acts as a natural relevance filter. Old irrelevant stuff fades, frequently-accessed stuff strengthens.
We tried implementing this in our vector store. Instead of treating all memories equally, we added a decay factor (inspired by ACT-R, ~30 day > half-life). Memories that get retrieved boost their strength. Ones that dont gradually fade in retrieval priority.
Result: Search quality went UP. Why? Because when you search for "that API issue", you probably want the recent one you were working on, > not the similar issue from 6 months ago that got resolved.
The standard approach (store everything forever with equal weight) sounds better but actually creates noise. Your brain figured this out > millions of years ago.
Practical tip: If youre building any retrieval system, consider adding recency bias or access-frequency weighting. Not deleting old data, just deprioritizing it in search results.
Anyone else experimenting with memory decay? Curious what half-life values work for different use cases.