// BUILT_ON_RESEARCH_
> Not Hype. Real Science.
MemoryGraph's architecture is grounded in peer-reviewed research from leading AI labs. Every design decision traces back to published papers and validated principles.
Maximal Coding Rate Reduction (MCR²)
We use MCR² to measure memory organization quality. The principle: compress within categories, so similar memories cluster tightly, while expanding between categories, so different types of memory stay distinct.
- Type-specific embedding subspaces
- Rate reduction metrics for consolidation
- Progressive compression over time
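The intuition behind MCR² can be sketched in a few lines. This is an illustrative simplification of the published rate-reduction objective, not MemoryGraph's actual implementation; the function names and the simplified per-category weighting are our own:

```python
import numpy as np

def coding_rate(Z: np.ndarray, eps: float = 0.5) -> float:
    """Coding rate R(Z): bits needed to encode the rows of Z
    up to distortion eps (log-det of a regularized covariance)."""
    n, d = Z.shape
    _, logdet = np.linalg.slogdet(np.eye(d) + (d / (n * eps**2)) * Z.T @ Z)
    return 0.5 * logdet

def rate_reduction(Z: np.ndarray, labels: np.ndarray, eps: float = 0.5) -> float:
    """Simplified MCR^2 objective: rate of the whole set minus the
    size-weighted rates of each category. Higher = tighter clusters
    within categories, more separation between them."""
    n = len(Z)
    within = sum(
        (np.sum(labels == c) / n) * coding_rate(Z[labels == c], eps)
        for c in np.unique(labels)
    )
    return coding_rate(Z, eps) - within
```

Well-organized memories (tight, well-separated categories) score a high rate reduction; shuffling the category labels collapses the score toward zero, which is what makes it usable as a consolidation metric.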
Dreams Agent
Our Dreams Agent mirrors sleep-dependent memory consolidation in the human brain. Just as sleep strengthens important memories and prunes trivial ones, Dreams runs nightly to optimize your memory graph.
- Nightly batch consolidation pipeline
- Importance-based strengthening
- Automatic similarity detection
- Redundancy removal
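A minimal sketch of what one nightly consolidation pass does. This is a toy model, not the Dreams pipeline itself: the `Memory` class, thresholds, and boost/decay factors are hypothetical stand-ins for the real system's importance scoring:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Memory:
    text: str
    embedding: np.ndarray
    importance: float = 1.0

def consolidate(memories, sim_threshold=0.95, boost=1.1, decay=0.9):
    """One nightly pass: drop near-duplicate memories (keeping the more
    important copy), then rescale importance -- strengthened above a
    cutoff, decayed below it."""
    kept = []
    for m in sorted(memories, key=lambda m: -m.importance):
        unit = m.embedding / np.linalg.norm(m.embedding)
        if all(unit @ (k.embedding / np.linalg.norm(k.embedding)) < sim_threshold
               for k in kept):
            kept.append(m)
    for m in kept:
        m.importance *= boost if m.importance >= 1.0 else decay
    return kept
```

Running highest-importance-first means that when two memories are near-duplicates, the copy that survives is always the one the system already valued more.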
Deep Memory Architecture Validation
Google's latest research validates our core thesis: deep, structured memory architectures significantly outperform shallow vector stores for long-term AI memory.
- Depth Matters: "Modules with deeper memories consistently achieve lower perplexity and exhibit better scaling properties"
- Surprise Detection: Automatic importance adjustment based on how novel a memory is relative to existing state
- Principled Forgetting: Regularization-based consolidation beats simple time decay
- Graph structure provides deep memory (not flat vectors)
- Surprise metric for automatic importance boosting
- SIMILAR_TO edge discovery
- Differential summaries (Pro tier)
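One simple way to operationalize surprise-based importance, shown here as an assumed sketch (the real surprise metric in the cited research and in MemoryGraph may differ): score a new memory by how dissimilar it is from everything already stored, then boost its importance proportionally.

```python
import numpy as np

def surprise(new_embedding, existing_embeddings):
    """Novelty of a new memory relative to existing state:
    1 - max cosine similarity to any stored embedding.
    An empty store makes everything maximally surprising."""
    if len(existing_embeddings) == 0:
        return 1.0
    v = new_embedding / np.linalg.norm(new_embedding)
    M = np.asarray(existing_embeddings, dtype=float)
    M = M / np.linalg.norm(M, axis=1, keepdims=True)
    return float(1.0 - np.max(M @ v))

def adjusted_importance(base, s, gain=2.0):
    """Automatic importance boost: surprising memories start stronger."""
    return base * (1.0 + gain * s)
```

A memory that duplicates existing state scores near zero surprise and gets no boost; a genuinely novel one gets strengthened before it ever reaches consolidation.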
Experience Reuse vs. Recall
"Conversational recall ≠ Experience reuse" — DeepMind's benchmark shows vector-based memory provides <2% improvement over no memory. Typed relationships enable true learning.
- 35+ relationship types (SOLVES, CAUSES, IMPROVES)
- Graph-guided retrieval
- Context-aware memory access
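The difference between recall and reuse is easiest to see in code. Below is a toy typed-edge store (the class and method names are hypothetical, not MemoryGraph's API): instead of returning whatever is cosine-similar, retrieval follows only the relationship types the current task needs, such as SOLVES or CAUSES.

```python
from collections import defaultdict

class TypedMemoryGraph:
    """Toy store of memories as nodes plus (src, TYPE, dst) edges."""

    def __init__(self):
        self.edges = defaultdict(list)  # src -> [(relation, dst)]

    def link(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def retrieve(self, start, rel_types, depth=2):
        """Graph-guided retrieval: breadth-first walk out from a seed
        memory, following only the requested edge types."""
        seen, frontier = {start}, [start]
        for _ in range(depth):
            frontier = [dst for node in frontier
                        for rel, dst in self.edges[node]
                        if rel in rel_types and dst not in seen]
            seen.update(frontier)
        return seen - {start}
```

Asking for `CAUSES` edges from a failure node walks the causal chain and skips merely similar memories, which is the kind of context-aware access flat vector stores cannot express.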
$ See It In Action
> Try MemoryGraph and experience research-backed memory architecture
[Get Started] →