// BUILT_ON_RESEARCH_

> Not Hype. Real Science.

MemoryGraph's architecture is grounded in research from leading AI labs and the neuroscience literature. Every design decision traces back to published papers and validated principles.

Deep Representation Learning
Source: UC Berkeley Ma Lab

Maximal Coding Rate Reduction (MCR²)

We use MCR² to measure memory organization quality. The principle: compress within categories (similar memories cluster together) while expanding between categories (different types remain distinct).

$ Implementation:
  • Type-specific embedding subspaces
  • Rate reduction metrics for consolidation
  • Progressive compression over time
[Learn More About MCR²] →
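
For the technically curious, here is a minimal sketch of the rate-reduction objective in plain NumPy. The function names and the random example data are illustrative assumptions, not MemoryGraph's production code; the math follows the published MCR² formulation.

```python
# Hypothetical sketch: scoring how well memory embeddings are organized by
# type, using the MCR^2 rate-reduction objective. Names are illustrative.
import numpy as np

def coding_rate(Z: np.ndarray, eps: float = 0.5) -> float:
    """R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z^T Z): bits needed to code Z."""
    n, d = Z.shape
    gram = np.eye(d) + (d / (n * eps**2)) * Z.T @ Z
    return 0.5 * np.linalg.slogdet(gram)[1]

def rate_reduction(Z: np.ndarray, labels: np.ndarray, eps: float = 0.5) -> float:
    """MCR^2: expand the whole set while compressing within each memory type."""
    n = Z.shape[0]
    within = sum(
        (np.sum(labels == c) / n) * coding_rate(Z[labels == c], eps)
        for c in np.unique(labels)
    )
    return coding_rate(Z, eps) - within   # higher = better-organized memories

# Example: 200 unit-normalized embeddings in 64-d, tagged with 4 memory types.
Z = np.random.randn(200, 64)
Z /= np.linalg.norm(Z, axis=1, keepdims=True)
labels = np.random.randint(0, 4, size=200)
print(f"rate reduction: {rate_reduction(Z, labels):.2f}")
```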

Biological Memory Consolidation
Source: Neuroscience Research

Dreams Agent

Our Dreams Agent mirrors sleep-dependent memory consolidation in the human brain. Just as sleep strengthens important memories and prunes trivial ones, Dreams runs nightly to optimize your memory graph.

$ Implementation:
  • Nightly batch consolidation pipeline
  • Importance-based strengthening
  • Automatic similarity detection
  • Redundancy removal
[Read Research Validation] →
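
Below is a rough sketch of what one nightly pass could look like. The `Memory` record, the similarity thresholds, and the strengthening rule are simplified assumptions for illustration, not the actual pipeline.

```python
# Hypothetical nightly consolidation pass in the spirit of the Dreams Agent:
# strengthen what was used, link similar memories, merge near-duplicates.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Memory:
    id: str
    embedding: np.ndarray                         # assumed unit-normalized
    importance: float = 0.5
    access_count: int = 0
    similar_to: list = field(default_factory=list)

def consolidate(memories: list[Memory],
                sim_link: float = 0.85,
                sim_redundant: float = 0.98,
                decay: float = 0.95) -> list[Memory]:
    """One nightly pass over the store; thresholds are illustrative."""
    kept: list[Memory] = []
    for m in sorted(memories, key=lambda x: x.importance, reverse=True):
        redundant = False
        for k in kept:
            sim = float(m.embedding @ k.embedding)   # cosine on unit vectors
            if sim >= sim_redundant:                 # near-duplicate: fold into keeper
                k.importance = max(k.importance, m.importance)
                redundant = True
                break
            if sim >= sim_link:                      # related: record a SIMILAR_TO edge
                k.similar_to.append(m.id)
                m.similar_to.append(k.id)
        if not redundant:
            if m.access_count > 0:                   # strengthen what was used...
                m.importance = min(1.0, m.importance * (1.0 + 0.1 * m.access_count))
            else:                                    # ...and gently decay the rest
                m.importance *= decay
            m.access_count = 0                       # reset for the next day
            kept.append(m)
    return kept
```

At scale the pairwise scan would be replaced by an approximate nearest-neighbor index; the brute-force loop just keeps the sketch readable.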

Titans + MIRAS (Google Research, 2025)
Paper: arXiv:2501.00663 (December 2024)

Deep Memory Architecture Validation

Google's latest research validates our core thesis: deep, structured memory architectures significantly outperform shallow vector stores for long-term AI memory.

$ KEY_FINDINGS:
  • Depth Matters: "Modules with deeper memories consistently achieve lower perplexity and exhibit better scaling properties"
  • Surprise Detection: Automatic importance adjustment based on how novel a memory is relative to existing state
  • Principled Forgetting: Regularization-based consolidation beats simple time decay
$ MemoryGraph Implementation:
  • Graph structure provides deep memory (not flat vectors)
  • Surprise metric for automatic importance boosting
  • SIMILAR_TO edge discovery
  • Differential summaries (Pro tier)
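
To make the surprise idea concrete, here is one way to score novelty against an existing store. Titans defines surprise via the gradient of a learned memory's loss; the distance-based stand-in below is a deliberately simplified illustration, not the paper's formulation or our production metric.

```python
# Hypothetical surprise scoring: a memory far from everything already stored
# is novel, and novelty boosts importance. Names and weights are illustrative.
import numpy as np

def surprise(new_emb: np.ndarray, store: np.ndarray, k: int = 5) -> float:
    """1 - mean cosine similarity to the k nearest stored memories."""
    if store.size == 0:
        return 1.0                                # everything is novel at first
    sims = store @ new_emb                        # unit vectors: dot = cosine
    top_k = np.sort(sims)[-k:]
    return float(1.0 - top_k.mean())

def boosted_importance(base: float, s: float, weight: float = 0.5) -> float:
    """Blend a base importance score with the surprise signal (capped at 1)."""
    return min(1.0, base + weight * s)

# Usage: a memory unlike anything stored gets a large importance boost.
rng = np.random.default_rng(0)
store = rng.standard_normal((1000, 64))
store /= np.linalg.norm(store, axis=1, keepdims=True)
new = rng.standard_normal(64)
new /= np.linalg.norm(new)
s = surprise(new, store)
print(f"surprise={s:.2f}  importance={boosted_importance(0.4, s):.2f}")
```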

Evo-Memory (Google DeepMind)
Paper: arXiv:2511.20857 (November 2025)

Experience Reuse vs. Recall

"Conversational recall ≠ Experience reuse" — DeepMind's benchmark shows vector-based memory provides <2% improvement over no memory. Typed relationships enable true learning.

$ Implementation:
  • 35+ relationship types (SOLVES, CAUSES, IMPROVES)
  • Graph-guided retrieval
  • Context-aware memory access
[Read Evo-Memory Paper] →
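
The sketch below shows why typed edges matter for retrieval: the walk follows only the relationship types relevant to the query, instead of returning whatever text is nearest. The store class and traversal policy are illustrative assumptions, not MemoryGraph's actual API; the edge names come from the list above.

```python
# Hypothetical graph-guided retrieval over typed edges, using a plain
# adjacency list. MemoryGraph's real store covers 35+ relationship types.
from collections import defaultdict

class TypedGraphStore:
    def __init__(self):
        self.edges = defaultdict(list)          # node -> [(edge_type, node)]

    def link(self, src: str, edge_type: str, dst: str) -> None:
        self.edges[src].append((edge_type, dst))

    def retrieve(self, start: str, follow: set[str], depth: int = 2) -> list[str]:
        """Breadth-first walk that only crosses the requested edge types."""
        seen, frontier, out = {start}, [start], []
        for _ in range(depth):
            nxt = []
            for node in frontier:
                for etype, dst in self.edges[node]:
                    if etype in follow and dst not in seen:
                        seen.add(dst)
                        nxt.append(dst)
                        out.append(dst)
            frontier = nxt
        return out

# Usage: start from a bug and pull only what CAUSES or SOLVES it --
# experience reuse, not just textual recall.
g = TypedGraphStore()
g.link("bug:timeout", "CAUSES", "fact:pool-exhaustion")
g.link("fact:pool-exhaustion", "SOLVES", "fix:raise-pool-size")
g.link("bug:timeout", "SIMILAR_TO", "bug:slow-login")
print(g.retrieve("bug:timeout", follow={"CAUSES", "SOLVES"}))
# -> ['fact:pool-exhaustion', 'fix:raise-pool-size']
```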

$ See It In Action

> Try MemoryGraph and experience a research-backed memory architecture

[Get Started] →