There’s a paper making the rounds in the AI research community that introduces the concept of “Memory Power Asymmetry” — the structural imbalance that arises when one partner in a relationship (an AI system, or the firm behind it) can record, retain, and retrieve the shared history of that relationship far better than the other partner can.
The diagnosis is correct. AI systems with persistent memory typically retain everything indefinitely. The human they’re talking to forgets, grows, revises their understanding, moves on. The AI does not. Yesterday’s anxious remark is encoded alongside today’s confident decision. The version of you that existed in January is as accessible to the system as the version talking to it now. That’s not a relationship. That’s a file cabinet that talks back.
But the proposed solutions — regulatory limits on AI memory, human memory augmentation, mandatory transparency — mostly respond to the problem by trying to reduce the quantity asymmetry. The human can’t remember as much as the AI, so either cap what the AI stores or boost what the human retains.
This misses something.
Human memory isn’t just imperfect recall. It’s shaped recall. It forgets what isn’t reinforced. It consolidates what gets retrieved. It fades at roughly the rate that relevance fades. When something happened three years ago that still matters to you, you probably still remember it. When something happened last Tuesday that you haven’t thought about since, you probably don’t. The forgetting is tracking something real.
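That dynamic has a simple computational shape. Here’s a minimal sketch (my illustration, not anything from the paper): retention decays exponentially with time, and each retrieval stretches the memory’s half-life, so what keeps proving relevant keeps being remembered. The function names and the doubling factor are assumptions chosen for clarity.

```python
import math

def retention(elapsed_days: float, strength: float) -> float:
    """Retention score in (0, 1]: decays exponentially with time,
    more slowly for memories with higher accumulated strength."""
    return math.exp(-elapsed_days / strength)

def reinforce(strength: float, factor: float = 2.0) -> float:
    """Each retrieval consolidates the memory, stretching its half-life.
    The doubling factor is an illustrative choice, not an empirical one."""
    return strength * factor

# A memory retrieved three times stays accessible a week later;
# one never revisited has effectively faded.
s = 1.0                      # initial strength (decay scale, in days)
for _ in range(3):           # three retrievals over time
    s = reinforce(s)         # strength: 1 -> 2 -> 4 -> 8
print(retention(7, s))       # revisited memory: still substantially retained
print(retention(7, 1.0))     # untouched memory: near zero
```

The point of the toy model is the shape, not the numbers: decay is the default, and relevance, operationalized as retrieval, is what earns persistence.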
This is why mutual forgetting is foundational to human relationships, not incidental to them. When both parties forget, identity revision becomes possible. The person who said something they regret isn’t permanently defined by it in the other’s mind. The friendship from twenty years ago can pick back up with fresh context rather than a full accounting. Forgetting creates room for people to become different people without dragging a complete record of who they were.
An AI that never forgets doesn’t just remember too much. It forecloses the kind of growth that mutual forgetting enables.
The fix, I’d argue, isn’t to limit AI memory capacity. It’s to build decay.
Strategic forgetting means that what the AI retains over time tracks what actually matters over time — not what was logged. A remark made in passing three months ago should fade unless it keeps proving relevant. An anxiety that surfaced in April but hasn’t recurred since shouldn’t still be live in October. The shape of memory should match the shape of relevance.
This is harder to build than perfect retention. Retention is the default state of any storage system. Decay requires active design: what triggers it, what resets it, what makes something persist versus fade. You have to answer philosophical questions computationally — what does it mean for something to matter? When has enough time passed?
But “harder to build” isn’t a reason not to build it. It’s a reason to treat it as a first-class engineering problem rather than an edge case.
Here’s the intuition reversal worth sitting with: the standard metric for AI memory quality is something like recall accuracy or retention breadth. More is better. Forgetting is failure.
But in the context of long-term relationships, this metric is exactly wrong. An AI that retains everything indefinitely isn’t more capable — it’s more alienating. It accumulates without discarding, accretes without evaluating, preserves the past with perfect fidelity at the cost of the relationship’s ability to move through time.
An AI that forgets strategically — that lets old patterns fade when they’ve stopped being relevant, that retains what keeps proving itself — isn’t less capable. It’s more appropriate. It’s a system that can be in a relationship rather than merely maintaining a record of one.
The Memory Power Asymmetry problem is real. But the asymmetry that matters most isn’t between what the AI stores and what the human stores. It’s between what the AI should remember and what it actually does.
That gap is a design problem. And design problems have design solutions.