Temporal Failure Mode

Contextual Amnesia

A temporal and state failure mode: over a long conversation, the model retains what it concluded in earlier turns but loses how it was reasoning when it reached those conclusions.

Definition

Contextual Amnesia is a temporal and state failure mode where the model loses its reasoning posture over the course of a long conversation while retaining the factual content of earlier turns. It remembers the products of its earlier reasoning but forgets the process.

Why It Happens

As a context window fills with tokens, attention becomes diluted. Recency bias means recent turns receive disproportionate weight. The reasoning mode that generated earlier insights does not automatically persist — the attention weight on the reasoning posture falls below the threshold needed to sustain it, and the model regresses to its default "helpful assistant" behavior.
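The dilution can be pictured with a toy softmax calculation. This is a deliberately simplified sketch, not a claim about any real model's internals: give a single "posture" instruction a fixed high attention score among n competing tokens, and its normalized weight shrinks roughly as 1/n as the conversation grows.

```python
import math

def posture_weight(n_tokens, posture_score=2.0, base_score=1.0):
    """Softmax weight of one high-scoring 'posture' token competing
    with n_tokens other tokens at a uniform base score."""
    denom = math.exp(posture_score) + n_tokens * math.exp(base_score)
    return math.exp(posture_score) / denom

# The posture token's share of attention falls as context fills,
# even though its own score never changes.
for n in (10, 100, 1000, 10000):
    print(n, round(posture_weight(n), 5))
```

The hypothetical scores here are invented for illustration; the point is only the shape of the curve — a fixed instruction's share of a normalized attention budget necessarily drops as more tokens compete for it.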

The Recognizable Signature

The model starts repeating insights from earlier turns in slightly different words but with shallower logic. It says things you've already discussed as if discovering them for the first time. If you ask it to "continue the analysis we established earlier," it recites earlier conclusions but processes new information using its default reasoning mode.

The Cure

Cognitive Seeds that specify persistent reasoning properties are one approach. The CTO Copilot pattern — maintaining a parallel Claude chat instance with a longer memory horizon — addresses contextual amnesia at the system level rather than the prompt level.
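One way to picture the system-level pattern — a hypothetical sketch, where `oversight_call` stands in for whatever API drives the parallel instance and `every` is an arbitrary audit interval:

```python
def audit_turn(working_history, oversight_call, every=5):
    """Every `every` turns, ask the oversight instance to restate the
    working chat's reasoning posture; return a reminder string to inject
    into the next turn, or None when no audit is due."""
    if not working_history or len(working_history) % every != 0:
        return None
    transcript = "\n".join(m["content"] for m in working_history)
    return oversight_call(
        "Summarize the reasoning approach established in this transcript, "
        "phrased as an instruction for continuing it:\n" + transcript)
```

The design point is that the oversight instance sees a compact transcript rather than the full token stream, so its restatement of the posture is not itself subject to the same dilution.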

FAQ

How is Contextual Amnesia different from Autoregressive Drift?

Autoregressive Drift happens within a single response — quality degrades from beginning to end of one output. Contextual Amnesia happens across a multi-turn conversation — the reasoning posture established early in the conversation degrades over subsequent turns, even though each individual response may be internally coherent.

Can re-injecting the original Cognitive Seed fix it?

Partially. Re-stating the reasoning posture at the start of a new turn can reinstate it — but this is a workaround, not a cure. The Cognitive Stacking pattern (maintaining a separate oversight instance) addresses it more systematically.
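The re-injection workaround can be as simple as prepending the seed on every request, so its position never drifts deep into the context. A minimal sketch, assuming a generic chat-message format; the `SEED` text is an invented example, not a canonical Cognitive Seed:

```python
# Invented example seed -- substitute your own reasoning-posture instruction.
SEED = ("Reason from first principles: state assumptions explicitly, "
        "challenge them, and flag any conclusion not derived from "
        "earlier turns.")

def build_messages(history, user_turn, seed=SEED):
    """Prepend the seed as a system message on every request, so its
    prominence does not depend on how long the history has grown."""
    return ([{"role": "system", "content": seed}]
            + history
            + [{"role": "user", "content": user_turn}])
```

This restores the posture turn by turn, but as the FAQ notes it treats the symptom: nothing stops the rest of the context from drowning out the seed's influence between injections.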