The Context Window
AI remembers everything. Humans don't. Designing interfaces that bridge the memory gap.
Modern LLMs can hold 100,000 tokens in context. That's roughly a 300-page book. Humans can hold about four items in working memory. This gap, between what the AI remembers and what the user can track, is one of the core design challenges of AI interfaces.
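The 300-page figure is a back-of-envelope conversion, sketched below under two assumed ratios (roughly 0.75 English words per token, roughly 250 words per printed page); the exact numbers vary by tokenizer and typesetting.

```python
# Rough conversion from context-window tokens to printed pages.
# Assumed ratios: ~0.75 words per token, ~250 words per page.
tokens = 100_000
words = tokens * 0.75   # about 75,000 words
pages = words / 250     # about 300 pages
print(round(pages))     # → 300
```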
The Memory Mismatch
In a chat interface, the AI remembers everything you've said. You don't. By message ten, you've forgotten the nuances of message three. You repeat yourself. You contradict yourself. You lose the thread. The AI is fine. You're drowning.
This isn't a user failure. It's a design failure. The interface should bridge the gap between AI memory and human memory, not expose it. Every piece of context the user needs should be visible, not buried in the scroll history.
The AI has perfect memory. The interface should give you the same advantage.
Persistent Context
- Tree structure keeps all options visible simultaneously
- Spatial position serves as a memory aid
- Color coding reduces what you need to remember
- Parent nodes provide context for every child node
- No scrolling means no information disappears from view
Designing for Human Limits
Persephonie doesn't expect you to remember. It externalizes memory into the spatial layout. Your question is at the top. Your options branch out. Your explored paths are color-coded. Everything you need to make a decision is visible at once. The interface remembers so you don't have to.
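Externalized memory of this kind can be sketched as a plain data structure: a tree whose nodes carry their own state, flattened into an indented view so every option stays on screen at once. The node fields and status labels here are illustrative assumptions, not Persephonie's actual implementation.

```python
# Minimal sketch of externalized context: each node stores its own
# status (assumed labels: "unexplored", "explored"), which would drive
# color coding in a real UI, so the user never has to remember it.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    status: str = "unexplored"
    children: list["Node"] = field(default_factory=list)

def render(node: Node, depth: int = 0) -> list[str]:
    """Flatten the tree into indented lines: indentation (spatial
    position) encodes the parent context for every child node."""
    lines = [f"{'  ' * depth}[{node.status}] {node.label}"]
    for child in node.children:
        lines.extend(render(child, depth + 1))
    return lines

question = Node("Which database?", "explored", [
    Node("Postgres", "explored"),
    Node("SQLite"),
])
print("\n".join(render(question)))
```

Rendering the whole tree on every update, rather than appending to a scrolling log, is what keeps explored and unexplored paths visible side by side.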
This is the fundamental shift: from interfaces that assume human memory to interfaces that replace it.