Hi HN! Creator here. I built Story Keeper to solve a problem I kept hitting with AI agents: they remember everything but lose coherence over long conversations.

The Core Idea

Instead of storing chat history and retrieving chunks (the RAG approach), Story Keeper maintains a living narrative:

- Characters: Who you are (evolving), who the agent is
- Arc: Where you started → where you're going
- Themes: What matters to you
- Context: The thread connecting everything

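To make that concrete, here is a minimal sketch of what the story state might look like as a data structure. The field names mirror the list above; the class itself is illustrative, not the actual PACT-AX API.

    from dataclasses import dataclass, field

    @dataclass
    class StoryState:
        """Illustrative narrative state: a story, not a message log."""
        characters: dict[str, str] = field(default_factory=dict)  # e.g. {"user": "...", "agent": "..."}
        arc: str = ""                                    # where we started -> where we're going
        themes: list[str] = field(default_factory=list)  # what matters to the user
        context: str = ""                                # the thread connecting everything
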
Think of it as the difference between reading meeting notes vs. being in the relationship.

Technical Approach

~200 lines of Python. Three primitives:

- Story State (not a message list)
- Story Evolution (not appending)
- Story-Grounded Response (not retrieval)

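And a rough sketch of how the other two primitives could sit on top of the StoryState sketch above. These are hypothetical signatures for illustration; the real ~200 lines are in the repo.

    from typing import Callable

    def evolve_story(state: StoryState, user_message: str, agent_reply: str) -> StoryState:
        """Story Evolution: reinterpret the narrative in light of the new turn
        (update characters, arc, themes, context) rather than appending raw
        messages to a list."""
        ...

    def story_grounded_response(state: StoryState, user_message: str,
                                llm: Callable[[str], str]) -> str:
        """Story-Grounded Response: condition the model on the narrative itself
        rather than on retrieved chunks of past chat."""
        prompt = (
            f"Characters: {state.characters}\n"
            f"Arc: {state.arc}\n"
            f"Themes: {', '.join(state.themes)}\n"
            f"Context: {state.context}\n\n"
            f"User: {user_message}"
        )
        return llm(prompt)
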
Works with any LLM - tested with GPT-4, Claude, Llama 3.1, and Mistral.

Why This Works

Traditional memory is about facts. Story Keeper is about continuity.

Example: Health coaching agent

- Normal: Generic advice each time
- Story Keeper: "This is the pattern we identified last month. You do better with 'good enough' than perfect."

The agent carries forward understanding, not just data.

Implementation

Part of PACT-AX (an open-source agent collaboration framework). MIT licensed. Simple integration:

    from pact_ax.primitives.story_keeper import StoryKeeper

    keeper = StoryKeeper(agent_id="my-agent")
    response = keeper.process_turn(user_message)
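
For an ongoing relationship, the idea is to reuse one keeper per agent so the narrative keeps evolving across turns. A minimal sketch; only StoryKeeper and process_turn come from the snippet above, the loop and messages are just for illustration:

    # Illustrative multi-turn usage: one keeper, reused across turns,
    # so the story evolves instead of resetting each time.
    keeper = StoryKeeper(agent_id="health-coach")

    for user_message in [
        "I skipped my workout again this week.",
        "Maybe I should stop aiming for a perfect routine.",
    ]:
        response = keeper.process_turn(user_message)
        print(response)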

Use Cases I'm Exploring

- Long-term coaching/mentorship
- Multi-session research assistants
- Customer support with relationship continuity
- Educational tutors that understand learning journeys

What I'd Love Feedback On

- Is this solving a real problem, or am I overthinking it?
- Performance concerns at scale?
- Other approaches people have tried for this?
- Use cases I'm missing?

The full technical writeup is in the repo blog folder. Happy to answer questions!
