Tools: A New Persistent Memory Architecture for Claude Code (2026)
Claude Code is powerful inside a session, but between sessions it has amnesia. Every new conversation starts from zero. On a complex project, that means spending the first few minutes of every session re-explaining your architecture, your decisions, and your conventions before you can do any real work.
I built an open-source MCP server called memory-mcp that fixes this. Here's how the architecture works.
Claude Code already reads a file called CLAUDE.md on every session start. This is baked into how Claude Code works. Whatever is in that file becomes part of Claude's initial context.
Claude Code also has a hook system. You can register shell commands that fire after every response (Stop), before context compaction (PreCompact), and at session end (SessionEnd).
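As a sketch, registering those three hooks lives in Claude Code's settings file; the `memory-extract` command name here is hypothetical, standing in for whatever script the MCP server ships:

```json
{
  "hooks": {
    "Stop": [
      { "hooks": [{ "type": "command", "command": "memory-extract --event stop" }] }
    ],
    "PreCompact": [
      { "hooks": [{ "type": "command", "command": "memory-extract --event precompact" }] }
    ],
    "SessionEnd": [
      { "hooks": [{ "type": "command", "command": "memory-extract --event session-end" }] }
    ]
  }
}
```

Each event fires the same extraction command, so the capture pipeline doesn't need to care which lifecycle moment triggered it.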
These two features, combined, are everything you need for persistent memory. The hooks capture knowledge. CLAUDE.md delivers it. No custom protocols, no external databases, no changes to how Claude Code works.
Not all memory is equal. Some context is needed every session. Some is only needed occasionally. Putting everything into CLAUDE.md wastes the context window. Putting everything into a database means Claude has to actively search for basics.
Tier 1: CLAUDE.md (~150 lines)
A compact briefing document, auto-generated and auto-updated. It contains the most important project knowledge, ranked by confidence and access frequency. Claude reads it automatically on startup with zero prompting.
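A hypothetical auto-generated briefing might look like this (all entries are invented for illustration; the confidence and access-count annotations are one plausible way to surface the ranking):

```markdown
# Project Memory (auto-generated — do not edit by hand)

## Architecture
- API layer: FastAPI behind nginx (confidence: high, accessed 41×)

## Decisions
- 2026-01-12: chose Postgres over SQLite for multi-writer support

## Conventions
- All timestamps stored as UTC ISO-8601 strings
```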
Tier 2: .memory/state.json (unlimited)
The full memory store: every fact, decision, and observation ever captured. It is accessible mid-conversation through MCP tools: keyword search, tag-based queries, and natural-language questions synthesized by Haiku.
80% of sessions need only Tier 1. The other 20% can pull from Tier 2 on demand.
When a hook fires, the extractor reads the conversation transcript from where it last left off (tracked by a cursor file). If the new content exceeds 6,000 characters, it's chunked to stay within Haiku's sweet spot.
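The cursor-and-chunk step can be sketched as follows; the file names and the exact chunking strategy are assumptions, but the mechanism (byte offset persisted between hook firings, fixed-size splits above the 6,000-character threshold) matches the description above:

```python
from pathlib import Path

CHUNK_LIMIT = 6_000  # characters per chunk sent to the extraction model

def read_new_content(transcript: Path, cursor: Path) -> str:
    """Return transcript content added since the last hook firing.

    The cursor file stores the character offset already processed;
    it is advanced to the end of the transcript on every call.
    """
    data = transcript.read_text()
    offset = int(cursor.read_text()) if cursor.exists() else 0
    cursor.write_text(str(len(data)))
    return data[offset:]

def chunk(text: str, limit: int = CHUNK_LIMIT) -> list[str]:
    """Split oversized content into chunks of at most `limit` characters."""
    return [text[i : i + limit] for i in range(0, len(text), limit)]
```

Because the cursor advances atomically with each read, a hook that fires twice in a row without new output simply gets an empty string back and can skip the extraction call.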
Source: Dev.to