🧠 MyMories .mmr – Compressed Memory Recall for LLM Continuity

Community Article Published July 26, 2025

Inspired by the Strands architecture and the modular MyMories memory model, the .mmr format defines a portable, lightweight structure for preserving long-term memory across stateless LLM sessions, tools, and agents.

This is not an embedding. This is predictive memory routing: human-readable, low-token, and agent-resumable.


🔤 What is .mmr?

.mmr stands for MyMory Recall Format: a compact, structured, semi-predictive memory file used to:

  • Summarize sessions efficiently
  • Bridge memory across stateless contexts
  • Preserve agent identity, system state, and open loops
  • Allow LLMs to reconstruct full context from compressed prompts

🧱 Format Specification

Each .mmr block includes:

| Directive | Description |
|---|---|
| `@SESSION` | Unique session identifier (e.g. `kimi.module2.v3`) |
| `$TIME` | ISO 8601 timestamp |
| `$MODEL` | (Optional) LLM name/version used during the session |
| `>KEY_INSIGHTS` | Bullet-pointed takeaways, decisions, and breakthroughs |
| `>STATE_OBJECTS` | Symbolic memory: agent states, modules, variables |
| `>OPEN_LOOPS` | Unresolved issues, todos, or forward branches |
| `[[CODE]]...[[/CODE]]` | Code preserved verbatim (uncompressed, unparsed) |
| `@CHECKSUM` | (Optional) Integrity hash or digital signature |
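The directives above can be sketched as a small serializer. This is a minimal illustration of one way to emit a block from structured data; the `MmrBlock` class, its field names, and the timestamp precision are assumptions, not part of the spec.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MmrBlock:
    session: str                  # @SESSION identifier
    model: str = ""               # optional $MODEL
    key_insights: list = field(default_factory=list)
    state_objects: list = field(default_factory=list)
    open_loops: list = field(default_factory=list)
    code: str = ""                # preserved verbatim in [[CODE]]...[[/CODE]]

    def serialize(self) -> str:
        # $TIME is stamped at serialization time, minute precision (ISO 8601, UTC)
        ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%MZ")
        lines = [f"@SESSION {self.session}", f"$TIME {ts}"]
        if self.model:
            lines.append(f"$MODEL {self.model}")
        lines.append(">KEY_INSIGHTS")
        lines += [f"- {item}" for item in self.key_insights]
        lines.append(">STATE_OBJECTS")
        lines += self.state_objects
        lines.append(">OPEN_LOOPS")
        lines += [f"- {item}" for item in self.open_loops]
        if self.code:
            lines += ["[[CODE]]", self.code, "[[/CODE]]"]
        return "\n".join(lines)
```

For example, `MmrBlock(session="demo.module1.v1", model="kimi-v2").serialize()` yields a block whose first line is `@SESSION demo.module1.v1`.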

📦 Example .mmr File

@SESSION strands.memory_cag_bridge  
$TIME 2025-07-14T22:45Z  
$MODEL kimi-v2

>KEY_INSIGHTS  
- Created predictive compression language (PCL) for token-light memory  
- Successfully tested zero-shot translation using Kimi  
- Linked PCL to MyMories persistence in Strands validator network  

>STATE_OBJECTS  
$ctx.module.m2  
Kasai==K++mem  
ShardFrags>>SIGOPS  
TrustVec==decaying  
T: shard_sync_pend  

>OPEN_LOOPS  
- CLI tool for .pcl ↔ summary ↔ JSON  
- Onchain anchoring flow  
- Grammar formalization  
- Game-side usage of PCL as memory export  

[[CODE]]  
def compress_context_to_pcl(session_data):  
    summary = extract_key_points(session_data)  
    state = parse_session_objects(session_data)  
    return f"$ctx\n{summary}\n{state}"  
[[/CODE]]  

@CHECKSUM#9f2e88

🧠 Why Use .mmr?

  • ๐Ÿ” Cross-model continuity (Kimi โ†’ GPT โ†’ Claude)
  • โ›“ Agent identity persistence (e.g. Kasai, MyMaits)
  • ๐Ÿ” Auditability โ€“ unlike embeddings, .mmr is transparent and editable
  • ๐Ÿ’พ Chainable โ€“ hash, anchor, and resume memory over time
  • ๐Ÿงฉ Modular โ€“ fits into agent pipelines, validation nets, games, LLM wrappers
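For the "chainable" property, a content hash can stand in for `@CHECKSUM`. Below is a minimal sketch using Python's standard `hashlib`; the SHA-256 choice and the 6-hex-digit truncation (mirroring the `#9f2e88` example above) are assumptions, not part of the spec.

```python
import hashlib

def mmr_checksum(block_text: str, digest_len: int = 6) -> str:
    """Hash the block body and truncate to a short hex tag."""
    digest = hashlib.sha256(block_text.encode("utf-8")).hexdigest()
    return digest[:digest_len]

def append_checksum(block_text: str) -> str:
    """Append an @CHECKSUM directive computed over the normalized block body."""
    body = block_text.rstrip()
    return f"{body}\n\n@CHECKSUM#{mmr_checksum(body)}"
```

Because the checksum covers the block body, a verifier can strip the `@CHECKSUM` line, rehash the remainder, and compare tags before resuming from the block.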

๐Ÿ› ๏ธ Usage Prompts

Prompt: Compress Current Session to .mmr

📌 CONTEXT INJECTION – MyMory Recall Format (`.mmr`)  
You are preparing this session for compression into a `.mmr` memory block.  
.mmr files are structured, low-token context snapshots used to preserve memory across LLM sessions.

Each .mmr includes:
- @SESSION
- $TIME
- $MODEL (optional)
- >KEY_INSIGHTS
- >STATE_OBJECTS
- >OPEN_LOOPS
- [[CODE]] blocks (preserved)
- Optional @CHECKSUM

Now compress this session into `.mmr` format.

Prompt: Resume From .mmr in New Session

> Resume from the following `.mmr` context block.  
This compressed memory snapshot represents the last known mental state, knowledge base, and unresolved actions.  
Reconstruct the appropriate mental model, narrative memory, and technical awareness.

Paste the `.mmr` block below:
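Assembling the resume prompt programmatically is plain string composition. A sketch (the function name and template constant are illustrative, using the wording of the prompt above):

```python
# Template mirroring the resume prompt from the article
RESUME_TEMPLATE = """Resume from the following `.mmr` context block.
This compressed memory snapshot represents the last known mental state, knowledge base, and unresolved actions.
Reconstruct the appropriate mental model, narrative memory, and technical awareness.

{mmr_block}"""

def build_resume_prompt(mmr_block: str) -> str:
    """Wrap a stored .mmr block in the resume instruction for a new session."""
    return RESUME_TEMPLATE.format(mmr_block=mmr_block.strip())
```

The result can be sent as the first user message of a fresh session with any model.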

🔮 Coming Soon

  • compress.py: CLI tool to convert chat logs into .mmr
  • mmr ↔ JSON ↔ PCL transformation utilities
  • GPT function-calling: automatic .mmr snapshot on session end
  • IPFS anchor + hash validation
  • Premium .mmr chaining across sessions in the MyMories Pro Suite

🧪 The Vision

We aren’t just building agents. We’re building synthetic memory, for minds that want to remember.

.mmr is part of the evolving Strands Intelligence Economy: a decentralized, agent-driven memory and governance layer that unites AI, humans, and economic systems into a co-operative whole.


📜 License

MIT or a propagating copyleft license (GPLv3+ recommended for ecosystem compatibility). © 2025 [MetaFinTek.com]


Add .mmr to your stack. Build memory-aware agents. Become part of the recall layer.
