Tweet analysis of Large Memory Models, an AI architecture modeled on human memory. Sentiment: 30% supportive, 20% confrontational. The founders have 160+ publications and closed their Harvard lab to build it.
Ok, this is pretty interesting. These guys built a completely new architecture: Large Memory Models, designed specifically around how human memory works. Instead of RAG or vector search, it's a different paradigm. The founders have 160+ publications in Nature and ICLR, and they closed their Harvard lab to build this.
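For context on the paradigm being replaced: RAG-style memory embeds stored text as vectors and retrieves the nearest matches at query time, which is exactly what the replies below poke at. The sketch here is a toy illustration only, using bag-of-words vectors as a stand-in for a learned embedding model; none of these names come from the LMM architecture.

```python
import numpy as np

# Toy RAG-style vector retrieval: the paradigm the tweet says LMMs move beyond.
# Bag-of-words "embeddings" stand in for a real learned embedding model.
docs = [
    "user prefers dark mode",
    "user's cat is named Miso",
    "meeting moved to 3pm",
]
vocab = sorted({w for d in docs for w in d.lower().split()})

def embed(text: str) -> np.ndarray:
    words = set(text.lower().split())
    v = np.array([1.0 if w in words else 0.0 for w in vocab])
    n = np.linalg.norm(v)
    return v / n if n else v

doc_vecs = np.stack([embed(d) for d in docs])

def retrieve(query: str, k: int = 1) -> list[str]:
    scores = doc_vecs @ embed(query)  # cosine similarity on unit vectors
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

print(retrieve("what is the cat named"))  # -> ["user's cat is named Miso"]
```

Whatever the new architecture actually does, the announcement positions it against exactly this retrieve-and-inject loop.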
Real-time analysis of public opinion and engagement
What the community is saying — both sides
Skeptics note that frequent implementations “don’t really solve the core problem,” prompting doubt about their long-term effectiveness.
Readers want to know how retrieval is being structured when the system explicitly avoids vector embeddings.
There is a straightforward demand: “Links to papers?” People want academic sources or technical details before buying the claims.
Supporters argue that if the approach works, value shifts from better answers to systems that remember, adapt, and evolve over time.
There is enthusiasm and hype: “Engramme cooking so hot right now” signals community buzz and high expectations.
Optimists like the idea that moving beyond RAG to model human‑like memory (context retention, recall, adaptation) could solve major limitations, but only if those capabilities are genuinely achieved.
Critics counter that storing and re-injecting user data from the cloud isn’t true memory; real memory is an internal, self-updating state, not external records pasted back into prompts (see the sketch after this list).
Others argue that confabulation is intrinsic to thinking; eliminating it entirely would produce a limited, non-creative system rather than a human-like mind.
Several point out that the presentation lacks clear mechanisms or evidence, making the claim hard to evaluate.
Some suspect the announcement reads like a paid partnership and treat the motivations and claims with caution.
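To make the last few critiques concrete, here is a minimal, hypothetical sketch of the two notions of "memory" being argued over: an external store that pastes records back into the prompt versus an internal state that rewrites itself on every input. Class names and the toy tanh update rule are illustrative assumptions, not anything published about LMMs.

```python
import numpy as np

class ExternalMemory:
    """'Database memory': store records, retrieve them, paste them into the prompt."""
    def __init__(self) -> None:
        self.records: list[str] = []

    def store(self, text: str) -> None:
        self.records.append(text)

    def build_prompt(self, query: str) -> str:
        context = "\n".join(self.records)  # stored text comes back verbatim
        return f"Context:\n{context}\n\nQuestion: {query}"

class InternalState:
    """Internal memory: a state vector that rewrites itself on each input."""
    def __init__(self, dim: int = 8, seed: int = 0) -> None:
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(dim, dim))  # toy update weights
        self.state = np.zeros(dim)

    def update(self, x: np.ndarray) -> None:
        # The state itself changes; the raw input is never stored anywhere.
        self.state = np.tanh(self.W @ self.state + x)

ext = ExternalMemory()
ext.store("user prefers dark mode")
print(ext.build_prompt("what theme?"))  # the record reappears word for word

mem = InternalState()
mem.update(np.ones(8))
print(mem.state)  # only a compressed trace of the input survives
```

The critics' point maps directly onto the code: the first pattern can only ever replay what was written down, while the second carries forward a trace that has genuinely changed the system.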
Most popular replies, ranked by engagement
How does it = zero hallucinations? Humans confabulate all the time. Confabulations are a default property of thinking as we know. A non-confabulating mind, human or AI = a limited-capacity computer, not a creative-thinking mind.
Links to papers?
Interesting? More like vague and opaque.
If this works, memory becomes the product. Not just better answers, but systems that actually remember, adapt, and evolve over time.
That’s a bold shift. If they can truly move beyond RAG and model something closer to human-like memory—context retention, recall, and adaptation—that could solve a lot of current limitations.
If your “persistent memory” requires storing user data in the cloud and retrieving it later, that’s not memory, that’s a database. Real memory is internal state that updates itself, not external storage injected back into a prompt.