EverMind, a technology incubation project led by the research team of a prominent global investment firm founded by billionaire Chen Tianqiao, has unveiled an open-source long-term memory system designed to serve as the foundational data infrastructure for future artificial intelligence agents.
The team says the system mimics the human brain's memory processes: sensory encoding, hippocampal indexing, cortical long-term storage, and the collaboration between the prefrontal cortex and hippocampus. Its core concept is to be "brain-like," allowing AI to think, remember, and develop in ways similar to humans.
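As an illustration only, the encode, index, and store stages described above could be sketched as a layered memory structure. The names below (`LayeredMemory`, `encode`, `consolidate`, `recall`) are hypothetical and do not reflect EverMind's actual implementation:

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class MemoryItem:
    content: str
    keywords: set


class LayeredMemory:
    """Hypothetical sketch of a brain-inspired memory pipeline:
    a short-term buffer (sensory encoding), a keyword index
    (hippocampal indexing), and a long-term store (cortical storage)."""

    def __init__(self, buffer_size: int = 3):
        self.short_term = deque(maxlen=buffer_size)  # holds only recent items
        self.long_term = []                          # permanent store
        self.index = {}                              # keyword -> item positions

    def encode(self, content: str) -> None:
        """Sensory encoding: place new input in the short-term buffer."""
        self.short_term.append(MemoryItem(content, set(content.lower().split())))

    def consolidate(self) -> None:
        """Move buffered items into long-term storage, building a keyword index."""
        while self.short_term:
            item = self.short_term.popleft()
            pos = len(self.long_term)
            self.long_term.append(item)
            for kw in item.keywords:
                self.index.setdefault(kw, []).append(pos)

    def recall(self, keyword: str) -> list:
        """Retrieve long-term memories by indexed keyword."""
        return [self.long_term[i].content
                for i in self.index.get(keyword.lower(), [])]


mem = LayeredMemory()
mem.encode("user prefers concise answers")
mem.encode("project deadline is Friday")
mem.consolidate()
print(mem.recall("deadline"))  # -> ['project deadline is Friday']
```

The separation between a small rolling buffer and an indexed permanent store loosely parallels the hippocampus-to-cortex consolidation the article describes, though a real agent-memory system would use embeddings and vector search rather than keyword matching.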
Major AI applications such as Claude and ChatGPT already incorporate long-term memory as a strategic feature. It is viewed as a critical capability for future AI innovation, one that shifts programs from passive responders and mere tools into proactive, self-evolving intelligent agents.
In a recent WeChat video presentation, the founder elaborated on the idea of “discovery-based intelligence,” outlining two primary approaches: the “scale path” and the “structure path.”
The scale path hinges on the belief that parameters equate to knowledge and that intelligence results from sheer size. As models grow larger, given sufficient data and computational power, intelligence is expected to emerge naturally. This approach has already yielded remarkable results, including advances in protein prediction, compound creation, and contributions to scientific research.
Conversely, the structure path focuses on understanding the "cognitive anatomy" of intelligence. On this view, simply adding data and computing resources has limits and cannot by itself unlock true understanding and discovery, a realization that is driving a renaissance of structuralist thinking.
The founder emphasized that progress does not depend solely on additional graphics processing units. Instead, advancements require new theories, algorithms, and creative thinking. This interdisciplinary approach draws from neuroscience, information theory, physics, and cognitive psychology.
To support these efforts, the institute plans to invest over $1 billion in establishing a dedicated computing power cluster. Unlike initiatives aimed purely at scaling, this infrastructure aims to empower young scientists to experiment with novel structures, test memory mechanisms, explore causal architectures, and investigate new neural dynamics hypotheses.





