---
license: mit
---

# Echo: A Large Language Model with Temporal Episodic Memory

---

<p align="center">
  <img src="https://huggingface.co/front/assets/huggingface_logo-noborder.svg" width="120" alt="HuggingFace Logo"/>
</p>

---

<p align="center">
  <a href="https://github.com/1920993165/echo">🛠️ <b>Code</b></a> |
  <a href="https://huggingface.co/datasets/ALmonster/Echo-v1">📊 <b>Dataset</b></a> |
  <a href="https://huggingface.co/ALmonster/Echo1-7B">🧠 <b>7B Model</b></a> |
  <a href="https://huggingface.co/ALmonster/Echo1-72B">🧠 <b>72B Model</b></a>
</p>

---

## 👋 Welcome to Echo!

**Echo** is a large language model (LLM) designed with temporal episodic memory, enabling it to recall earlier interactions and reason about when they took place. We invite the community to explore, use, and cite our work!

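To try the model, the sketch below loads the 7B checkpoint with Hugging Face `transformers`. This is a minimal, unofficial example: it assumes the released checkpoints expose the standard causal-LM and chat-template interface, so double-check the prompt format against the files shipped with the model.

```python
# Minimal usage sketch, not an official recipe: it assumes the Echo
# checkpoints follow the standard transformers causal-LM +
# chat-template interface.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ALmonster/Echo1-7B"  # the larger variant: ALmonster/Echo1-72B

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the dtype stored in the checkpoint
    device_map="auto",    # requires the `accelerate` package
)

# A single-turn query; multi-session histories with timestamps are
# where temporal episodic memory is meant to help.
messages = [
    {"role": "user", "content": "Summarize what we agreed on in our chat last Tuesday."},
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The 72B checkpoint follows the same pattern, though at that size multi-GPU sharding or quantization will likely be needed.
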
---

## 📖 Citation

If you use Echo in your research or applications, please cite us:

```bibtex
@misc{liu2025echolargelanguagemodel,
      title={Echo: A Large Language Model with Temporal Episodic Memory},
      author={WenTao Liu and Ruohua Zhang and Aimin Zhou and Feng Gao and JiaLi Liu},
      year={2025},
      eprint={2502.16090},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2502.16090},
}
```

---

## 🚀 Get Involved

We welcome contributions, feedback, and collaboration from the community. Feel free to open issues or pull requests on our [GitHub](https://github.com/1920993165/echo)!

---

<p align="center">
  <b>Empowering AI with memory. Echo: Remember, Reason, Respond.</b>
</p>