Website: [liruiw.github.io/hpt](https://liruiw.github.io/hpt)
See the [HPT](https://github.com/liruiw/HPT-Pretrain) GitHub README and the [LeRobot](https://github.com/huggingface/lerobot) implementation for instructions on how to use this checkpoint for fine-tuning.
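As a quick start, the checkpoint files can be downloaded locally with `huggingface_hub` before following the fine-tuning instructions in the linked README. This is a minimal sketch only: the repo id `liruiw/hpt-base` is an illustrative assumption, so substitute the actual id of this checkpoint.

```python
from huggingface_hub import snapshot_download

# Fetch the pretrained HPT trunk weights from the Hugging Face Hub.
# NOTE: "liruiw/hpt-base" is a placeholder repo id used for illustration;
# replace it with the repo id of the checkpoint you want to fine-tune.
local_dir = snapshot_download(repo_id="liruiw/hpt-base")

# Fine-tuning itself (attaching embodiment-specific stems/heads and training
# on your own robot data) follows the HPT GitHub README linked above.
print(f"Checkpoint files downloaded to: {local_dir}")
```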
BibTeX:
@inproceedings{wang2024hpt,
  author={Lirui Wang and Xinlei Chen and Jialiang Zhao and Kaiming He and Russ Tedrake},
  title={Scaling Proprioceptive-Visual Learning with Heterogeneous Pre-trained Transformers},
  year={2024},
  eprint={2407.16677},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2407.16677}
}
Contact
Lirui Wang ([email protected])