DexTrack: Towards Generalizable Neural Tracking Control for Dexterous Manipulation from Human References
Abstract
We address the challenge of developing a generalizable neural tracking controller for dexterous manipulation from human references. This controller aims to manage a dexterous robot hand to manipulate diverse objects for various purposes defined by kinematic human-object interactions. Developing such a controller is complicated by the intricate contact dynamics of dexterous manipulation and the need for adaptivity, generalizability, and robustness. Current reinforcement learning and trajectory optimization methods often fall short due to their dependence on task-specific rewards or precise system models. We introduce an approach that curates large-scale successful robot tracking demonstrations, comprising pairs of human references and robot actions, to train a neural controller. Utilizing a data flywheel, we iteratively enhance the controller's performance as well as the number and quality of successful tracking demonstrations. We exploit available tracking demonstrations and carefully integrate reinforcement learning and imitation learning to boost the controller's performance in dynamic environments. At the same time, to obtain high-quality tracking demonstrations, we individually optimize per-trajectory tracking by leveraging the learned tracking controller in a homotopy optimization method. The homotopy optimization, mimicking chain-of-thought, aids in solving challenging trajectory tracking problems and increases demonstration diversity. We showcase our success by training a generalizable neural controller and evaluating it in both simulation and the real world. Our method achieves over a 10% improvement in success rates compared to leading baselines. The project website with animated results is available at https://meowuu7.github.io/DexTrack/.
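The data flywheel described above alternates between training the controller on the current pool of successful demonstrations and using the trained controller to mine new demonstrations via per-trajectory homotopy optimization. The toy sketch below illustrates that loop only; the stub "controller" (a scalar success probability) and all function names are hypothetical stand-ins, not the authors' implementation.

```python
import random

random.seed(0)

def train_controller(demos):
    # Hypothetical stand-in for RL + imitation learning: more successful
    # demonstrations yield a stronger controller, modeled here as a
    # higher per-trajectory tracking success probability.
    return min(0.9, 0.3 + 0.1 * len(demos))

def homotopy_optimize(skill, reference):
    # Hypothetical stand-in for per-trajectory homotopy optimization,
    # warm-started by the learned controller (higher skill -> more
    # references successfully tracked).
    return random.random() < skill

def data_flywheel(references, n_iter=3):
    demos = []  # pool of successful (reference, action) demonstrations
    for _ in range(n_iter):
        skill = train_controller(demos)  # retrain on the current pool
        for ref in references:
            # Try to crack each not-yet-solved reference; successes
            # become new demonstrations for the next training round.
            if ref not in demos and homotopy_optimize(skill, ref):
                demos.append(ref)
    return skill, demos

skill, demos = data_flywheel(list(range(10)))
```
With each iteration the demonstration pool grows, which in turn strengthens the controller, mirroring the mutual improvement of controller and data the abstract describes.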
Community
🤖🧐Can human data be leveraged to advance robots' dexterous manipulation capabilities?
#ICLR We introduce DexTrack🍀 (meowuu7.github.io/DexTrack/), a generalizable controller able to track diverse, complex, and even noisy human manipulation references, with transferability to real.
For details, please check our
Website: https://meowuu7.github.io/DexTrack/
Paper: https://openreview.net/pdf?id=ajSmXqgS24
Video: https://youtu.be/zru1Z-DaiWE
Code: https://github.com/Meowuu7/DexTrack
(Easy to set up and run; under 20 minutes)
This is an automated message from the Librarian Bot. The following similar papers were recommended by the Semantic Scholar API:
- Dexterous Manipulation Based on Prior Dexterous Grasp Pose Knowledge (2024)
- RobotMover: Learning to Move Large Objects by Imitating the Dynamic Chain (2025)
- Mimicking-Bench: A Benchmark for Generalizable Humanoid-Scene Interaction Learning via Human Mimicking (2024)
- Learning to Transfer Human Hand Skills for Robot Manipulations (2025)
- DexterityGen: Foundation Controller for Unprecedented Dexterity (2025)
- Human-Humanoid Robots Cross-Embodiment Behavior-Skill Transfer Using Decomposed Adversarial Learning from Demonstration (2024)
- COMBO-Grasp: Learning Constraint-Based Manipulation for Bimanual Occluded Grasping (2025)