Dexterous manipulation suffers from embodiment-specific dynamics: every new hand typically needs retuning or retraining. DexFormer is an embodiment-agnostic transformer policy that conditions on recent observation history to implicitly infer the hand's morphology and produce embodiment-appropriate actions, without embodiment identifiers or per-hand output heads.
**History-Conditioned.** Temporal tokens let the transformer infer hand dynamics online, avoiding explicit morphology codes.

**Shared Action Space.** A canonical finger embedding aligns actuators across hands while zero-padding missing joints.

**Scaling Across Morphologies.** Trained on 300 randomized embodiments, evaluated zero-shot on the standard LEAP, Allegro, and RAPID hands and 32 novel variants.
Canonical 20-D action space: identical anatomical joints map to fixed indices; lower-DoF hands zero-pad unused slots.
DexFormer unifies heterogeneous hands with a shared finger action space: joints with the same anatomical role (MCP abduction/flexion, PIP/DIP flexion, thumb CMC/IP) occupy fixed canonical indices. Each embodiment writes its native joint commands into those slots and zero-pads missing joints; masks ensure only valid dimensions are applied. This alignment lets one policy head emit a single 20-D finger command vector that works across LEAP, Allegro, RAPID, and their randomized variants.
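As a concrete illustration of this mapping, here is a minimal NumPy sketch; the slot indices, the 16-DoF example, and the function names are illustrative assumptions, not the paper's exact layout.

```python
import numpy as np

CANONICAL_DIM = 20  # shared finger action space

# Hypothetical slot assignment for a 16-DoF hand (e.g. a LEAP-like hand);
# the actual canonical ordering is defined by the paper, not here.
EXAMPLE_SLOTS = list(range(16))

def to_canonical(native_cmd: np.ndarray, slots: list[int]):
    """Scatter native joint commands into canonical slots, zero-padding the rest."""
    canon = np.zeros(CANONICAL_DIM)
    mask = np.zeros(CANONICAL_DIM, dtype=bool)
    canon[slots] = native_cmd
    mask[slots] = True  # only these dimensions are valid for this embodiment
    return canon, mask

def from_canonical(canon_cmd: np.ndarray, slots: list[int]) -> np.ndarray:
    """Gather the policy's canonical output back into native joint targets."""
    return canon_cmd[slots]
```

The mask is what lets a single policy head emit all 20 dimensions while each hand applies only the slots it actually owns.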
The policy consumes a fixed horizon of observation history and passes it through a causal transformer encoder. A final MLP head outputs the shared 20-D action; embodiment masks and temporal smoothing translate it into native joint targets.
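A minimal PyTorch sketch of that pipeline follows; all sizes, layer counts, and names (`HistoryPolicy`, `horizon`, etc.) are placeholder assumptions rather than the paper's configuration, and action smoothing is omitted.

```python
import torch
import torch.nn as nn

class HistoryPolicy(nn.Module):
    """Causal transformer over a fixed window of observation tokens."""
    def __init__(self, obs_dim: int, act_dim: int = 20, d_model: int = 256,
                 n_layers: int = 4, n_heads: int = 4, horizon: int = 16):
        super().__init__()
        self.embed = nn.Linear(obs_dim, d_model)
        self.pos = nn.Parameter(torch.zeros(horizon, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, act_dim)  # shared 20-D finger command

    def forward(self, obs_hist: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # obs_hist: (B, T, obs_dim); mask: (B, 20) valid-joint mask
        T = obs_hist.shape[1]
        x = self.embed(obs_hist) + self.pos[:T]
        causal = nn.Transformer.generate_square_subsequent_mask(T).to(x.device)
        h = self.encoder(x, mask=causal)
        action = self.head(h[:, -1])     # act on the latest history token
        return action * mask.float()    # zero out joints this hand lacks
```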
| Hand | Setting | LSTM | GRU | Ours |
|---|---|---|---|---|
| LEAP | canonical | 66.81 | 58.91 | 83.25 |
| LEAP | 32 variants | 66.72 | 57.38 | 86.84 |
| Allegro | canonical | 65.38 | 25.81 | 74.19 |
| Allegro | 32 variants | 62.44 | 24.97 | 71.94 |
| RAPID | canonical | 46.72 | 45.06 | 71.69 |
| RAPID | 32 variants | 44.22 | 53.59 | 77.09 |
| Average | combined | 58.72 | 44.29 | 77.50 |
Preprint PDF: dexformer.pdf
@article{dexformer2026,
  title={DexFormer: Cross-Embodied Dexterous Manipulation via History-Conditioned Transformer},
  author={Ke Zhang and Lixin Xu and Chengyi Song and Junzhe Xu and Xiaoyi Lin and Zeyu Jiang and Renjing Xu},
  journal={preprint},
  year={2026}
}
Questions? Drop a note in the repo or contact the paper authors.