# GEAR-SONIC: Supersizing Motion Tracking for Natural Humanoid Whole-Body Control
## Model Description
SONIC (Supersizing Motion Tracking) is a humanoid behavior foundation model developed by NVIDIA that gives robots a core set of motor skills learned from large-scale human motion data. Rather than building separate controllers for predefined motions, SONIC uses motion tracking as a scalable training task, enabling a single unified policy to produce natural, whole-body movement and support a wide range of behaviors.
## Key Features
- 🤖 Unified Whole-Body Control: Single policy handles walking, running, crawling, jumping, manipulation, and more
- 🎯 Motion Tracking: Trained on large-scale human motion data for natural movements
- 🎮 Real-Time Teleoperation: VR-based whole-body teleoperation via PICO headset
- 🚀 Hardware Deployment: C++ inference stack for real-time control on humanoid robots
- 🎨 Kinematic Planner: Real-time locomotion generation with multiple movement styles
- 🕹️ Multi-Modal Control: Supports keyboard, gamepad, VR, and high-level planning
## VR Whole-Body Teleoperation

SONIC supports real-time whole-body teleoperation via a PICO VR headset, enabling natural human-to-robot motion transfer for data collection and interactive control.
*Demo videos (see the project website): Walking, Running, Sideways Movement, Kneeling, Getting Up, Jumping, Bimanual Manipulation, Object Hand-off.*
## Kinematic Planner

SONIC includes a kinematic planner for real-time locomotion generation: choose a movement style, steer with keyboard or gamepad, and adjust speed and height on the fly. A sketch of this control interface follows the demo list below.
*Demo videos (see the project website): In-the-Wild Navigation, Run, Happy, Stealth, Injured, Kneeling, Hand Crawling, Elbow Crawling, Boxing.*
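To make the control interface above concrete, here is a minimal sketch of how style, speed, and height commands might be represented and steered from a keyboard. The `PlannerCommand` type, its field names, and the key bindings are hypothetical illustrations, not the released API; see the keyboard and gamepad tutorials for the actual interface.

```python
from dataclasses import dataclass

@dataclass
class PlannerCommand:
    """Hypothetical planner command; real inputs are defined by planner_sonic.onnx."""
    style: str = "run"          # e.g. "run", "happy", "stealth", "injured", ...
    forward_speed: float = 0.0  # m/s, adjusted on the fly
    turn_rate: float = 0.0      # rad/s, steered by keyboard/gamepad
    base_height: float = 0.9    # target root height in meters

def apply_key(cmd: PlannerCommand, key: str) -> PlannerCommand:
    """Map a few example key presses onto command updates (illustrative bindings)."""
    if key == "w":
        cmd.forward_speed += 0.1
    elif key == "s":
        cmd.forward_speed -= 0.1
    elif key in ("a", "d"):
        cmd.turn_rate += 0.1 if key == "a" else -0.1
    elif key == "q":
        cmd.base_height -= 0.05  # crouch lower, e.g. toward kneeling styles
    return cmd
```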
## Quick Start

📖 See the Quick Start Guide for step-by-step instructions on:
- Installation and setup
- Running SONIC with different control modes (keyboard, gamepad, VR)
- Deploying on real hardware
- Using the kinematic planner
Key Resources:
- Installation Guide - Complete setup instructions
- Keyboard Control Tutorial - Get started with keyboard control
- Gamepad Control Tutorial - Set up gamepad control
- VR Teleoperation Setup - Full-body VR control
## Model Checkpoints
All checkpoints (ONNX format) are available directly in this repository. Inference is powered by TensorRT and runs on both desktop and Jetson hardware.
| Checkpoint | File | Description |
|---|---|---|
| Policy encoder | `model_encoder.onnx` | Encodes the motion reference into a latent |
| Policy decoder | `model_decoder.onnx` | Decodes the latent into joint actions |
| Kinematic planner | `planner_sonic.onnx` | Real-time locomotion style planner |
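After downloading the checkpoints (see below), you can sanity-check their input/output signatures before wiring them into the TensorRT stack. A minimal sketch using `onnxruntime` (using onnxruntime here is our assumption for quick offline inspection; the released stack runs TensorRT, and the tensor names and shapes are whatever the exported models declare):

```python
import onnxruntime as ort

# Print the declared I/O signature of each released checkpoint.
for path in ("model_encoder.onnx", "model_decoder.onnx", "planner_sonic.onnx"):
    sess = ort.InferenceSession(path, providers=["CPUExecutionProvider"])
    print(f"== {path} ==")
    for inp in sess.get_inputs():
        print("  input :", inp.name, inp.shape, inp.type)
    for out in sess.get_outputs():
        print("  output:", out.name, out.shape, out.type)
```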
Quick download (requires `pip install huggingface_hub`):

```python
from huggingface_hub import snapshot_download

snapshot_download(repo_id="nvidia/GEAR-SONIC", local_dir="gear_sonic_deploy")
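```

This mirrors the Hub repository into a local `gear_sonic_deploy/` folder, matching the name of the C++ deployment directory in the repository structure below.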
Or use the download script from the GitHub repo:

```bash
python download_from_hf.py              # policy + planner (default)
python download_from_hf.py --no-planner # policy only
```
See the Download Models guide for full instructions.
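Because on-robot inference is TensorRT-powered, you will likely build TensorRT engines from the ONNX files at some point. A minimal sketch that shells out to NVIDIA's `trtexec` tool (`--onnx` and `--saveEngine` are standard trtexec flags, but the output paths here are illustrative and the deployment stack may build engines itself; check the deployment docs first):

```python
import subprocess

# Build a TensorRT engine from each ONNX checkpoint via trtexec.
# Paths assume the checkpoints were downloaded into gear_sonic_deploy/.
for name in ("model_encoder", "model_decoder", "planner_sonic"):
    subprocess.run(
        [
            "trtexec",
            f"--onnx=gear_sonic_deploy/{name}.onnx",
            f"--saveEngine=gear_sonic_deploy/{name}.plan",
        ],
        check=True,
    )
```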
## Documentation

📚 See the Full Documentation site for all guides and tutorials.
## Repository Structure

```
GR00T-WholeBodyControl/
├── gear_sonic_deploy/   # C++ inference stack for deployment
├── gear_sonic/          # Teleoperation and data collection tools
├── decoupled_wbc/       # Decoupled WBC (GR00T N1.5/N1.6)
├── docs/                # Documentation source
└── media/               # Videos and images
```
## Related Projects
This repository is part of NVIDIA's GR00T (Generalist Robot 00 Technology) initiative:
- GR00T N1.5: Previous generation decoupled controller
- GR00T N1.6: Improved decoupled WBC approach
- GEAR-SONIC Website: Project page with videos and details
## Citation

If you use GEAR-SONIC in your research, please cite:

```bibtex
@article{luo2025sonic,
  title={SONIC: Supersizing Motion Tracking for Natural Humanoid Whole-Body Control},
  author={Luo, Zhengyi and Yuan, Ye and Wang, Tingwu and Li, Chenran and Chen, Sirui and Casta{\~n}eda, Fernando and Cao, Zi-Ang and Li, Jiefeng and Minor, David and Ben, Qingwei and Da, Xingye and Ding, Runyu and Hogg, Cyrus and Song, Lina and Lim, Edy and Jeong, Eugene and He, Tairan and Xue, Haoru and Xiao, Wenli and Wang, Zi and Yuen, Simon and Kautz, Jan and Chang, Yan and Iqbal, Umar and Fan, Linxi and Zhu, Yuke},
  journal={arXiv preprint arXiv:2511.07820},
  year={2025}
}
```
## License
This project uses dual licensing:
- Source Code: Apache License 2.0 - applies to all code, scripts, and software components
- Model Weights: NVIDIA Open Model License - applies to all trained model checkpoints
Key points of the NVIDIA Open Model License:
- ✅ Commercial use permitted with attribution
- ✅ Modification and distribution allowed
- ⚠️ Must comply with NVIDIA's Trustworthy AI terms
- ⚠️ Model outputs subject to responsible use guidelines
See LICENSE for complete terms.
## Support & Contact

- 📧 Email: [email protected]
- 🐛 Issues: GitHub Issues
- 📖 Documentation: https://nvlabs.github.io/GR00T-WholeBodyControl/
- 🌐 Website: https://nvlabs.github.io/GEAR-SONIC/
## Acknowledgments
This work builds upon and acknowledges:
- Beyond Mimic - Whole-body tracking foundation
- Isaac Lab - Robot learning framework
- NVIDIA Research GEAR Lab team
- All contributors and collaborators
## Model Card Contact
For questions about this model card or responsible AI considerations, contact: [email protected]