# GEAR-SONIC
21st February 2026
GEAR-SONIC (Supersizing Motion Tracking for Natural Humanoid Whole-Body Control) is the major upgrade over the Decoupled WBC approach in the GR00T stack. It is a fundamentally different approach to humanoid control: a single unified whole-body policy trained on large-scale human motion data rather than hand-crafted reward functions.
## Decoupled WBC vs SONIC
| Aspect | Decoupled WBC | SONIC |
|---|---|---|
| Controls | Legs only (15 joints), arms locked | Whole body — all joints including arms |
| Trained on | RL reward functions in Isaac Sim | Large-scale human motion data |
| Input | Simple commands (velocity, height) | Full-body motion references from video/mocap |
| Architecture | Separate lower body RL + upper body IK | Single unified policy |
| Used in | GR00T N1.5 / N1.6 | Latest generation |
The Decoupled WBC splits the problem: one RL policy handles legs, inverse kinematics handles arms, and they’re stitched together. SONIC replaces all of that with a single neural network that controls every joint simultaneously, trained by watching how humans move.
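The contrast in control flow can be sketched in a few lines. This is purely illustrative, assuming nothing about the real SONIC interfaces; the function names, the 15-joint leg count from the table above, and the 14-joint arm count are placeholders:

```python
def decoupled_step(cmd_vel, arm_target_pose, leg_policy, arm_ik):
    """Decoupled WBC sketch: two independent modules, stitched together.

    leg_policy: RL policy mapping a velocity command to 15 lower-body
    joint targets. arm_ik: IK solver mapping an end-effector pose to
    upper-body joint targets. Neither module sees the other's output.
    """
    leg_targets = leg_policy(cmd_vel)        # lower body: RL
    arm_targets = arm_ik(arm_target_pose)    # upper body: IK
    return leg_targets + arm_targets         # concatenated joint command

def unified_step(motion_reference, robot_state, policy):
    """SONIC-style sketch: one network sees the full-body motion
    reference plus the robot's state, and outputs every joint target
    at once, so arms and legs can coordinate."""
    return policy(motion_reference + robot_state)
```

In the decoupled version, nothing couples the two halves: the legs cannot react to what the arms are doing, which is exactly the limitation the unified policy removes.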
## What Makes SONIC Different
The core idea: instead of designing reward functions that describe what “good walking” looks like, SONIC learns directly from large-scale human motion capture data. The policy takes a full-body motion reference as input and tracks it with the robot’s body — hence “Supersizing Motion Tracking.”
This means the robot can do things that are nearly impossible to specify with reward functions:
- Natural arm swing while walking
- Coordinated whole-body gestures
- Human-like weight shifting and balance recovery
- Smooth transitions between different movement types
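The "motion tracking" objective itself is simple to state. A minimal sketch, in the style of DeepMimic-like tracking rewards (the paper's actual reward formulation is not reproduced here, and `sigma` is an illustrative tolerance, not a SONIC hyperparameter):

```python
import math

def tracking_reward(robot_joints, ref_joints, sigma=0.5):
    """Pose-tracking reward sketch: instead of hand-designed terms
    ("penalize foot slip", "reward upright torso"), the reward is just
    how closely the robot's pose matches the reference frame from the
    human motion data. Perfect tracking -> 1.0, decaying with error."""
    err = sum((q - q_ref) ** 2 for q, q_ref in zip(robot_joints, ref_joints))
    return math.exp(-err / (2 * sigma ** 2))
```

Behaviors like natural arm swing then emerge from the data: if the reference motions swing their arms, maximizing this reward reproduces the swing, with no term for it ever being written down.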
## Repository Structure
The SONIC codebase is organized into distinct stages:
```
GR00T-WholeBodyControl/
├── sonic_policy/    → policy training (Isaac Sim + RL)
├── sonic_retarget/  → human motion → robot motion conversion
└── sonic_deploy/    → C++ inference for real hardware deployment
```
sonic_retarget is the bridge between human motion data and robot-compatible references — it handles the kinematic mapping from a human skeleton to the Unitree G1’s joint configuration. sonic_policy trains the tracking policy in Isaac Sim. sonic_deploy packages it for real-time C++ inference on the actual robot.
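The core of that kinematic mapping can be sketched as a joint-correspondence table plus the robot's joint limits. This is a deliberately minimal illustration, not the sonic_retarget implementation; all names and limits below are hypothetical:

```python
def retarget_frame(human_angles, joint_map, limits):
    """Minimal retargeting sketch: copy each mapped human joint angle
    onto the corresponding robot joint, clamped to that joint's limits.
    Real retargeting must also handle differing bone lengths and
    typically optimizes end-effector positions, which this skips.

    human_angles: {human_joint_name: angle}
    joint_map:    {robot_joint_name: human_joint_name}
    limits:       {robot_joint_name: (lo, hi)}
    """
    robot = {}
    for robot_joint, human_joint in joint_map.items():
        lo, hi = limits[robot_joint]
        robot[robot_joint] = min(max(human_angles[human_joint], lo), hi)
    return robot
```

The clamp matters because a human shoulder has more range than a typical humanoid's; a reference frame outside the G1's limits must be projected back into the feasible set before the policy can track it.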
## Links
- Paper: arxiv.org/abs/2511.07820
- Model weights: huggingface.co/nvidia/GEAR-SONIC
- Docs: nvlabs.github.io/GR00T-WholeBodyControl