Akshay Parkhi's Weblog


GEAR-SONIC

21st February 2026

GEAR-SONIC (Supersizing Motion Tracking for Natural Humanoid Whole-Body Control) is the major upgrade over the Decoupled WBC approach in the GR00T stack. It takes a fundamentally different route to humanoid control: a single unified whole-body policy, trained on large-scale human motion data rather than hand-crafted reward functions.

Decoupled WBC vs SONIC

Aspect        | Decoupled WBC                           | SONIC
------------- | --------------------------------------- | --------------------------------------------
Controls      | Legs only (15 joints), arms locked      | Whole body: all joints, including arms
Trained on    | RL reward functions in Isaac Sim        | Large-scale human motion data
Input         | Simple commands (velocity, height)      | Full-body motion references from video/mocap
Architecture  | Separate lower-body RL + upper-body IK  | Single unified policy
Used in       | GR00T N1.5 / N1.6                       | Latest generation

The Decoupled WBC splits the problem: one RL policy handles legs, inverse kinematics handles arms, and they’re stitched together. SONIC replaces all of that with a single neural network that controls every joint simultaneously, trained by watching how humans move.
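To make the "stitched together" structure concrete, here is a minimal sketch of one control step of a decoupled WBC. All names (`lower_policy`, `upper_ik`, the command layout, the 15/14 joint split) are assumptions for illustration, not the actual GR00T interfaces:

```python
import numpy as np

def decoupled_wbc_step(lower_policy, upper_ik, velocity_cmd, base_height_cmd,
                       hand_targets, robot_state):
    """Hypothetical sketch of a decoupled whole-body controller step.

    The lower-body RL policy maps simple commands plus proprioception to
    leg joint targets; inverse kinematics independently solves arm joints
    for the requested hand poses. The two outputs are just concatenated.
    """
    # RL policy controls the leg joints from velocity-style commands
    leg_targets = lower_policy(np.concatenate([velocity_cmd,
                                               [base_height_cmd],
                                               robot_state]))
    # IK solves arm joints on its own; it never sees the leg policy
    arm_targets = upper_ik(hand_targets)
    # The two solutions are stitched into one whole-body command
    return np.concatenate([leg_targets, arm_targets])
```

The key point the sketch exposes: the two halves share no state, so coordinated motions like natural arm swing during walking have to be scripted on top rather than emerging from the controller.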

What Makes SONIC Different

The core idea: instead of designing reward functions that describe what “good walking” looks like, SONIC learns directly from large-scale human motion capture data. The policy takes a full-body motion reference as input and tracks it with the robot’s body — hence “Supersizing Motion Tracking.”
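A minimal way to see the difference is in the reward. Instead of a sum of hand-designed gait terms, a motion-tracking policy is rewarded for matching a reference pose. The sketch below uses a DeepMimic-style exponentiated tracking error; the exact reward SONIC uses is not specified here, so treat the form and `sigma` as assumptions:

```python
import numpy as np

def tracking_reward(ref_joint_pos, robot_joint_pos, sigma=0.5):
    """Hypothetical motion-tracking reward.

    Rewards the policy for matching the reference joint positions:
    exp of the negative squared joint-position error, a common form
    in motion-imitation RL. Perfect tracking returns 1.0; the reward
    decays smoothly as the robot drifts from the reference.
    """
    err = np.sum((ref_joint_pos - robot_joint_pos) ** 2)
    return float(np.exp(-err / (2 * sigma ** 2)))
```

"Good walking" never has to be written down; it is implicit in the human reference motions being tracked.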

This means the robot can do things that are nearly impossible to specify with reward functions:

  • Natural arm swing while walking
  • Coordinated whole-body gestures
  • Human-like weight shifting and balance recovery
  • Smooth transitions between different movement types

Repository Structure

The SONIC codebase is organized into distinct stages:

GR00T-WholeBodyControl/
├── sonic_policy/    → Policy training (Isaac Sim + RL)
├── sonic_retarget/  → Human motion → robot motion conversion
└── sonic_deploy/    → C++ inference for real hardware deployment

sonic_retarget is the bridge between human motion data and robot-compatible references — it handles the kinematic mapping from a human skeleton to the Unitree G1’s joint configuration. sonic_policy trains the tracking policy in Isaac Sim. sonic_deploy packages it for real-time C++ inference on the actual robot.
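The simplest possible picture of what retargeting does is a per-joint correspondence plus joint-limit clamping; real pipelines like sonic_retarget solve it as an optimization over the robot's full kinematics. Everything in this sketch (the joint names, the limits) is made up for illustration:

```python
# Hypothetical retargeting sketch: map one frame of human mocap joint
# angles onto a robot's joint configuration.
HUMAN_TO_ROBOT = {  # assumed human-to-robot joint-name correspondence
    "left_hip_pitch": "left_hip_pitch_joint",
    "left_knee": "left_knee_joint",
    "left_shoulder_pitch": "left_shoulder_pitch_joint",
}
ROBOT_LIMITS = {  # assumed joint limits in radians, not real G1 values
    "left_hip_pitch_joint": (-2.5, 2.8),
    "left_knee_joint": (-0.1, 2.9),
    "left_shoulder_pitch_joint": (-3.0, 2.6),
}

def retarget_frame(human_angles: dict) -> dict:
    """Map human joint angles to robot joint targets, clamped to limits."""
    robot_angles = {}
    for human_joint, robot_joint in HUMAN_TO_ROBOT.items():
        angle = human_angles.get(human_joint, 0.0)
        lo, hi = ROBOT_LIMITS[robot_joint]
        robot_angles[robot_joint] = min(max(angle, lo), hi)
    return robot_angles
```

Run per frame over a mocap clip, this produces the robot-compatible reference trajectory that the tracking policy then learns to follow.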

Links