
Bridging Human and Humanoid Movement: A New Approach to Motion Retargeting

TL;DR: Researchers have developed Implicit Kinodynamic Motion Retargeting (IKMR), a framework that efficiently and scalably converts human motion into physically realistic movements for humanoid robots. By combining kinematic feature learning with dynamics-aware fine-tuning, IKMR enables real-time motion transfer, making it easier to train robots on diverse skills and helping them imitate human actions smoothly and robustly, even from noisy input data.

Teaching humanoid robots to perform complex human-like movements has long been a significant challenge in robotics. The goal of human-to-humanoid imitation learning is to enable robots to learn diverse skills by observing and replicating human motion. However, a crucial hurdle is efficiently and accurately converting human movements into trajectories that a robot can physically execute. Traditional methods often process motion frame by frame, which is slow and doesn’t always account for the robot’s physical limitations, leading to unrealistic movements.

Addressing these limitations, researchers from Xiaomi Robotics Lab, ETH Zurich, and Zhejiang University have introduced a novel framework called Implicit Kinodynamic Motion Retargeting (IKMR). This new approach is designed to be highly efficient and scalable, capable of directly transforming large-scale human motion into robot-executable movements. What makes IKMR stand out is its comprehensive consideration of both kinematics (the geometry of motion) and dynamics (the forces and physics involved).

How IKMR Works

IKMR operates in two main stages: kinematics-aware pretraining and dynamics-aware fine-tuning.

In the kinematics-aware pretraining phase, IKMR learns a shared understanding of motion between humans and humanoid robots. Even though humans and robots have different numbers of joints and bone lengths, they share a similar underlying structure or “topology.” IKMR uses a special neural network architecture, a dual encoder-decoder, to extract and represent these common motion features. It essentially learns to map human motion patterns to robot motion patterns at a fundamental level, ensuring that the essence of the movement is preserved.
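The dual encoder-decoder idea can be sketched in a few lines of NumPy. This toy version is untrained, and all dimensions (a 66-dimensional human pose, a 29-dimensional robot pose, a 64-dimensional shared latent) are illustrative assumptions rather than the paper's actual architecture; it simply shows how a human trajectory is encoded into the shared latent space and decoded with the robot-side decoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions (not from the paper): 22 human joints x 3 angles,
# 29 robot degrees of freedom, a 64-d shared latent space.
HUMAN_DIM, ROBOT_DIM, LATENT_DIM = 22 * 3, 29, 64

def linear(in_dim, out_dim):
    """A single linear layer with small random weights (stand-in for a trained net)."""
    return rng.normal(scale=0.1, size=(in_dim, out_dim)), np.zeros(out_dim)

# Dual encoder-decoder: one encoder/decoder pair per embodiment,
# both meeting in the same latent space. The human decoder would be
# used for reconstruction losses during training.
enc_human = linear(HUMAN_DIM, LATENT_DIM)
dec_human = linear(LATENT_DIM, HUMAN_DIM)
enc_robot = linear(ROBOT_DIM, LATENT_DIM)
dec_robot = linear(LATENT_DIM, ROBOT_DIM)

def apply(layer, x):
    W, b = layer
    return np.tanh(x @ W + b)

def retarget(human_motion):
    """Map a (T, HUMAN_DIM) human trajectory to a (T, ROBOT_DIM) robot one
    by encoding into the shared latent and decoding on the robot side."""
    z = apply(enc_human, human_motion)
    return apply(dec_robot, z)

human_clip = rng.normal(size=(120, HUMAN_DIM))  # 120 frames of fake mocap
robot_clip = retarget(human_clip)
print(robot_clip.shape)  # (120, 29)
```

Because whole trajectories pass through the network as batched matrix multiplications, many clips can be retargeted in parallel rather than frame by frame.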

Following this, the dynamics-aware fine-tuning stage refines the motion to ensure it’s physically possible for the robot. After the initial kinematic conversion, the system integrates imitation learning within a simulator. Here, the robot practices the retargeted motions, and the system learns to adjust them to be smooth, stable, and dynamically feasible. This stage helps eliminate unrealistic movements like slipping, floating, or sudden, jerky motions that might arise from purely kinematic retargeting. It also focuses on precise contact positions for hands and feet, which is crucial for stable robot interaction with its environment.
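In code, the kind of objective this fine-tuning stage optimizes might look like the following sketch. The term names, weights, and array shapes here are illustrative assumptions, not taken from the paper: it rewards tracking the kinematically retargeted joints while penalizing joint jerk (for smoothness) and foot sliding during contact (for stable ground interaction):

```python
import numpy as np

def finetune_reward(q_ref, q_sim, foot_vel, foot_contact, dt=0.02):
    """Illustrative dynamics-aware reward (weights and terms are assumptions):
    - track the kinematically retargeted joint trajectory q_ref, shape (T, D)
    - penalize joint jerk (third finite difference) for smoothness
    - penalize foot velocity while the foot is flagged as in contact."""
    track = -np.mean((q_sim - q_ref) ** 2)
    jerk = np.diff(q_sim, n=3, axis=0) / dt**3
    smooth = -1e-7 * np.mean(jerk ** 2)          # illustrative weight
    slip = -np.mean((foot_vel * foot_contact[:, None]) ** 2)
    return track + smooth + slip

# Fake rollout data: a simulated trajectory that noisily tracks the reference.
rng = np.random.default_rng(0)
q_ref = rng.normal(size=(100, 29))
q_sim = q_ref + 0.05 * rng.normal(size=(100, 29))
foot_vel = rng.normal(size=(100, 2))
foot_contact = (rng.random(100) > 0.5).astype(float)
r = finetune_reward(q_ref, q_sim, foot_vel, foot_contact)
```

In the actual pipeline such a reward would drive reinforcement-style imitation learning in the simulator; this sketch only shows the shape of the trade-off between tracking fidelity and physical plausibility.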


Key Advantages and Experimental Results

The researchers conducted extensive experiments using a full-size humanoid robot, the Unitree G1, both in simulation and in the real world, to validate IKMR’s effectiveness. The results highlight several significant advantages:

IKMR achieves an impressive retargeting speed of 5000 frames per second (fps), making it nearly 100 times faster than many traditional numerical optimization techniques. This speed, combined with its ability to process multiple trajectories in parallel, makes it highly scalable for large datasets.

By incorporating dynamics-aware fine-tuning, IKMR produces significantly smoother and more coordinated movements. It reduces undesirable acceleration and jerk in the robot’s joints, leading to more human-like and stable actions.
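Acceleration and jerk can be quantified with simple finite differences over a joint trajectory. This hedged sketch (the paper's exact metrics may differ) shows how such a smoothness comparison works, and why jittery input scores far worse:

```python
import numpy as np

def smoothness_metrics(q, dt=0.02):
    """Mean squared acceleration and jerk of a joint trajectory q of shape
    (T, num_joints), via finite differences. dt = 0.02 assumes a 50 Hz step."""
    acc = np.diff(q, n=2, axis=0) / dt**2
    jerk = np.diff(q, n=3, axis=0) / dt**3
    return np.mean(acc**2), np.mean(jerk**2)

# A smooth sine motion versus the same motion with small added jitter.
t = np.linspace(0, 2 * np.pi, 200)[:, None]
smooth_traj = np.sin(t)
noisy_traj = smooth_traj + 0.01 * np.random.default_rng(0).normal(size=smooth_traj.shape)

a_smooth, j_smooth = smoothness_metrics(smooth_traj)
a_noisy, j_noisy = smoothness_metrics(noisy_traj)
# even millimeter-scale jitter inflates acceleration and jerk dramatically,
# because differentiation amplifies high-frequency noise
```

Higher-order differences amplify high-frequency noise, which is why a dynamics-aware stage that suppresses jerk yields visibly smoother robot motion.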

Human motion data, especially from video capture, can often be noisy. IKMR demonstrates remarkable robustness to such disturbances. It can generate relatively smooth and stable robot motions even when the source human motion contains jitters, which is a critical safety feature for real-time applications like teleoperation.

The framework was successfully deployed on the Unitree G1 robot, enabling it to accurately track and reproduce a variety of human movements, including swinging arms, leg kicks, walking with a human-like gait, and running backward. This demonstrates IKMR’s potential for directly training and deploying whole-body controllers for humanoid robots.

While IKMR represents a significant leap forward, the researchers note a current limitation: it can only convert sequences of a fixed length, meaning longer motions must be split and reassembled, potentially causing minor discontinuities. Future work aims to address this and further explore encoding more diverse robot types into a unified motion representation.
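One common way to handle a fixed-length constraint like this (an illustrative sketch, not the authors' implementation) is to process overlapping fixed-length windows and crossfade them at the seams. Here the per-window "retargeting" is just the identity, standing in for the network; window and overlap sizes are arbitrary assumptions:

```python
import numpy as np

def split_and_blend(motion, win=64, overlap=8):
    """Split a long (T, D) motion into fixed-length windows, process each
    window (identity here, standing in for the retargeting network), then
    crossfade overlapping regions to reduce seam discontinuities."""
    T, D = motion.shape
    out = np.zeros((T, D))
    weight = np.zeros((T, 1))
    # Per-window blending weights: ramp up at the start, down at the end,
    # never exactly zero so every frame keeps nonzero total weight.
    ramp = np.ones(win)
    ramp[:overlap] = np.linspace(0, 1, overlap + 1)[1:]
    ramp[-overlap:] = np.linspace(1, 0, overlap + 1)[:-1]
    start, step = 0, win - overlap
    while start < T:
        end = min(start + win, T)
        chunk = motion[start:end]          # retarget(chunk) in a real pipeline
        w = ramp[: end - start, None]
        out[start:end] += chunk * w
        weight[start:end] += w
        start += step
    return out / np.maximum(weight, 1e-8)

motion = np.random.default_rng(1).normal(size=(150, 6))
out = split_and_blend(motion)
# with identity processing, blending reconstructs the input exactly
```

The crossfade softens seams but cannot guarantee dynamically consistent transitions, which matches the minor discontinuities the authors report and motivates their planned variable-length extension.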

This research, detailed in the paper “Implicit Kinodynamic Motion Retargeting for Human-to-humanoid Imitation Learning”, paves the way for more natural, efficient, and robust human-to-humanoid motion transfer, bringing us closer to humanoid robots that can seamlessly mimic complex human behaviors.

Nikhil Patel
