Imad-Eddine NACIRI

Real-Time Human Pose Imitation on NAO Robot

Developed a real-time system that enables the NAO humanoid robot to imitate human body poses captured through a camera. The system uses MediaPipe for pose estimation and maps human joint angles to NAO motor commands in real time.

NAO Robot · Pose Estimation · MediaPipe · Computer Vision · AI · Robotics

Overview

This project creates a real-time human pose imitation system for the NAO humanoid robot. A camera captures the human operator’s movements, which are processed through a pose estimation pipeline and mapped to corresponding joint commands on the NAO robot, enabling it to mirror human motions in real time.

Key Features

  • Real-Time Pose Detection: Uses MediaPipe Pose for fast and accurate human body landmark detection
  • Joint Angle Mapping: Converts 3D human pose landmarks to NAO-compatible joint angles
  • Smooth Motion: Implements interpolation and filtering for natural robot movements
  • Full Upper Body: Maps arms, hands, and head movements
  • Low Latency: Optimized pipeline for minimal delay between human motion and robot response
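The smoothing mentioned above can be sketched as an exponential moving average over successive joint-angle targets. This is a minimal illustration, not the project's actual filter; the `JointFilter` class and the choice of `alpha` are assumptions for the example.

```python
import numpy as np

class JointFilter:
    """Exponential moving average over joint-angle vectors.

    alpha near 1.0 tracks the raw input quickly (less smoothing);
    alpha near 0.0 responds slowly (more smoothing, more lag).
    """

    def __init__(self, alpha: float = 0.5):
        self.alpha = alpha
        self._state = None  # last smoothed angle vector

    def update(self, angles) -> np.ndarray:
        angles = np.asarray(angles, dtype=float)
        if self._state is None:
            self._state = angles.copy()  # seed with the first measurement
        else:
            self._state = self.alpha * angles + (1 - self.alpha) * self._state
        return self._state

# A step input settles gradually toward the target instead of jumping.
f = JointFilter(alpha=0.5)
f.update([0.0])  # initialize at rest
out = [round(f.update([1.0])[0], 3) for _ in range(3)]
# out == [0.5, 0.75, 0.875]
```

The same state can also be linearly interpolated toward each new target at a fixed rate per control tick, which bounds joint velocity more directly than a filter constant does.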

System Architecture

  1. Camera Input: Captures video frames of the human operator
  2. Pose Estimation: MediaPipe extracts 33 body landmarks in 3D
  3. Angle Computation: Calculates joint angles from landmark positions using inverse kinematics
  4. NAO Control: Sends joint angle commands to NAO via NAOqi SDK
  5. Safety Limits: Enforces joint limits and smooth transitions
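Steps 3 and 5 can be illustrated with plain vector geometry: the elbow angle follows from the shoulder-to-elbow and wrist-to-elbow vectors, and the result is clamped to the joint's mechanical range before it is commanded. This is a hedged sketch, not the project's code; the helper names are ours, and the limit values only approximate NAO's documented RElbowRoll range and should be checked against the NAO documentation.

```python
import numpy as np

def angle_between(a, b, c) -> float:
    """Angle at point b (radians) formed by segments b->a and b->c."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))  # clip guards rounding

def clamp(value: float, lo: float, hi: float) -> float:
    """Enforce a joint's limits before sending the command."""
    return max(lo, min(hi, value))

# Toy 3D landmarks for shoulder, elbow, wrist (an L-shaped arm).
shoulder, elbow, wrist = [0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.3, -0.3, 0.0]
elbow_angle = angle_between(shoulder, elbow, wrist)  # pi/2 for this pose

# Convert the anatomical angle to an elbow-flexion command and clamp it
# to approximate RElbowRoll limits (assumed values, check NAO docs).
RELBOW_ROLL_LIMITS = (0.0349, 1.5446)  # radians
command = clamp(np.pi - elbow_angle, *RELBOW_ROLL_LIMITS)
```

On the robot, a clamped value like this would then be sent through NAOqi's `ALMotion.setAngles` call at a chosen fraction of maximum speed, which is what keeps the transitions smooth and within safe bounds.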

Technologies Used

  • Python
  • MediaPipe Pose
  • NAOqi SDK
  • OpenCV
  • NumPy
  • NAO V6 humanoid robot

Results

The system achieved responsive real-time imitation with the NAO robot accurately reproducing upper-body human poses, demonstrating potential applications in human-robot interaction, teleoperation, and robotics education.

Contributors

  • Imad-Eddine NACIRI