Human-Robot Interaction Module
OVERVIEW
The Human–Robot Interaction (HRI) module provides precise 3D human-motion measurements to support gesture modelling, humanoid learning, tele-operation research, and robot-interaction studies.
RayKnot captures natural human movement in 3D, giving robotics teams structured datasets they can integrate into their own modelling, simulation, and control frameworks.
Key Features
• Markerless 3D capture of full-body or targeted movements for HRI research
• Clean multi-view video and 3D trajectories for gesture and behaviour modelling
• Compatible with robotics pipelines (ROS, Python, PyTorch) for downstream processing (see the sketch after this list)
• Suitable for humanoid labs, tele-operation studies, and embodied-AI environments
• External sync support for pairing with sensors, haptic systems, and robot IO
• Portable setup for lab, simulation, and real-world demonstrations
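As a minimal sketch of the downstream-processing step mentioned above, the snippet below loads a hypothetical per-frame CSV export of 3D joint trajectories into a PyTorch tensor. The file name, column layout, units, and joint count are assumptions for illustration, not RayKnot's documented export format.

    # Minimal sketch: loading an exported 3D trajectory file into PyTorch.
    # File name and column layout are assumptions, not a documented format.
    import numpy as np
    import torch

    # Assumed export: one row per frame, columns = [t, j0_x, j0_y, j0_z, j1_x, ...]
    data = np.loadtxt("session_001_trajectories.csv", delimiter=",", skiprows=1)

    timestamps = data[:, 0]                         # seconds, one value per frame
    joints = data[:, 1:].reshape(len(data), -1, 3)  # (frames, joints, xyz), assumed metres

    # Convert to a tensor for gesture or imitation-learning models.
    motion = torch.from_numpy(joints).float()
    print(motion.shape)  # e.g. torch.Size([n_frames, n_joints, 3])

From here the tensor can be windowed, normalised, or fed into whatever gesture or imitation-learning model the team already uses.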
Applications
• Humanoid imitation learning and motion replication
• Tele-operation and human-in-the-loop robot control research
• Gesture-driven interaction and communication studies
• Testing and validation of robot perception and movement strategies