Thousands of datasets exist for a wide range of problem statements, yet few are suitable for training robots to communicate and interact with humans.
The Electronics and Telecommunications Research Institute (ETRI) in South Korea has developed AIR-ACT2ACT, a human-to-human interaction dataset for teaching non-verbal social behaviors to robots.
The dataset contains verbal and non-verbal interactions that will help robots comprehend human behavior and respond accordingly.
One hundred senior citizens and two college students were recruited to perform and record around ten interaction scenarios using Microsoft Kinect v2 cameras. The recordings include depth maps, body indexes, and 3-D skeletal data.
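To make the skeletal portion of such a recording concrete, here is a minimal Python sketch of one frame of 3-D skeletal data. It assumes only that Kinect v2 tracks 25 body joints per skeleton; the class and field names are illustrative, not the dataset's actual schema.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Kinect v2 tracks 25 body joints per skeleton.
KINECT_V2_JOINT_COUNT = 25

@dataclass
class SkeletonFrame:
    """One frame of 3-D skeletal data: an (x, y, z) position per joint,
    in camera coordinates. Names here are hypothetical, chosen only to
    illustrate the shape of the data."""
    timestamp_ms: int
    joints: List[Tuple[float, float, float]]  # 25 (x, y, z) tuples

    def joint(self, index: int) -> Tuple[float, float, float]:
        # Guard against out-of-range joint indices.
        if not 0 <= index < KINECT_V2_JOINT_COUNT:
            raise IndexError(f"Kinect v2 joints are 0..{KINECT_V2_JOINT_COUNT - 1}")
        return self.joints[index]

# Example: a placeholder pose with every joint at the origin.
frame = SkeletonFrame(timestamp_ms=0,
                      joints=[(0.0, 0.0, 0.0)] * KINECT_V2_JOINT_COUNT)
print(len(frame.joints))  # 25
```

A sequence of such frames per participant, paired with the depth maps and body indexes, is what an action recognition model would consume.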
The various interaction scenarios include how to react when the senior citizen:
- Enters the room
- Stands still without reason
- Calls the robot
- Looks at the robot
- Covers their face and cries
- Threatens to hit the robot
- Leaves the room
- Shakes the robot’s hand
- Raises their hand for a high-five
- Asks the robot to go away
The dataset will be useful for training robots to respond appropriately to the interactions of senior citizens. It could benefit fields such as healthcare and psychiatric care, and can be used both to teach social skills to robots and to benchmark action recognition algorithms.