AIR-Act2Act: A Dataset For Teaching Non-Verbal Social Behaviors To Robots

Thousands of datasets exist for a wide range of problem statements, yet few of them are suitable for training robots to communicate and interact with humans.

The Electronics and Telecommunications Research Institute (ETRI) in South Korea has developed AIR-Act2Act, a human-to-human interaction dataset for teaching non-verbal social behaviors to robots.

The dataset captures non-verbal interactions that help robots comprehend human behavior and respond accordingly.

100 senior citizens and 2 college students were recruited to perform and record ten interaction scenarios using Microsoft Kinect v2 cameras. Each recording includes depth maps, body indexes, and 3D skeletal data.
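To give a feel for working with the 3D skeletal data, here is a minimal Python sketch that loads skeleton frames into a NumPy array. It assumes each frame is stored as one whitespace-separated line of 25 joints x (x, y, z) coordinates; the file name and layout are illustrative assumptions, not the dataset's actual format, so consult the AIR-Act2Act GitHub repository for the official loaders.

```python
# Minimal sketch of reading Kinect v2 skeleton data.
# ASSUMPTION: each frame is one text line of 25 joints x 3 coordinates;
# this is NOT necessarily the dataset's real file format.
import numpy as np

NUM_JOINTS = 25  # Kinect v2 tracks 25 body joints

def load_skeleton(path: str) -> np.ndarray:
    """Return an array of shape (num_frames, NUM_JOINTS, 3)."""
    frames = []
    with open(path) as f:
        for line in f:
            values = [float(v) for v in line.split()]
            if len(values) == NUM_JOINTS * 3:
                frames.append(np.array(values).reshape(NUM_JOINTS, 3))
    return np.stack(frames)

# Example usage (hypothetical file name):
# skeleton = load_skeleton("sample_skeleton.txt")
# head = skeleton[:, 3, :]  # joint index 3 is the head in Kinect v2
# print(np.linalg.norm(np.diff(head, axis=0), axis=1).sum())  # head travel distance
```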


The ten interaction scenarios cover how the robot should react when the senior citizen (see the sketch after this list):

  1. Enters the room
  2. Stands still without reason
  3. Calls the robot
  4. Looks at the robot
  5. Covers their face and cries
  6. Threatens to hit the robot
  7. Leaves the room
  8. Shakes hands with the robot
  9. Raises their hand for a high-five
  10. Asks the robot to go away
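
To make the action-response pairing concrete, here is an illustrative scenario-to-response lookup in Python. The scenario wording follows the list above; the robot-response strings are assumptions made for this sketch, not the dataset's annotated robot behaviors.

```python
# Illustrative mapping from scenario ID to (human action, robot response).
# ASSUMPTION: the response strings are invented for this sketch and are
# not the official robot behavior labels from the AIR-Act2Act dataset.
SCENARIOS = {
    1: ("enters the room", "bow and greet the user"),
    2: ("stands still without reason", "approach and offer help"),
    3: ("calls the robot", "approach the user"),
    4: ("looks at the robot", "make eye contact"),
    5: ("covers their face and cries", "console the user"),
    6: ("threatens to hit the robot", "step back defensively"),
    7: ("leaves the room", "wave goodbye"),
    8: ("shakes hands with the robot", "extend a hand"),
    9: ("raises a hand for a high-five", "return the high-five"),
    10: ("asks the robot to go away", "retreat quietly"),
}

def respond(scenario_id: int) -> str:
    """Map a scenario ID to a hypothetical robot response."""
    action, response = SCENARIOS[scenario_id]
    return f"Senior citizen {action} -> robot should {response}"

print(respond(5))  # Senior citizen covers their face and cries -> robot should console the user
```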

The dataset can be used to train robots to respond appropriately to the actions of senior citizens, a capability with applications in fields such as healthcare and psychiatric care. It can also be used to teach social skills to robots and to benchmark action recognition algorithms.


GitHub: https://github.com/ai4r/AIR-Act2Act

Paper: https://arxiv.org/pdf/2009.02041.pdf

