Discipline: Technology and Engineering
Subcategory: Electrical Engineering
Alexis Coates - Georgia Institute of Technology
Co-Author(s): Ayanna Howard, Georgia Institute of Technology, Atlanta, GA
Children with disabilities are often excluded from their surrounding environments because they cannot communicate effectively. The lack of inclusion efforts for children with Pervasive Developmental Disorders (PDD) and other communication-impairing disabilities further impedes their ability to integrate fully into society. As a result, they face greater difficulty growing up to be independent and self-sufficient. To enhance their opportunities to communicate effectively, we explore the use of a humanoid robot as a communication mediator.

In our research, our humanoid robot platform, Ava, expresses emotions through upper-body arm gestures and LED indicators that correspond to each emotion, with the goal of improving the ability of children with autism to communicate their wants and needs to others. In this paper, we discuss associating seven emotions (anger, happiness, sadness, fear, surprise, curiosity, and disgust) with combinations of gestural behaviors and LED indicators on this platform.

To evaluate human perception of the robot gestures, a single-group design was implemented. Five graduate and undergraduate students in the Human-Automation Systems (HumAnS) Lab participated in this study. The population consisted of both males and females aged 18-28 (Female: 1, Male: 4; Undergraduate: 4, Graduate: 1). Each participant completed a survey consisting of a video of the robot gestures and a list of possible emotion choices, including ‘None of the above’ to prevent forced recognition. The gestures were labeled G1 through G7 and presented in random order. After the simulation video, participants reviewed a still image from the video alongside an image of Ava executing the same emotion with the corresponding LED added.

Across the five participants, 71% of the emotions (five of seven) achieved a recognition rate of 60% or higher. The most recognized gesture sets were happiness and sadness, both achieving a recognition rate of 80%; the least recognized was disgust, at a 20% recognition rate.

In the near future, we intend to add a dynamic phase to each gesture, extending each motion sequence with additional fluid movements to increase the chance that each emotion is correctly identified. These augmented gestural behaviors will be implemented within a mobile application that will allow children to interact through various devices as a means of communicating their emotions to those around them. Increased use of Ava, in conjunction with the proposed mobile application, as a communication and learning tool will help children become more familiar with each emotion.
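To make the gesture/LED pairing concrete, the sketch below shows one way each emotion might be bound to a gesture label and an LED color cue. It is a minimal illustration, not the HumAnS Lab code: the names (EMOTION_CUES, express, play_gesture, set_led), the assignment of G-labels to emotions, and the RGB colors are all assumptions, since the abstract does not specify them.

```python
# Illustrative sketch only: the emotion-to-gesture-label and LED-color
# assignments below are assumed for demonstration; the abstract does not
# state the actual mappings used on Ava.
EMOTION_CUES = {
    "anger":     ("G1", (255, 0, 0)),    # red (assumed color)
    "happiness": ("G2", (255, 255, 0)),  # yellow (assumed color)
    "sadness":   ("G3", (0, 0, 255)),    # blue (assumed color)
    "fear":      ("G4", (128, 0, 128)),  # purple (assumed color)
    "surprise":  ("G5", (255, 165, 0)),  # orange (assumed color)
    "curiosity": ("G6", (0, 255, 255)),  # cyan (assumed color)
    "disgust":   ("G7", (0, 128, 0)),    # green (assumed color)
}

def express(emotion, play_gesture, set_led):
    """Trigger the paired arm gesture and LED cue for one emotion.

    `play_gesture` and `set_led` stand in for whatever robot-specific
    drivers actually move Ava's arms and light her LEDs.
    """
    gesture_id, rgb = EMOTION_CUES[emotion]
    set_led(rgb)              # show the color cue
    play_gesture(gesture_id)  # run the upper-body gesture

if __name__ == "__main__":
    # Stand-alone demo with print stubs in place of real hardware drivers.
    express(
        "happiness",
        play_gesture=lambda g: print(f"playing gesture {g}"),
        set_led=lambda rgb: print(f"LED set to RGB {rgb}"),
    )
```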
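The recognition rates reported above reduce to a per-emotion tally over the five survey responses. The sketch below shows that computation, using the published happiness result (80%, i.e., four of five participants correct) as a worked example; the response list itself is an illustrative stand-in, not the raw survey data.

```python
def recognition_rate(responses, intended_emotion):
    """Fraction of participants whose survey choice matched the intended emotion."""
    return sum(choice == intended_emotion for choice in responses) / len(responses)

# Worked example consistent with the reported happiness result: if 4 of the
# 5 participants chose "happiness" for the happiness gesture, the rate is 0.8.
# (These five choices are hypothetical, not the actual survey responses.)
example = ["happiness", "happiness", "none_of_the_above", "happiness", "happiness"]
print(recognition_rate(example, "happiness"))  # 0.8
```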
Funder Acknowledgement(s): Human-Automation Systems (HumAnS) Lab, Georgia Institute of Technology
Faculty Advisor: Ayanna Howard, ayanna.howard@ece.gatech.edu
Role: I conducted the bulk of the research, consulting with my advisor along the way.