Abstract
Among adults aged 65 or older in the United States, one in four is considered socially isolated. The adverse effects of diseases and disorders such as dementia, obesity, high blood pressure, heart disease, and stroke are compounded by isolation, contact restrictions, and loneliness. Human emotional state plays an important role in human-human interaction and reflects the mental state of a subject in daily life. Companion robots can provide needed services to people who require emotional support. However, challenges exist because humans and mobile robots perceive the environment differently. In addition, detecting emotional awareness remains difficult from the robot's perspective, and salient object detection is needed for the robot to recognize its environment. Unstructured environments where humans and companion robots coexist pose further challenges, since the human and the robot may be located in different places with different lighting and environmental noise. To overcome these limitations, this project proposes a computational framework that integrates neurophysiological signal analysis, mobile robot navigation, robotic vision saliency detection, and a web-based interface to support the emotional needs of a human subject through a mobile robot. The proposed method simulates a human subject in an environment to evaluate the mobile robot's responses in emotion detection and salient object recognition. A web-based interface is built to allow users to interact manually with the mobile robot. The robot has salient object detection capability so that it can perceive both the human partner's emotion and objects in the environment. Preliminary results show that emotion detection can guide mobile robot navigation to support human emotional needs.