Browsing by Author "Dong, Jiayuan"
Now showing 1 - 11 of 11
- Bridging the Gap: Early Education on Robot and AI Ethics through the Robot Theater Platform in an Informal Learning Environment. Mitchell, Jennifer; Dong, Jiayuan; Yu, Shuqi; Harmon, Madison; Holstein, Alethia; Shim, Joon Hyun; Choi, Koeun; Zhu, Qin; Jeon, Myounghoon (ACM, 2024-03-11). With the rapid advancement of robotics and AI, educating the next generation on ethical coexistence with these technologies is crucial. Our research explored the potential of a child-robot theater afterschool program in introducing and discussing robot and AI ethics with elementary school children. Conducted with 30 participants from a socioeconomically underprivileged school, the program blended STEM (Science, Technology, Engineering & Mathematics) with the arts, focusing on ethical issues in robotics and AI. Using interactive scenarios and a theatrical performance, the program aimed to enhance children’s understanding of major ethical issues in robotics and AI, such as bias, transparency, privacy, usage, and responsibility. Preliminary findings indicate the program’s success in engaging children in meaningful ethical discussions, demonstrating the potential of innovative, interactive educational methods in early education. This study contributes significantly to integrating ethical robotics and AI in early learning, preparing young minds for a technologically advanced and socially responsible future.
- The Effects of Robot Voices and Appearances on Users’ Emotion Recognition and Subjective Perception. Ko, Sangjin; Barnes, Jaclyn; Dong, Jiayuan; Park, Chunghyuk; Howard, Ayanna; Jeon, Myounghoon (World Scientific, 2023-02-22). As the influence of social robots in people's daily lives grows, research on understanding people's perception of robots, including sociability, trust, acceptance, and preference, becomes more pervasive. Research has considered visual, vocal, or tactile cues to express robots' emotions, whereas little research has provided a holistic view examining the interactions among different factors influencing emotion perception. We investigated multiple facets of user perception of robots during a conversational task by varying the robots' voice types, appearances, and emotions. In our experiment, 20 participants interacted with two robots having four different voice types. While participants were reading fairy tales to the robot, the robot gave vocal feedback with seven emotions and the participants evaluated the robot's profiles through post surveys. The results indicate that (1) the accuracy of emotion perception differed depending on presented emotions, (2) a regular human voice showed higher user preferences and naturalness, (3) but a characterized voice was more appropriate for expressing emotions with significantly higher accuracy in emotion perception, and (4) participants showed significantly higher emotion recognition accuracy with the animal robot than the humanoid robot. A follow-up study (N=10) with voice-only conditions confirmed the importance of embodiment. The results from this study could provide the guidelines needed to design social robots that consider emotional aspects in conversations between robots and users.
- Emotion GaRage Vol. III: A Workshop on Affective In-Vehicle Display Applications. Nadri, Chihab; Dong, Jiayuan; Li, Jingyi; Alvarez, Ignacio; Jeon, Myounghoon (ACM, 2022). Empathic in-vehicle interfaces can address driver affect and mitigate decreases in driving performance and behavior that are associated with emotional states. Empathic vehicles can detect and employ a variety of intervention modalities to change user affect and improve user experience. Challenges remain in the implementation of such strategies, as a broader established view of practical intervention modalities and strategies is still absent. Therefore, we propose a workshop that aims to bring together researchers and practitioners interested in affective interfaces and in-vehicle technologies as a forum for the development of displays and alternatives suitable to various use case situations in current and future vehicle states. During the workshop, we will focus on a common set of use cases and generate approaches that can suit different user groups. By the end of this workshop, researchers will create a design flowchart for in-vehicle affective display designers when creating displays for an empathic vehicle.
- Emotion GaRage Vol. IV: Creating Empathic In-Vehicle Interfaces with Generative AIs for Automated Vehicle Contexts. Choe, Mungyeong; Bosch, Esther; Dong, Jiayuan; Alvarez, Ignacio; Oehl, Michael; Jallais, Christophe; Alsaid, Areen; Nadri, Chihab; Jeon, Myounghoon (ACM, 2023-09-18). This workshop aims to design advanced empathic user interfaces for in-vehicle displays, particularly for high-level automated vehicles (SAE level 3 or higher). Incorporating model-based approaches for understanding human emotion regulation, it seeks to enhance the user-vehicle interaction. A unique aspect of this workshop is the integration of generative artificial intelligence (AI) tools in the design process. The workshop will explore generative AI’s potential in crafting contextual responses and its impact on user experience and interface design. The agenda includes brainstorming on various driving scenarios, developing emotion-oriented intervention methods, and rapid prototyping with AI tools. The anticipated outcome includes practical prototypes of affective user interfaces and insights on the role of AI in designing human-machine interactions. Through this workshop, we hope to contribute to making automated driving more accessible and enjoyable.
- Facial Expressions Increase Emotion Recognition Clarity and Improve Warmth and Attractiveness on a Humanoid Robot without Adding the Uncanny Valley. Dong, Jiayuan; Santiago-Anaya, Alex; Jeon, Myounghoon (SAGE, 2023-10-19). With the rising impact of social robots, research on understanding user perceptions and preferences of these robots gains further importance, especially for emotional expressions. However, research on facial expressions and their effects on human-robot interaction has had mixed results. To unpack this issue further, we investigated users’ emotion recognition accuracy and perceptions when interacting with a social robot that displayed emotional facial expressions or not in a storytelling setting. In our experiment, twenty-eight participants received verbal feedback either with or without facial expressions from the robot. Participants showed a significant recognition accuracy effect for emotions and significantly higher clarity for disgust, happiness, and surprise in the facial expression condition than in the no facial expression condition. In addition, participants rated Milo with facial expressions significantly higher than Milo without facial expressions in warmth and attractiveness. No significant differences were found among the rating scores of naturalness. The results from the present study indicated the importance of facial expressions when considering design choices in social robots.
- Happiness and high reliability develop affective trust in in-vehicle agents. Zieger, Scott; Dong, Jiayuan; Taylor, Skye; Sanford, Caitlyn; Jeon, Myounghoon (Frontiers, 2023-03). The advancement of Conditionally Automated Vehicles (CAVs) requires research into critical factors to achieve an optimal interaction between drivers and vehicles. The present study investigated the impact of driver emotions and in-vehicle agent (IVA) reliability on drivers' perceptions, trust, perceived workload, situation awareness (SA), and driving performance toward a Level 3 automated vehicle system. Two humanoid robots acted as the in-vehicle intelligent agents to guide and communicate with the drivers during the experiment. Forty-eight college students participated in the driving simulator study. The participants each completed a 12-min writing task to induce their designated emotion (happy, angry, or neutral) prior to the driving task. Their affective states were measured before the induction, after the induction, and after the experiment by completing an emotion assessment questionnaire. During the driving scenarios, IVAs informed the participants about five upcoming driving events, three of which asked the participants to take over control. Participants' SA and takeover driving performance were measured during driving; in addition, participants reported their subjective judgment ratings, trust, and perceived workload (NASA-TLX) toward the Level 3 automated vehicle system after each driving scenario. The results suggested an interaction between emotions and agent reliability contributing to affective trust and the jerk rate in takeover performance. Participants in the happy and high-reliability condition showed higher affective trust and a lower jerk rate than participants in the other emotion conditions under low reliability; however, no significant difference was found in cognitive trust and other driving performance measures.
We suggested that affective trust can be achieved only when both conditions are met: drivers' happy emotion and high agent reliability. Happy participants also perceived more physical demand than angry and neutral participants. Our results indicated that trust depends on driver emotional states interacting with the reliability of the system, suggesting that future research and design should consider the impact of driver emotions and system reliability on automated vehicles.
- Inside Out: Emotion GaRage Vol. V. Dong, Jiayuan; Gowda, Nikhil; Wang, Yiyuan; Choe, Mungyeong; Alsaid, Areen; Alvarez, Ignacio; Krome, Sven; Jeon, Myounghoon (ACM, 2024). The rapid advancement of automated vehicles has aroused the curiosity of researchers in the automotive field. Understanding the emotional aspects of this technology is critical to improving human-vehicle interactions. The topics of the proposed workshop will be expanded from internal to external empathetic interface designs of automated vehicles. The workshop will gather researchers and practitioners to brainstorm and design affective internal and external interfaces for automated vehicles, targeting specific use cases within the social context. During the workshop, participants will use an affective design tool and generative AI to prototype affective interface designs in automated vehicles. With this creative approach, we aim to expand the knowledge of affective eHMIs in addition to in-vehicle designs and understand social factors that contribute to the user perceptions of automated vehicles.
- "Play Your Anger": A Report on the Empathic In-vehicle Interface Workshop. Dong, Jiayuan; Nadri, Chihab; Alvarez, Ignacio; Diels, Cyriel; Lee, Myeongkyu; Li, Jingyi; Liao, Pei Hsuan; Manger, Carina; Sadeghian, Shadan; Schuß, Martina; Walker, Bruce N.; Walker, Francesco; Wang, Yiyuan; Jeon, Myounghoon (ACM, 2023-09-18). Empathic in-vehicle interfaces are critical to improving user safety and experiences. There has been much research on how to estimate drivers’ affective states, whereas little research has investigated intervention methods that mitigate potential impacts of the driver’s affective states on their driving performance and user experiences. To enhance the development of in-vehicle interfaces considering emotional aspects, we have organized a workshop series to gather automotive user interface experts to discuss this topic at the International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutoUI). The present paper focuses particularly on the intervention methods created by the experts and proposes design recommendations for future empathic in-vehicle interfaces. We hope this work can spark lively discussions on the importance of drivers’ affective states in their user experience of automated vehicles and point future work in the right direction.
- Robot-Theater Programs for Different Age Groups to Promote STEAM Education and Robotics Research. Ko, Sangjin; Swaim, Haley; Sanghavi, Harsh; Dong, Jiayuan; Nadri, Chihab; Jeon, Myounghoon (ACM, 2020). Issues with learner engagement and interest in STEM and robotics fields may be resolved through an interdisciplinary education approach. We have developed a robot-theater framework using interactive robots as a way to integrate the arts into STEM and robotics learning. The present paper shows the breadth of our target populations, ranging from elementary school children to college students. Based on our experiences in conducting four different programs applied to different age groups, we discuss the characteristics, lessons, design considerations, and implications for future work.
- Robots' "Woohoo" and "Argh" Can Enhance Users' Emotional and Social Perceptions: An Exploratory Study on Non-Lexical Vocalizations and Non-Linguistic Sounds. Liu, Xiaozhen; Dong, Jiayuan; Jeon, Myounghoon (ACM, 2023-10). As robots have become more pervasive in our everyday life, social aspects of robots have attracted researchers' attention. Because emotions play a crucial role in social interactions, research has been conducted on conveying emotions via speech. Our study sought to investigate the synchronization of multimodal interaction in human-robot interaction (HRI). We conducted a within-subjects exploratory study with 40 participants to investigate the effects of non-speech sounds (natural voice, synthesized voice, musical sound, and no sound) and basic emotions (anger, fear, happiness, sadness, and surprise) on user perception of emotional body gestures of an anthropomorphic robot (Pepper). While listening to a fairytale with the participant, the humanoid robot responded to the story with recorded emotional non-speech sounds and gestures. Participants showed significantly higher emotion recognition accuracy from the natural voice than from other sounds. The confusion matrix showed that happiness and sadness had the highest emotion recognition accuracy, which is in line with previous research. The natural voice also induced higher trust, naturalness, and preference, compared to other sounds. Interestingly, the musical sound mostly showed lower perception ratings, even compared to the no-sound condition. Results are discussed with design guidelines for emotional cues from social robots and future research directions.
- Taking a Closer Look: Refining Trust and its Impact in HRI. Dong, Jiayuan; Esterwood, Connor; Ye, Xin; Mitchell, Jennifer J.; Jo, Wonse; Robert, Lionel P.; Park, Chung Hyuk; Jeon, Myounghoon (ACM, 2024). As robots are rapidly integrated into our daily lives, enhancing trust between humans and robots is crucial to the acceptance of robots and the effectiveness of human-robot interaction (HRI). This workshop aims to provide a platform for HRI researchers, practitioners, and students from diverse disciplines to engage in a discussion to define/refine the construct, understand different factors that influence trust in HRI and their impacts, and measure different aspects of trust. The workshop will contribute to building a solid research community around this crucial construct and guiding future research and development of better human-robot interaction.