Adaptive Communication Interfaces for Human-Robot Collaboration

dc.contributor.author: Christie, Benjamin Alexander
dc.contributor.committeechair: Losey, Dylan Patrick
dc.contributor.committeemember: Akbari Hamed, Kaveh
dc.contributor.committeemember: Acar, Pinar
dc.contributor.department: Mechanical Engineering
dc.date.accessioned: 2024-05-08T08:01:31Z
dc.date.available: 2024-05-08T08:01:31Z
dc.date.issued: 2024-05-07
dc.description.abstract: Robots can use a collection of auditory, visual, or haptic interfaces to convey information to human collaborators. The way these interfaces select signals typically depends on the task that the human is trying to complete: for instance, a haptic wristband may vibrate when the human is moving quickly and stop when they are stationary. But people interpret the same signals in different ways, so what one user finds intuitive another user may not understand. In the absence of task knowledge, conveying signals is even more difficult: without knowing what the human wants to do, how should the robot select signals that help them accomplish their task? When paired with the seemingly infinite ways that humans can interpret signals, designing an optimal interface for all users seems impossible. This thesis presents an information-theoretic approach to communication in task-agnostic settings: a unified algorithmic formalism for learning co-adaptive interfaces from scratch without task knowledge. The resulting approach is user-specific and not tied to any interface modality. The method is further improved by introducing symmetrical properties through priors on communication. Although we cannot anticipate how a human will interpret signals, we can anticipate interface properties that humans may like. By integrating these functional priors into the aforementioned learning scheme, we achieve performance far better than baselines that have access to task knowledge. The results presented here indicate that users subjectively prefer interfaces generated by the presented learning scheme, and that these interfaces enable better performance and more efficient interactions.
dc.description.abstractgeneral: This thesis presents a novel interface for robot-to-human communication that personalizes to the current user without task knowledge or an interpretative model of the human. Suppose that you are trying to find buried treasure in a sandbox. You don't know the location of the treasure, but a robotic assistant does. Unfortunately, the only way the assistant can communicate the position of the treasure to you is through two LEDs of varying intensity, and neither you nor the robot has a mutually understood interpretation of those signals. Without knowing the robot's convention for communication, how should you interpret the robot's signals? There are infinitely many viable interpretations: perhaps a brighter signal means that the treasure is towards the center of the sandbox, or perhaps it means something else entirely. The robot has a similar problem: how should it interpret your behavior? Without knowing what you want to do with the hidden information (i.e., your task) or how you behave (i.e., your interpretative model), there are infinitely many pairs of tasks and interpretative models that fit your behavior. This work presents an interface optimizer that maximizes the correlation between the human's behavior and the hidden information. Testing with real humans indicates that this learning scheme can produce useful communicative mappings without knowing the users' tasks or their interpretative models. Furthermore, we recognize that humans have common biases in their interpretation of the world, which lead to biases in their interpretations of robot communication. Although we cannot assume how a specific user will interpret an interface's signals, we can assume user-friendly interface designs that most humans find intuitive. We leverage these biases to further improve the aforementioned learning scheme across several user studies. As such, the findings presented in this thesis have a direct impact on human-robot co-adaptation in task-agnostic settings.
dc.description.degree: Master of Science
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:40044
dc.identifier.uri: https://hdl.handle.net/10919/118917
dc.language.iso: en
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Interfaces
dc.subject: Information Theory
dc.subject: Co-Adaptation
dc.subject: Human-Robot Interaction
dc.title: Adaptive Communication Interfaces for Human-Robot Collaboration
dc.type: Thesis
thesis.degree.discipline: Mechanical Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science
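
The general abstract above describes an optimizer that searches for a signal mapping maximizing the correlation between the hidden information and the human's observed behavior. The sketch below illustrates that idea in Python for the sandbox example. It is a minimal, assumption-laden illustration: the affine LED mapping, the simulated user, and the random-restart search are stand-ins for this sketch only, not the learning scheme developed in the thesis.

```python
# Minimal sketch (illustrative assumptions, not the thesis implementation):
# a hidden "treasure" location theta in [0, 1] is mapped to two LED
# intensities by a candidate interface W. We score W by how strongly the
# human's observed guesses correlate with theta, then search over W.
# The simulated "human" below is a stand-in for real interaction data.
import numpy as np

rng = np.random.default_rng(0)

def interface(theta, W):
    """Map the hidden state theta (scalar) to two LED intensities."""
    signals = W @ np.array([theta, 1.0])   # affine map -> 2 signals
    return np.clip(signals, 0.0, 1.0)

def simulated_human(signals, bias):
    """Stand-in for a real user: guesses a location from the two LEDs."""
    return float(np.dot(bias, signals)) + rng.normal(scale=0.05)

def score(W, bias, n_trials=200):
    """Correlation between hidden states and the human's guesses under W."""
    thetas = rng.uniform(0.0, 1.0, size=n_trials)
    guesses = [simulated_human(interface(t, W), bias) for t in thetas]
    return abs(np.corrcoef(thetas, guesses)[0, 1])

# Random-restart search over interface parameters W: keep the mapping that
# yields the most predictable relationship between theta and behavior.
bias = rng.normal(size=2)                  # unknown human interpretation
best_W, best_score = None, -np.inf
for _ in range(500):
    W = rng.normal(size=(2, 2))
    s = score(W, bias)
    if s > best_score:
        best_W, best_score = W, s
print(f"best correlation achieved: {best_score:.3f}")
```

The point of the sketch is only the objective: without modeling the user's task or interpretation, the interface is chosen so that the user's behavior becomes maximally informative about the hidden state.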

Files

Original bundle
Name: Christie_BA_T_2024.pdf
Size: 3.11 MB
Format: Adobe Portable Document Format
