Investigating Asymmetric Collaboration and Interaction in Immersive Environments

dc.contributor.author: Enriquez, Daniel
dc.contributor.committeechair: Yang, Yalong
dc.contributor.committeemember: Lee, Sang Won
dc.contributor.committeemember: North, Christopher L.
dc.contributor.committeemember: David-John, Brendan Matthew
dc.contributor.department: Computer Science and Applications
dc.date.accessioned: 2024-01-24T09:00:13Z
dc.date.available: 2024-01-24T09:00:13Z
dc.date.issued: 2024-01-23
dc.description.abstract: With the commercialization of virtual/augmented reality (VR/AR) devices, there is increasing interest in combining immersive and non-immersive devices (e.g., desktop computers, mobile devices) for asymmetric collaboration. While such asymmetric settings have been examined in social platforms, questions surrounding collaborative view dimensionalities in data-driven decision-making and interaction from non-immersive devices remain under-explored. A crucial inquiry arises: although presenting a consistent 3D virtual world on both immersive and non-immersive platforms has been common practice in social applications, does the same guideline apply to laying out data? Or should data placement be optimized locally according to each device's display capacity? To this end, a user study was conducted to provide empirical insights into the user experience of asymmetric collaboration in data-driven decision-making. The study tested practical dimensionality combinations between PC and VR, resulting in three conditions: PC2D+VR2D, PC2D+VR3D, and PC3D+VR3D. The results revealed a preference for PC2D+VR3D, while PC2D+VR2D led to the quickest task completion. Similarly, mobile devices have become an inclusive alternative to head-worn displays in virtual reality (VR) environments, enhancing accessibility and allowing cross-device collaboration. Object manipulation techniques in mobile Augmented Reality (AR) have typically been evaluated at table-top scale, and we lack an understanding of how these techniques perform in room-scale environments. Two studies, each with 30 participants, were conducted to investigate how different object translation techniques impact usability and performance in room-scale mobile VR. Results indicated that the Joystick technique, which allowed translation relative to the user's perspective, was the fastest and most preferred, with no difference in precision. These findings provide insight for designing collaborative, asymmetric VR environments.
dc.description.abstractgeneral: With the commercialization of virtual/augmented reality (VR/AR) devices, there is increasing interest in combining immersive and non-immersive devices (e.g., desktop computers, mobile devices) for collaboration across different devices. While such asymmetric settings have been examined in social platforms, questions surrounding how differences between 2D and 3D collaborative views affect data-driven decision-making and interaction remain under-explored. A crucial inquiry arises: although presenting a consistent 3D virtual world on both immersive and non-immersive platforms has been common practice in social applications, does the same guideline apply to laying out data? Or should data placement be optimized on each device according to its display capacity? To this end, a user study was conducted to provide insights into the user experience of collaboration across different devices in data-driven decision-making. The study tested different combinations of 2D and 3D layouts between PC and VR, resulting in three conditions: PC2D+VR2D, PC2D+VR3D, and PC3D+VR3D. The results revealed a preference for PC2D+VR3D, while PC2D+VR2D led to the quickest task completion. Similarly, mobile devices have become an inclusive alternative to head-worn displays in virtual reality (VR) environments, enhancing accessibility and allowing cross-device collaboration. Object manipulation techniques in mobile Augmented Reality (AR) have typically been evaluated at table-top scale, and we lack an understanding of how these techniques perform in room-scale environments. Two studies, each with 30 participants, were conducted to investigate how different object translation techniques impact usability and performance in room-scale mobile VR. Results indicated that the Joystick technique, which allowed translation relative to the user's perspective, was the fastest and most preferred, with no difference in precision. These findings provide insight for designing collaborative, asymmetric VR environments.
dc.description.degree: Master of Science
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:38929
dc.identifier.uri: https://hdl.handle.net/10919/117630
dc.language.iso: en
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Virtual Reality
dc.subject: Asymmetric Collaboration
dc.subject: Human-Computer Interaction
dc.subject: Collaborative Interaction
dc.subject: Mobile Devices: Phones/Tablets
dc.title: Investigating Asymmetric Collaboration and Interaction in Immersive Environments
dc.type: Thesis
thesis.degree.discipline: Computer Science and Applications
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science

Files

Original bundle
Name: Enriquez_D_T_2024.pdf
Size: 12.47 MB
Format: Adobe Portable Document Format
