
    Walk-Centric User Interfaces for Mixed Reality

    Files
    • Dissertation (10.39 MB), Downloads: 3101
    • Permissions (80.19 KB), Downloads: 67
    • Video (82.06 MB), Downloads: 639
    • Video 2 (158.3 MB), Downloads: 208
    Date
    2018-07-31
    Author
    Santos Lages, Wallace
    Abstract
    Walking is a natural part of our lives and is becoming increasingly common in mixed reality. Wireless headsets and improved tracking systems allow us to navigate real and virtual environments easily by walking. In spite of the benefits, walking brings challenges to the design of new systems. In particular, designers must be aware of its cognitive and motor requirements so that walking does not negatively impact the main task. Unfortunately, those demands are not yet fully understood. In this dissertation, we present new scientific evidence, interaction designs, and analysis of the role of walking in different mixed reality applications.

    We evaluated the difference in performance between users walking and users manipulating a dataset during visual analysis. This is an important task, since virtual reality is increasingly being used as a way to make sense of progressively complex datasets. Our findings indicate that neither option is absolutely better: the optimal design choice should consider both the user's experience with controllers and the user's inherent spatial ability. Participants with reasonable game experience and low spatial ability performed better using the manipulation technique. However, we found that walking can still enable higher performance for participants with low spatial ability and without significant game experience.

    In augmented reality, specifying points in space is an essential step in creating content that is registered with the world. However, this task can be challenging when information about the depth or geometry of the target is not available. We evaluated augmented reality techniques for point marking that do not rely on any model of the environment. We found that triangulation by physically walking between points provides higher accuracy than purely perceptual methods, although precision may be affected by head-pointing tremor. To increase precision, we designed a new technique that uses multiple samples to obtain a better estimate of the target position. This technique can also be used to mark points while walking. We demonstrated the effectiveness of this approach with a controlled augmented reality simulation and actual outdoor tests.

    Moving into the future, augmented reality will eventually replace our mobile devices as the main method of accessing information. Nonetheless, to achieve its full potential, augmented reality interfaces must support the fluid way we move in the world. We investigated the potential of adaptation for achieving this goal. We conceived and implemented an adaptive workspace system, based on a study of the design space and on user contextual studies. Our final design consists of a minimal set of techniques to support mobility and integration with the real world. We also identified a set of key interaction patterns and desirable properties of adaptation-based techniques, which can guide the design of next-generation walking-centered workspaces.
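    The point-marking work above describes triangulating a target by pointing at it from positions reached by walking, and combining multiple samples to reduce the effect of head-pointing tremor. The sketch below is an illustrative reconstruction of that idea, not code from the dissertation: it assumes each sample is a hypothetical (origin, direction) ray taken from the tracked head pose, and estimates the target as the least-squares point closest to all rays.

# A minimal sketch, assuming rays sampled from head pose while walking.
import numpy as np

def triangulate_point(origins, directions):
    """Estimate the 3D point nearest (in least squares) to a set of rays.

    origins    -- (N, 3) ray start positions (e.g. head positions)
    directions -- (N, 3) ray directions (e.g. head-pointing vectors)
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)          # normalize the pointing direction
        P = np.eye(3) - np.outer(d, d)     # projector onto the plane normal to d
        A += P                             # accumulate normal equations
        b += P @ o
    return np.linalg.solve(A, b)           # point minimizing distance to all rays

if __name__ == "__main__":
    # Two noisy samples taken a few steps apart, aimed at a target near (5, 0, 2).
    target = np.array([5.0, 0.0, 2.0])
    origins = np.array([[0.0, 0.0, 1.7], [2.0, 3.0, 1.7]])
    rng = np.random.default_rng(0)
    directions = (target - origins) + rng.normal(scale=0.01, size=(2, 3))
    print(triangulate_point(origins, directions))   # approximately [5, 0, 2]

    With two rays this reduces to ordinary two-view triangulation; adding further samples collected while walking tightens the estimate, mirroring the precision benefit described in the abstract.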
    General Audience Abstract
    Until recently, walking with virtual and augmented reality headsets was restricted by issues such as excessive weight, cables, and tracking limitations. As those limits disappear, walking is becoming more common, bringing the user experience closer to the real world. Used well, walking can also make some tasks easier and more efficient. Unfortunately, walking reduces our mental and motor performance, and its consequences for interface design are not fully understood. In this dissertation, we present studies of the role of walking in three areas: scientific visualization in virtual reality, marking points in augmented reality, and accessing information in augmented reality. We show that although walking reduces our ability to perform these tasks, careful design can reduce its impact in a meaningful way.
    URI
    http://hdl.handle.net/10919/84460
    Collections
    • Doctoral Dissertations [16566]
