The Development of a Locomotion Interface for Virtual Reality Terrain Simulation
dc.contributor.author | He, An Chi | en |
dc.contributor.committeechair | Akbari Hamed, Kaveh | en |
dc.contributor.committeechair | Leonessa, Alexander | en |
dc.contributor.committeemember | Losey, Dylan Patrick | en |
dc.contributor.committeemember | Srinivasan, Divya | en |
dc.contributor.department | Mechanical Engineering | en |
dc.date.accessioned | 2025-01-10T09:00:51Z | en |
dc.date.available | 2025-01-10T09:00:51Z | en |
dc.date.issued | 2025-01-09 | en |
dc.description.abstract | Virtual reality (VR) technology has amazed us with its capability of blurring the boundary between real and virtual; it tricks our minds into believing that what we experience is real. Beyond entertainment, VR serves applications such as rehabilitation, immersive training, and 3D design. However, navigating the virtual world remains a significant unsolved problem. In VR, we often rely on controllers or joysticks to traverse the virtual world. Joystick navigation creates a discrepancy between the visual and bodily senses, adding cognitive load and causing problems such as VR motion sickness, which limit long-term use and prevent wider adoption. Walking is the most common task we perform every day; it is the most intuitive way to navigate and is therefore desirable in VR. This dissertation details the development of a device that enables walking in VR and lets users experience different terrains. Researchers and companies have made considerable efforts to enable locomotion in VR, and some devices, such as the Omni One and KAT VR treadmills, have achieved commercial success. Still, most devices are limited to simulating flat-ground walking and cannot display any terrain, which is far from our living environments and restricts their usage. Our surroundings feature stairs, bumps, and slopes that those devices cannot render. Furthermore, VR is desirable for hazard-response training, which is often situated in highly unstructured environments. Simulating various terrains not only enhances VR immersion but also extends VR to scenarios that would otherwise require extensive cost to construct. This document presents the design, build, control, and evaluation of a robotic locomotion interface that displays computer-generated terrains, allowing realistic lower-body engagement.
The device features a novel, compact design that makes it suitable for space-constrained settings such as rehabilitation clinics or smaller labs. Its design guidelines, derived from motion-capture data and dynamic simulation, align the robot workspace with the human workspace. The system framework is then addressed from both hardware and software perspectives. The device integrates open-source Institute for Human and Machine Cognition (IHMC) software with the Simple Open Source EtherCAT Master (SOEM) library to execute real-time EtherCAT communication. An admittance controller has been implemented to achieve smooth physical human-robot interaction (pHRI), governing the robot's motion according to the user's input force. This work presents measurements to evaluate the system's performance. The document also presents a center-of-mass (CoM) estimation algorithm based on the linear inverted pendulum (LIP) model and the zero-moment point (ZMP). The estimation method is further validated through two applications: an initial tele-locomotion framework and VR interaction. The first uses the estimated CoM motion as a tracking reference for humanoid robots; the second presents a framework able to display virtual terrain in the physical world. | en |
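The admittance-control idea described in the abstract can be illustrated with a minimal one-degree-of-freedom sketch. This is not the dissertation's implementation: the virtual-dynamics form, the parameter values (M, B, K), and the time step are all illustrative assumptions.

```python
# Minimal 1-DOF admittance-control sketch (illustrative assumptions only).
# The controller maps a measured interaction force to a commanded motion
# through the virtual dynamics  M*a + B*v + K*x = f_ext.

def admittance_step(x, v, f_ext, M=10.0, B=50.0, K=0.0, dt=0.001):
    """Advance the virtual mass-damper(-spring) by one time step.

    Returns the new commanded position and velocity that a low-level
    robot controller would then track.
    """
    a = (f_ext - B * v - K * x) / M   # virtual dynamics solved for acceleration
    v_new = v + a * dt                # integrate acceleration -> velocity
    x_new = x + v_new * dt            # semi-implicit Euler position update
    return x_new, v_new

# With K = 0 the platform yields freely: under a constant push it
# accelerates until damping balances the force (steady-state v = f/B).
x, v = 0.0, 0.0
for _ in range(5000):                 # simulate 5 s of a constant 25 N push
    x, v = admittance_step(x, v, 25.0)
```

With B = 50 N·s/m, the commanded velocity settles near 25/50 = 0.5 m/s; tuning M and B trades responsiveness against smoothness of the interaction.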
dc.description.abstractgeneral | Virtual reality (VR) technology has shown impressive capability in convincing our minds that what we experience is real, blurring the boundary between real and virtual. This cutting-edge technology is now a cornerstone of entertainment, rehabilitation, immersive training, and product design applications. However, one major challenge remains unsolved: how we move through VR environments. Currently, most VR systems rely on controllers or joysticks for navigation. Unfortunately, this creates a disconnect between what we see and how our bodies move, leading to problems like cognitive overload and motion sickness. These issues limit how long we can use VR and hinder its broader adoption. This research explores a way to make walking a natural part of the VR experience. Walking is something we do effortlessly every day, and it is the most intuitive way to move around. Bringing this natural form of navigation into VR has the potential to significantly enhance the experience. While some devices, like the Omni One or KAT VR treadmill, have gained attention for enabling locomotion in VR, they mostly simulate flat surfaces. This falls short of replicating real-world environments, which often include stairs, slopes, and uneven terrain. Such limitations restrict VR's potential, especially for applications like emergency response training, where navigating complex environments is critical. Simulating realistic terrains could not only make VR more immersive but also open doors to training scenarios that would otherwise be expensive or unsafe to recreate. In this work, we introduce a novel robotic device that allows users to physically walk in VR and experience a variety of terrains. Compactly designed, this device is suitable for space-constrained settings like rehabilitation clinics and small research labs. It incorporates motion-capture data and dynamic simulations to align human movement with the robot's workspace seamlessly.
The system uses open-source IHMC software combined with a real-time EtherCAT communication framework for precise control. An admittance controller ensures smooth and responsive interaction between the user and the device. To assess its effectiveness, we developed a comprehensive evaluation framework, measuring how well the system performs and enhances the VR experience. This innovative design allows users to physically interact with computer-generated terrains, extending the possibilities of VR for research, training, and beyond. Ultimately, this work contributes to advancing VR technology, bringing us closer to making immersive, terrain-rich virtual worlds a reality. | en |
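The CoM estimation mentioned in the abstract rests on the linear inverted pendulum (LIP) relation between CoM and ZMP. The sketch below shows only that underlying relation, not the dissertation's estimator; the pendulum height, time step, and initial state are illustrative assumptions.

```python
# Sketch of the LIP (linear inverted pendulum) dynamics that relate the
# center of mass (CoM) to the zero-moment point (ZMP):
#     x'' = (g / z_c) * (x - p_zmp)
# Values below (CoM height, time step) are illustrative assumptions.

G = 9.81      # gravitational acceleration, m/s^2
Z_C = 1.0     # assumed constant CoM height, m

def lip_step(x, v, p_zmp, dt=0.005):
    """One explicit Euler step of the LIP dynamics."""
    a = (G / Z_C) * (x - p_zmp)   # CoM accelerates away from the ZMP
    v += a * dt
    x += v * dt
    return x, v

# Sanity check of the relation: with the ZMP held directly under the
# CoM (p_zmp = x), the acceleration is zero and the CoM stays put.
x, v = 0.1, 0.0
for _ in range(200):
    x, v = lip_step(x, v, p_zmp=x)
```

In an estimator, this relation is used in the other direction: given a measured ZMP trajectory (e.g. from force plates), the same dynamics constrain the CoM motion consistent with it.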
dc.description.degree | Doctor of Philosophy | en |
dc.format.medium | ETD | en |
dc.identifier.other | vt_gsexam:41949 | en |
dc.identifier.uri | https://hdl.handle.net/10919/124082 | en |
dc.language.iso | en | en |
dc.publisher | Virginia Tech | en |
dc.rights | In Copyright | en |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | en |
dc.subject | Locomotion Interface | en |
dc.subject | Physical Human-Robot Interaction | en |
dc.subject | Haptic Feedback | en |
dc.subject | Robotics | en |
dc.subject | Control | en |
dc.title | The Development of a Locomotion Interface for Virtual Reality Terrain Simulation | en |
dc.type | Dissertation | en |
thesis.degree.discipline | Mechanical Engineering | en |
thesis.degree.grantor | Virginia Polytechnic Institute and State University | en |
thesis.degree.level | doctoral | en |
thesis.degree.name | Doctor of Philosophy | en |