The Development of a Locomotion Interface for Virtual Reality Terrain Simulation
Abstract
Virtual reality (VR) technology blurs the boundary between the real and the virtual, convincing our minds that what we experience is real. It is used for entertainment and for applications such as rehabilitation, immersive training, and 3D design. However, navigating the VR world remains a significant unsolved problem. Users typically rely on controllers or joysticks to traverse the virtual world, which creates a discrepancy between the visual and bodily senses. This mismatch adds cognitive load and causes problems such as VR motion sickness, which limit long-term use of VR applications and prevent wide adoption.
Walking is the most common task we perform every day; it is the most intuitive form of navigation and is therefore desirable to implement in VR. This dissertation details the development of a device that enables walking in VR and lets users experience different terrains. Researchers and companies have made substantial efforts to enable locomotion in VR, and some products, such as the Omni One and KAT VR treadmills, have achieved commercial success. Still, most devices are limited to simulating flat-ground walking and cannot display any terrain, which is far removed from our living environments and restricts their usage. Our surroundings feature stairs, bumps, and slopes that those devices cannot render. Furthermore, VR is desirable for hazard-response training, which often takes place in highly unstructured environments. Simulating various terrains not only enhances VR immersion but also extends VR to scenarios that would be costly to construct physically.
This document presents the design, construction, control, and evaluation of a robotic locomotion interface that displays computer-generated terrains and allows realistic lower-body engagement. The device features a novel, compact design that lets it fit in space-constrained settings such as rehabilitation clinics or smaller labs. Its design guidelines are derived from motion-capture data and dynamic simulation to align the robot's workspace with the human's. The system framework is then addressed from both hardware and software aspects. The device runs the open-source Institute for Human and Machine Cognition (IHMC) software stack integrated with the Simple Open EtherCAT Master (SOEM) library for real-time EtherCAT communication. An admittance controller achieves smooth physical human-robot interaction (pHRI), governing the robot's motion according to the user's input force. Measurements are presented to evaluate the system's performance.
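To illustrate the admittance-control idea mentioned above, the sketch below steps a one-axis virtual mass-damper law that maps a measured interaction force to a motion command. This is a minimal illustration, not the dissertation's implementation; the gains M, B, K and the 1 kHz step are placeholder values chosen for the example.

```python
def admittance_step(f_ext, x, v, dt, M=10.0, B=50.0, K=0.0):
    """One semi-implicit Euler step of a 1-DoF admittance law:
        M * a + B * v + K * x = f_ext
    Given the user's force f_ext, it returns position/velocity commands
    that the low-level robot controller would track. M, B, K are the
    virtual mass, damping, and stiffness (illustrative values only).
    """
    a = (f_ext - B * v - K * x) / M  # solve virtual dynamics for acceleration
    v_next = v + a * dt              # integrate to a velocity command
    x_next = x + v_next * dt         # semi-implicit Euler for stability
    return x_next, v_next

# Example: a constant 20 N push with K = 0; the commanded velocity
# settles toward f_ext / B = 0.4 m/s after a few time constants (M/B = 0.2 s).
x, v = 0.0, 0.0
for _ in range(2000):  # 2 s at a 1 kHz control rate
    x, v = admittance_step(20.0, x, v, dt=0.001)
```

With zero virtual stiffness the device feels like a damped free mass: pushing harder yields proportionally faster motion, which is the behavior usually wanted for walking-style pHRI.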
Finally, the document presents a center-of-mass (CoM) estimation algorithm based on the linear inverted pendulum (LIP) model and the zero-moment point (ZMP). The estimation method is validated through two applications: an initial tele-locomotion framework and VR interaction. The first uses the estimated CoM motion as a tracking reference for a humanoid robot; the second presents a framework able to display virtual terrain in the physical world.
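The LIP/ZMP relationship underlying such estimators can be sketched as follows. The LIP model relates CoM acceleration to the offset between the CoM and the measured ZMP, x_ddot = (g / z_c) * (x - p); propagating this model from force-plate ZMP measurements gives a CoM trajectory estimate. The constant CoM height and step size below are placeholder assumptions, not values from the dissertation.

```python
G = 9.81   # gravity, m/s^2
Z_C = 0.9  # assumed constant CoM height, m (illustrative)

def lip_com_step(x, v, p_zmp, dt):
    """Propagate the LIP model one step: x_ddot = (g / z_c) * (x - p_zmp).

    x, v  : current CoM position and velocity estimates along one axis
    p_zmp : measured ZMP position from force sensing
    """
    a = (G / Z_C) * (x - p_zmp)          # LIP dynamics
    x_next = x + v * dt + 0.5 * a * dt * dt
    v_next = v + a * dt
    return x_next, v_next

# When the CoM sits directly over the ZMP, it experiences no acceleration;
# any offset accelerates the CoM away from the ZMP, as the model predicts.
```

In a full estimator this model would typically be fused with other measurements (e.g. in a filter) rather than integrated open-loop, since the inverted-pendulum dynamics are unstable.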