Navigating the Unseen: A Haptic Glove for Peripersonal Navigation in Low-Visibility Scenarios
Abstract
Effective interaction with objects within arm's reach, known as peripersonal navigation, is essential to performing everyday activities as well as specialized tasks in fields such as healthcare, construction, and assistive technology. However, this ability can be substantially impaired in cluttered or unfamiliar environments, or when visual input is limited by fog, smoke, or visual impairments. While augmented reality (AR) and audio-based systems have been explored to enhance spatial awareness, AR devices are often costly and uncomfortable for extended use, and auditory cues can be difficult to interpret in noisy or attention-demanding settings. Vibrotactile feedback, which delivers directional and proximity information through touch, offers a promising alternative. It is discreet, intuitive, and resilient to environmental distractions, making it well suited for guiding movement under low-visibility conditions. Yet, critical questions remain regarding how to design effective vibrotactile cues, identify intuitive guidance strategies, and adapt feedback to users' dynamic behaviors.
To address these gaps, my dissertation pursues three integrated goals centered on developing and evaluating a haptic glove that delivers vibrotactile feedback to support peripersonal navigation in low-visibility scenarios.
Study 1 examines how tactor placement, hand motion, and temporal patterns influence vibrotactile perception. Twenty-two right-handed participants identified vibrating tactors placed on the dorsal hand or wrist while performing controlled hand movements. Results showed that recognition accuracy decreased during hand motion and when vibrations occurred with shorter onset intervals, and that stimuli on the hand were more distinguishable than those on the wrist.
Study 2 evaluates vibrotactile guidance strategies using a custom haptic glove. Blindfolded participants navigated toward virtual targets under multiple feedback metaphors and proximity cue designs. Among the tested strategies, the two-tactor vector approach and a "pull" metaphor yielded the fastest target acquisition times and smoothest hand trajectories, demonstrating their potential for intuitive spatial guidance.
Study 3 examines how different temporal feedback patterns—Continuous, Fixed-Interval, Distance-Based, and Behavior-Based (which adjusts timing based on hand speed)—affect navigation performance and user experience in accuracy- and speed-priority scenarios. Participants performed a simulated object-search task under reduced visibility while performance metrics and subjective usability ratings were collected. The Distance-Based temporal pattern showed the strongest overall performance and the highest user preference.
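The four temporal patterns can be pictured as different schedules for the interval between vibration pulses. The sketch below is illustrative only; the function name, thresholds, and scaling factors are assumptions for exposition, not the parameters used in the dissertation's glove.

```python
def pulse_interval(pattern, distance_m, hand_speed_mps,
                   fixed_s=0.5, min_s=0.1, max_s=1.0):
    """Return seconds until the next vibrotactile pulse.

    Illustrative sketch of the four temporal patterns; all
    constants are assumed values, not the study's settings.
    """
    if pattern == "continuous":
        return 0.0                     # vibrate without pause
    if pattern == "fixed":
        return fixed_s                 # constant cadence
    if pattern == "distance":
        # Distance-Based: pulse faster as the hand nears the target.
        return min(max_s, max(min_s, distance_m * 2.0))
    if pattern == "behavior":
        # Behavior-Based: pulse faster when the hand moves quickly.
        return min(max_s, max(min_s, max_s / (1.0 + hand_speed_mps * 4.0)))
    raise ValueError(f"unknown pattern: {pattern}")
```

Under such a scheme, the Distance-Based pattern naturally concentrates feedback in the final approach to the target, which is one plausible reason it could outperform a fixed cadence.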
Together, these studies advance the understanding of vibrotactile design for spatial guidance and lay the foundation for haptic interfaces that adapt to user needs and environmental conditions. The findings have broad implications for assistive technologies supporting individuals with visual impairments and for applications in search and rescue, firefighting, industrial safety, and immersive AR/VR systems. By refining how tactile cues are structured and delivered, this work contributes to the development of more effective, inclusive, and context-aware haptic navigation systems.