Authors: Wilchek, Matthew; Luther, Kurt; Batarseh, Feras A.
Date accessioned/available: 2024-10-08
Date issued: 2024
Handle: https://hdl.handle.net/10919/121304
Abstract: This paper introduces the design and prototype of Ajna, a wearable shared perception system for supporting extreme sensemaking in emergency scenarios. Ajna addresses technical challenges in Augmented Reality (AR) devices, specifically the limitations of depth sensors and cameras. These limitations confine object detection to close proximity and hinder perception beyond immediate surroundings, through obstructions, or across different structural levels, impacting collaborative use. Ajna harnesses the Inertial Measurement Unit (IMU) in AR devices to measure users' relative distances from a set physical point, enabling object detection sharing among multiple users across obstacles like walls and over distances. We tested Ajna's effectiveness in a controlled study with 15 participants simulating emergency situations in a multi-story building. We found that Ajna improved object detection, location awareness, and situational awareness, and reduced search times by 15%. Ajna's performance in simulated environments highlights the potential of artificial intelligence (AI) to enhance sensemaking in critical situations, offering insights for law enforcement, search and rescue, and infrastructure management.
Format: application/pdf
Language: en
Rights: In Copyright
Title: Ajna: A Wearable Shared Perception System for Extreme Sensemaking
Type: Article - Refereed
Publication date: 2024-10-01
Rights holder: The author(s)
DOI: https://doi.org/10.1145/3690829