Authors: Wilchek, Matthew; Wang, Linhan; Dickinson, Sally; Feuerbacher, Erica N.; Luther, Kurt; Batarseh, Feras A.
Dates: 2025-04-04; 2025-04-04; 2025-03-24
URI: https://hdl.handle.net/10919/125140

Abstract: In urban search and rescue (USAR) operations, communication between handlers and specially trained canines is crucial but often complicated by challenging environments and the specific behaviors canines are trained to exhibit when detecting a person. Since a USAR canine often works out of sight of the handler, the handler lacks awareness of the canine’s location and situation, known as the “sensemaking gap.” In this paper, we propose KHAIT, a novel approach to close the sensemaking gap and enhance USAR effectiveness by integrating object detection-based Artificial Intelligence (AI) and Augmented Reality (AR). Equipped with AI-powered cameras, edge computing, and AR headsets, KHAIT enables precise and rapid object detection from a canine’s perspective, improving survivor localization. We evaluate this approach in a real-world USAR environment, demonstrating an average survivor allocation time decrease of 22%, enhancing the speed and accuracy of operations.

Format: application/pdf
Language: en
License: Creative Commons Attribution-NonCommercial 4.0 International
Title: KHAIT: K-9 Handler Artificial Intelligence Teaming for Collaborative Sensemaking
Type: Article - Refereed
Date: 2025-04-01
Rights holder: The author(s)
DOI: https://doi.org/10.1145/3708359.3712107
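
Note: The abstract does not describe an implementation, but the pipeline it outlines (an object detector running at the edge on the canine-mounted camera feed, with person detections relayed to the handler's AR display) can be illustrated with a minimal sketch. The detector choice (a pretrained YOLO model), the AR_ENDPOINT URL, and the notify_handler helper below are assumptions introduced for illustration only, not the system reported in the paper.

```python
# Minimal sketch under stated assumptions: a YOLO person detector on the
# canine-mounted camera feed, with detections pushed to a hypothetical HTTP
# endpoint on the handler's AR headset. Not the paper's implementation.
import json
import time

import cv2                      # camera capture
import requests                 # transport to the (hypothetical) AR endpoint
from ultralytics import YOLO    # pretrained COCO detector (class 0 = "person")

AR_ENDPOINT = "http://ar-headset.local/detections"   # hypothetical endpoint
PERSON_CLASS_ID = 0
CONFIDENCE_THRESHOLD = 0.5


def notify_handler(bbox_xyxy, confidence):
    """Send one person detection to the handler's AR display (hypothetical API)."""
    payload = {
        "label": "person",
        "confidence": round(confidence, 3),
        "bbox_xyxy": [round(v, 1) for v in bbox_xyxy],
        "timestamp": time.time(),
    }
    requests.post(AR_ENDPOINT, data=json.dumps(payload), timeout=1.0)


def main():
    model = YOLO("yolov8n.pt")       # small model suited to edge hardware
    camera = cv2.VideoCapture(0)     # canine-mounted camera feed
    while camera.isOpened():
        ok, frame = camera.read()
        if not ok:
            break
        results = model(frame, verbose=False)[0]
        for det in results.boxes:
            cls = int(det.cls[0])
            conf = float(det.conf[0])
            if cls == PERSON_CLASS_ID and conf >= CONFIDENCE_THRESHOLD:
                notify_handler(det.xyxy[0].tolist(), conf)
    camera.release()


if __name__ == "__main__":
    main()
```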