An indoor fall monitoring system: Robust, multistatic radar sensing and explainable, feature-resonated deep neural network

dc.contributor.author: Shen, Menqi
dc.contributor.author: Tsui, Kwok-Leung
dc.contributor.author: Nussbaum, Maury A.
dc.contributor.author: Kim, Sunwook
dc.contributor.author: Lure, Fleming
dc.date.accessioned: 2023-04-10T13:31:54Z
dc.date.available: 2023-04-10T13:31:54Z
dc.date.issued: 2023-04
dc.date.updated: 2023-04-08T14:28:21Z
dc.description.abstract: Indoor fall monitoring is challenging for community-dwelling older adults due to the need for high accuracy and privacy concerns. Doppler radar is promising, given its low cost and contactless sensing mechanism. However, the line-of-sight restriction limits the application of radar sensing in practice, as the Doppler signature varies when the sensing angle changes, and signal strength is substantially degraded at large aspect angles. Additionally, the similarity of Doppler signatures among different fall types makes classification extremely challenging. To address these problems, in this paper we first present a comprehensive experimental study to obtain Doppler radar signals under large and arbitrary aspect angles for diverse types of simulated falls and daily living activities. We then develop a novel, explainable, multi-stream, feature-resonated neural network (eMSFRNet) that achieves fall detection and, in a pioneering study, classifies seven fall types. eMSFRNet is robust to both radar sensing angles and subjects. It is also the first method that can resonate and enhance feature information from noisy/weak Doppler signatures. The multiple feature extractors - including partial pre-trained layers from ResNet, DenseNet, and VGGNet - extract diverse feature information with various spatial abstractions from a pair of Doppler signals. The feature-resonated-fusion design translates the multi-stream features into a single salient feature that is critical to fall detection and classification. eMSFRNet achieved 99.3% accuracy in detecting falls and 76.8% accuracy in classifying seven fall types. Our work is the first effective multistatic robust sensing system that overcomes the challenges associated with Doppler signatures under large and arbitrary aspect angles, via our comprehensible feature-resonated deep neural network. Our work also demonstrates the potential to accommodate different radar monitoring tasks that demand precise and robust sensing.
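The abstract's feature-resonated fusion can be pictured, very loosely, as mapping the same Doppler signal pair through several backbone streams and combining the resulting feature vectors element-wise, so that components the streams agree on are reinforced and noisy ones are suppressed. The sketch below is purely illustrative and is not the paper's eMSFRNet implementation: the actual system uses partial pre-trained ResNet/DenseNet/VGGNet layers and learned fusion layers, whereas all functions, weights, and signals here are hypothetical toy stand-ins.

```python
def stream_features(signal, weights):
    """Toy feature extractor: a single linear map standing in for a
    truncated pre-trained backbone (e.g. ResNet/DenseNet/VGG stream)."""
    return [sum(w * x for w, x in zip(row, signal)) for row in weights]

def resonate(feature_lists):
    """Element-wise product across streams: components that all streams
    respond to are amplified, dissonant/noisy components shrink."""
    fused = feature_lists[0][:]
    for feats in feature_lists[1:]:
        fused = [a * b for a, b in zip(fused, feats)]
    return fused

# A flattened toy "pair of Doppler signals" (e.g. two receiver channels).
signal = [0.2, -0.1, 0.5, 0.3]

# Three toy streams, each a 2x4 weight matrix (hypothetical values).
streams = [
    [[1, 0, 0, 0], [0, 1, 0, 0]],
    [[0, 0, 1, 0], [0, 0, 0, 1]],
    [[1, 1, 0, 0], [0, 0, 1, 1]],
]

features = [stream_features(signal, w) for w in streams]
fused = resonate(features)  # single salient feature vector
print(fused)
```

In this toy version the "resonance" is just a cross-stream product; the paper's learned fusion would replace it with trainable layers, but the intuition of collapsing multi-stream features into one salient feature is the same.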
dc.description.version: Accepted version
dc.format.extent: Pages 1-12
dc.format.mimetype: application/pdf
dc.identifier.doi: https://doi.org/10.1109/JBHI.2023.3237077
dc.identifier.eissn: 2168-2208
dc.identifier.issn: 2168-2194
dc.identifier.issue: 99
dc.identifier.orcid: Nussbaum, Maury [0000-0002-1887-8431]
dc.identifier.orcid: Kim, Sun Wook [0000-0003-3624-1781]
dc.identifier.uri: http://hdl.handle.net/10919/114448
dc.language.iso: en
dc.publisher: IEEE
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Falls
dc.subject: Patient safety
dc.title: An indoor fall monitoring system: Robust, multistatic radar sensing and explainable, feature-resonated deep neural network
dc.title.serial: IEEE Journal of Biomedical and Health Informatics
dc.type: Article - Refereed
dc.type.dcmitype: Text
dc.type.other: Article
pubs.organisational-group: /Virginia Tech
pubs.organisational-group: /Virginia Tech/Engineering
pubs.organisational-group: /Virginia Tech/Engineering/Industrial and Systems Engineering
pubs.organisational-group: /Virginia Tech/Faculty of Health Sciences
pubs.organisational-group: /Virginia Tech/All T&R Faculty
pubs.organisational-group: /Virginia Tech/Engineering/COE T&R Faculty

Files

Original bundle
Name: Final_Version_12_pages.pdf
Size: 13.76 MB
Format: Adobe Portable Document Format
Description: Accepted version