Show simple item record

dc.contributor.author: Daw, Arka
dc.contributor.author: Karpatne, Anuj
dc.description.abstract: In this paper, we explore a novel direction of research in theory-guided data science to develop physics-aware architectures of artificial neural networks (ANNs), where scientific knowledge is baked into the construction of ANN models. While previous efforts in theory-guided data science have used physics-based loss functions to guide the learning of neural network models toward generalizable and physically consistent solutions, they do not guarantee that the model predictions will be physically consistent on unseen test instances, especially after slightly perturbing the trained model, as is done in methods that apply dropout during testing for uncertainty quantification (UQ). In contrast, our physics-aware ANN architecture hard-wires physical relationships into the ANN design, resulting in model predictions that always comply with known physics, even after performing dropout during testing for UQ. We provide initial results illustrating the effectiveness of our physics-aware neural network architecture in the context of lake temperature modeling, and show that it exhibits significantly lower physical inconsistency than baseline methods.
dc.relation.ispartof: FEED Workshop at Knowledge Discovery and Data Mining Conference (SIGKDD) 2019
dc.rights: Attribution 4.0 International
dc.subject: theory-guided data science
dc.subject: Physics Informed Neural Networks
dc.subject: Neural Networks
dc.subject: Long Short Term Memory (LSTM)
dc.subject: Lake Temperature Modeling
dc.title: Physics-aware Architecture of Neural Networks for Uncertainty Quantification: Application in Lake Temperature Modeling
dc.type: Conference proceeding
dc.contributor.department: Computer Science
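The abstract describes hard-wiring physical relationships into the network architecture so that predictions stay physically consistent even under test-time dropout. A minimal NumPy sketch of this idea, under the assumption (not stated in this record) that the hard-wired constraint is a monotonic density-depth relationship enforced by cumulatively summing non-negative increments:

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(x):
    # Numerically stable softplus, log(1 + exp(x)): maps any real
    # network output to a non-negative increment.
    return np.logaddexp(0.0, x)

def monotonic_density(raw_increments, rho_surface=998.0):
    # Hard-wire monotonicity into the architecture: each per-depth
    # raw output is forced non-negative via softplus and cumulatively
    # summed, so predicted density can never decrease with depth,
    # no matter how the raw outputs are perturbed.
    return rho_surface + np.cumsum(softplus(raw_increments))

# Simulate raw network outputs at 10 depths, then perturb them with
# test-time (Monte Carlo) dropout; the physical constraint still holds.
raw = rng.normal(size=10)
dropout_mask = rng.random(10) > 0.5    # randomly zero half the units
perturbed = raw * dropout_mask / 0.5   # inverted-dropout scaling

for outputs in (raw, perturbed):
    rho = monotonic_density(outputs)
    # Physically consistent by construction, not by training.
    assert np.all(np.diff(rho) >= 0.0)
```

Because the constraint lives in the architecture rather than a loss term, every dropout sample drawn for UQ yields a monotonic density profile, which is the distinction the abstract draws against physics-based loss functions. The surface-density constant and the softplus/cumsum construction here are illustrative assumptions, not details taken from the paper.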
