Title: Physics-aware Architecture of Neural Networks for Uncertainty Quantification: Application in Lake Temperature Modeling
Authors: Daw, Arka; Karpatne, Anuj
Date issued: 2019-08-05
Date available: 2021-04-26
URI: http://hdl.handle.net/10919/103126
Type: Conference proceeding
Format: application/pdf
Language: en
Rights: Attribution 4.0 International
Keywords: theory-guided data science; Physics Informed Neural Networks; Neural Networks; Long Short Term Memory (LSTM); Lake Temperature Modeling

Abstract: In this paper, we explore a novel direction of research in theory-guided data science: developing physics-aware architectures of artificial neural networks (ANNs), where scientific knowledge is baked into the construction of the ANN model. While previous efforts in theory-guided data science have used physics-based loss functions to guide the learning of neural networks toward generalizable and physically consistent solutions, they do not guarantee that model predictions will be physically consistent on unseen test instances, especially after the trained model is slightly perturbed, as happens when dropout is applied during testing for uncertainty quantification (UQ). In contrast, our physics-aware ANN architecture hard-wires physical relationships into the ANN design, so that model predictions always comply with known physics, even after performing dropout during testing for UQ. We provide initial results illustrating the effectiveness of our physics-aware neural network architecture in the context of lake temperature modeling, and show that it exhibits significantly lower physical inconsistency than baseline methods.
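
To make the idea of hard-wiring physics into the architecture concrete, here is a minimal, hypothetical sketch (not the authors' code), assuming the known physical relationship is the monotonic increase of water density with lake depth, as in the lake temperature application. The module predicts non-negative density increments and accumulates them over depth, so monotonicity holds by construction for every forward pass, including Monte Carlo dropout samples drawn at test time for UQ. All names (MonotonicDensityHead, feature_dim, etc.) are illustrative.

```python
# Hypothetical sketch: an architectural (hard-wired) physical constraint,
# in contrast to a soft physics-based loss penalty.
import torch
import torch.nn as nn

class MonotonicDensityHead(nn.Module):
    """Maps per-depth features to densities that never decrease with
    depth, by predicting non-negative increments and accumulating them
    along the depth dimension."""
    def __init__(self, feature_dim: int, dropout_p: float = 0.2):
        super().__init__()
        self.base = nn.Linear(feature_dim, 1)   # density at the surface
        self.delta = nn.Sequential(
            nn.Linear(feature_dim, 16),
            nn.ReLU(),
            nn.Dropout(dropout_p),               # kept active for MC dropout
            nn.Linear(16, 1),
            nn.Softplus(),                       # increments are >= 0
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, depth_levels, feature_dim)
        rho0 = self.base(feats[:, :1, :])        # (batch, 1, 1)
        increments = self.delta(feats[:, 1:, :]) # (batch, depth-1, 1)
        # Cumulative sum guarantees rho[d+1] >= rho[d] by construction,
        # regardless of how dropout perturbs the weights.
        return torch.cat([rho0, rho0 + increments.cumsum(dim=1)], dim=1)

# MC-dropout UQ: keep dropout active at test time; every sampled
# prediction still satisfies the monotonicity constraint.
model = MonotonicDensityHead(feature_dim=8)
model.train()                                    # leaves dropout enabled
feats = torch.randn(4, 10, 8)                    # 4 lakes, 10 depth levels
samples = torch.stack([model(feats) for _ in range(50)])
mean, std = samples.mean(dim=0), samples.std(dim=0)
```

Because the constraint is enforced by the architecture (Softplus increments plus a cumulative sum) rather than by a loss penalty, no perturbation of the trained weights, dropout included, can produce a density profile that violates the known physics.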