Physics-Informed Interpretable Attention-based Machine Learning for Jet Turbine Prediction

Date

2024-11-27

Publisher

Virginia Tech

Abstract

The prediction of future engine states is useful for performance evaluation and anomaly detection in jet turbines. While a variety of modeling approaches exist, many cannot efficiently use the vast quantities of data from test experimentation in a manner that is transparent to the operator or observer. The literature describes several approaches to interpretable modeling of various types of systems across domains, for applications such as Remaining Useful Life estimation and accident prognosis, but these do not perform prediction of measured state quantities or performance. Additionally, in modeling studies that focus on jet turbines, the data is synthetic rather than experimental. In this thesis, we apply an attention-based neural network, the Temporal Fusion Transformer (TFT), to experimental data for prediction, allowing for interpretability and insight into model dynamics. We describe a series of experiments on different configurations of the model architecture and show that, by incorporating physical information into the system, the models produce better forecasts and confidence estimates on all outputs, with robustness to some degree of failure and noise in the inputs. For the TFT, we include control inputs as future covariates and evaluate modifications to the loss function that incorporate the physics of key performance parameters of the gas turbine as residual-form equations, finding that this increases model accuracy and the usefulness of interpretability results, even when model size is reduced. These key performance parameters were derived from the measured data and introduced into the dataset, and a comparison of performance on the full dataset and a reduced dataset shows increased performance on the smaller dataset. Additionally, these interpretable models provide more useful insight into system dynamics, giving visibility into time-horizon attention and model-discovered variable importance.
While the extent of the robustness and accuracy of physics-informed attention networks remains to be explored, we expect this approach to lead to models with reduced training time, higher accuracy, increased user confidence in predictions, and greater interpretability, allowing future incorporation into anomaly detection algorithms or the study of dynamic systems.
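The loss-function modification described above, adding the physics of key performance parameters as residual-form equations, can be sketched in general terms. The snippet below is a minimal, hypothetical illustration only, not the thesis's implementation: `residual_fn`, the weight `lam`, and the mean-squared form of both terms are all illustrative assumptions. The idea is that a governing relation rearranged to residual form evaluates to zero when predictions satisfy the physics, so its squared magnitude can be added to the usual data-fit loss.

```python
import numpy as np

def physics_informed_loss(y_pred, y_true, residual_fn, lam=0.1):
    """Hypothetical sketch of a physics-informed training loss.

    y_pred, y_true : arrays of predicted / measured engine states
    residual_fn    : callable returning the residual of a governing
                     equation; zero when predictions obey the physics
    lam            : assumed hyperparameter weighting the physics term
    """
    data_loss = np.mean((y_pred - y_true) ** 2)       # standard data-fit term
    physics_loss = np.mean(residual_fn(y_pred) ** 2)  # residual-form penalty
    return data_loss + lam * physics_loss
```

In practice the same two-term structure would be expressed in the deep-learning framework used to train the TFT, with the residual computed from the relevant performance-parameter equations.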

Keywords

Machine Learning, Artificial Intelligence, Turbomachinery, Regression
