Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection

Full description

In 2000, Kennedy and O’Hagan proposed a model for uncertainty quantification that combines data of several levels of sophistication, fidelity, quality, or accuracy, e.g., a coarse and a fine mesh in finite-element simulations. They assumed each level to be describable by a Gaussian process, and used low-fidelity simulations to improve inference on costly high-fidelity simulations. Departing from there, we move away from the common non-Bayesian practice of optimization and marginalize the parameters instead. Thus, we avoid the awkward logical dilemma of having to choose parameters and of neglecting that choice’s uncertainty. We propagate the parameter uncertainties by averaging the predictions and the prediction uncertainties over all possible parameters. This is done analytically for all but the nonlinear or inseparable kernel function parameters. What is left is a low-dimensional and feasible numerical integral that depends on the choice of kernels, thus allowing for a fully Bayesian treatment. By quantifying the uncertainties of the parameters themselves, we show that “learning” or optimizing those parameters has little meaning when data are scarce and thus justify our mathematical efforts. The recent hype about machine learning has long spilled over to computational engineering, but it fails to acknowledge that machine learning is a big-data problem whereas, in computational engineering, we usually face a little-data problem. We devise the fully Bayesian uncertainty quantification method in a notation following the tradition of E. T. Jaynes and find that generalization to an arbitrary number of levels of fidelity and parallelization becomes rather easy. We scrutinize the method with mock data and demonstrate its advantages in its natural application, where high-fidelity data are scarce but low-fidelity data are plentiful. We then apply the method to quantify the uncertainties in finite-element simulations of impedance cardiography of aortic dissection. Aortic dissection is a cardiovascular disease that frequently requires immediate surgical treatment and, thus, a fast diagnosis beforehand. While traditional medical imaging techniques such as computed tomography, magnetic resonance tomography, or echocardiography certainly do the job, impedance cardiography is also a clinical standard tool and promises to allow earlier diagnoses as well as to detect patients that would otherwise go under the radar for too long.
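
A minimal sketch may help make the central computational idea of the abstract concrete: instead of optimizing the Gaussian-process kernel hyperparameters, their posterior is evaluated on a low-dimensional grid and the predictions and prediction uncertainties are averaged over it. The example below is hypothetical and not the authors' code; it is a single-fidelity Python/NumPy illustration assuming a squared-exponential kernel, flat hyperparameter priors on the grid, and fixed observation noise, and it omits the multi-fidelity structure and the analytic marginalization of the remaining (linear) parameters described in the paper.

```python
# Illustrative sketch (not the authors' implementation): marginalize GP kernel
# hyperparameters by a low-dimensional numerical integral instead of optimizing them.
import numpy as np

def sq_exp_kernel(xa, xb, length_scale, signal_var):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = xa[:, None] - xb[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, length_scale, signal_var, noise_var):
    """GP posterior mean/variance at x_test and the log marginal likelihood."""
    K = sq_exp_kernel(x_train, x_train, length_scale, signal_var)
    K += noise_var * np.eye(len(x_train))
    Ks = sq_exp_kernel(x_train, x_test, length_scale, signal_var)
    Kss = sq_exp_kernel(x_test, x_test, length_scale, signal_var)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    log_ml = (-0.5 * y_train @ alpha
              - np.sum(np.log(np.diag(L)))
              - 0.5 * len(x_train) * np.log(2.0 * np.pi))
    return mean, var, log_ml

def marginalized_gp_predict(x_train, y_train, x_test,
                            length_scales, signal_vars, noise_var=1e-4):
    """Average predictions over a hyperparameter grid, weighted by the posterior.

    With flat priors on the grid, each grid point's weight is proportional to its
    marginal likelihood; the predictive variance gains an extra term for the spread
    of the per-grid-point means (law of total variance).
    """
    means, vars_, log_weights = [], [], []
    for ls in length_scales:
        for sv in signal_vars:
            m, v, log_ml = gp_predict(x_train, y_train, x_test, ls, sv, noise_var)
            means.append(m); vars_.append(v); log_weights.append(log_ml)
    log_weights = np.array(log_weights)
    w = np.exp(log_weights - log_weights.max())
    w /= w.sum()
    means, vars_ = np.array(means), np.array(vars_)
    mean = w @ means
    var = w @ (vars_ + means ** 2) - mean ** 2  # law of total variance
    return mean, var, w

# Toy "little data" example: a handful of noisy observations of a smooth function.
rng = np.random.default_rng(0)
x_train = np.array([0.1, 0.4, 0.5, 0.9])
y_train = np.sin(2 * np.pi * x_train) + 0.05 * rng.standard_normal(4)
x_test = np.linspace(0.0, 1.0, 50)

mean, var, weights = marginalized_gp_predict(
    x_train, y_train, x_test,
    length_scales=np.linspace(0.05, 0.5, 10),  # low-dimensional grid: 10 x 5 points
    signal_vars=np.linspace(0.2, 2.0, 5),
)
print("posterior weight is spread over", np.sum(weights > 0.01), "grid points")
```

With only a handful of training points, the posterior weights typically remain spread over many grid points rather than concentrating on a single "best" value, which illustrates the abstract's point that optimizing hyperparameters carries little meaning when data are scarce.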

Bibliographic Details
Main Authors: Sascha Ranftl, Gian Marco Melito, Vahid Badeli, Alice Reinbacher-Köstinger, Katrin Ellermann, Wolfgang von der Linden
Affiliations: Institute of Theoretical Physics-Computational Physics, Graz University of Technology, 8010 Graz, Austria (Ranftl, von der Linden); Institute of Mechanics, Graz University of Technology, 8010 Graz, Austria (Melito, Ellermann); Institute of Fundamentals and Theory in Electrical Engineering, Graz University of Technology, 8010 Graz, Austria (Badeli, Reinbacher-Köstinger)
Format: Article
Language: English
Published: MDPI AG, 2019-12-01
Series: Entropy
ISSN: 1099-4300
DOI: 10.3390/e22010058
Subjects: uncertainty quantification; multi-fidelity; Gaussian processes; probability theory; Bayes; impedance cardiography; aortic dissection
Online Access: https://www.mdpi.com/1099-4300/22/1/58