# American Institute of Mathematical Sciences

doi: 10.3934/dcdss.2021098
Online First


## Augmented Gaussian random field: Theory and computation

1. Department of Mathematics, Purdue University, West Lafayette, IN 47907, USA
2. Department of Industrial and Systems Engineering, Lehigh University, Bethlehem, PA 18015, USA
3. School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907, USA

Received: March 2021. Revised: June 2021. Early access: August 2021.

We propose the novel augmented Gaussian random field (AGRF), a universal framework that incorporates observations of a quantity of interest together with its derivatives of any order. Rigorous theory is established: we prove that under certain conditions, the observable and its derivatives of any order are governed by a single Gaussian random field, which is the aforementioned AGRF. As a corollary, the statement "the derivative of a Gaussian process remains a Gaussian process" is validated, since the derivative is represented by a part of the AGRF. Moreover, we construct a computational method corresponding to the universal AGRF framework, covering both noiseless and noisy scenarios, and derive the posterior distributions in closed form. A significant advantage of this computational method is that the AGRF framework provides a natural way to incorporate derivatives of arbitrary order and to handle missing data. We demonstrate its effectiveness on four numerical examples: a composite function, a damped harmonic oscillator, the Korteweg-De Vries equation, and Burgers' equation.
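The core computational idea — conditioning a single Gaussian field jointly on function values and derivative values through derivatives of the covariance kernel — can be sketched as follows. This is a minimal illustration with a squared-exponential kernel and hypothetical sample data, not the paper's closed-form posterior formulas:

```python
import numpy as np

def k(x, y, l=1.0):
    """Squared-exponential kernel k(x, y) = exp(-(x - y)^2 / (2 l^2))."""
    return np.exp(-(x - y) ** 2 / (2 * l ** 2))

def k_dy(x, y, l=1.0):
    """Cross-covariance cov(f(x), f'(y)) = dk/dy."""
    return k(x, y, l) * (x - y) / l ** 2

def k_dxdy(x, y, l=1.0):
    """Covariance of derivative values cov(f'(x), f'(y)) = d^2 k / dx dy."""
    return k(x, y, l) * (1.0 / l ** 2 - (x - y) ** 2 / l ** 4)

def joint_posterior_mean(x_star, x_f, y_f, x_d, y_d, l=1.0, jitter=1e-8):
    """Posterior mean of f(x_star) given function values y_f at x_f and
    first-derivative values y_d at x_d, all in one joint Gaussian field."""
    A = k(x_f[:, None], x_f[None, :], l)        # cov(f, f)
    B = k_dy(x_f[:, None], x_d[None, :], l)     # cov(f, f')
    C = k_dxdy(x_d[:, None], x_d[None, :], l)   # cov(f', f')
    K = np.block([[A, B], [B.T, C]])
    K += jitter * np.eye(K.shape[0])            # numerical stabilization
    y = np.concatenate([y_f, y_d])
    k_star = np.concatenate([k(x_star, x_f, l), k_dy(x_star, x_d, l)])
    return k_star @ np.linalg.solve(K, y)

# Hypothetical data: f = sin observed at some points, f' = cos at others.
x_f = np.array([0.0, 1.0, 2.0, 3.0])
x_d = np.array([0.5, 1.5, 2.5])
mean = joint_posterior_mean(1.2, x_f, np.sin(x_f), x_d, np.cos(x_d))
# compare with the true value np.sin(1.2)
```

Note that the derivative observations enter through derivatives of the kernel, not through a separate model, so value and slope data at arbitrary (and different) locations condition the same field.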

Citation: Sheng Zhang, Xiu Yang, Samy Tindel, Guang Lin. Augmented Gaussian random field: Theory and computation. Discrete & Continuous Dynamical Systems - S, doi: 10.3934/dcdss.2021098
##### Figures:
Figure 1. Graphical illustration of augmented Gaussian random field prediction with measurement noise. There are three layers: an input layer, a hidden layer, and an output layer. The hidden layer is governed by the augmented Gaussian random field; the observable and its derivatives of different orders are integrated into the same field to make predictions.

Figure 2. [Composite function (noiseless)] Prediction of the observable, first-order derivative, and second-order derivative by AGRF. Case 1: the data include the observable only. Case 2: the observable and first-order derivative. Case 3: the observable and second-order derivative. Case 4: the observable, first-order derivative, and second-order derivative. AGRF integrates the observable and derivatives of any order, regardless of where they are collected, and its prediction improves as more information becomes available.

Figure 3. [Composite function (noiseless)] Comparison of the prediction accuracy of AGRF in the different cases. See Figure 2 for more explanations.

Figure 4. [Damped harmonic oscillator (noiseless)] Prediction of the displacement, velocity, and phase-space diagram by different methods. GP: the data include the observable and first-order derivative; the observable data are used to predict the displacement and the first-order derivative data to predict the velocity, separately. GEK: the same data are used jointly in one random field to predict the displacement and velocity simultaneously. AGRF: the data include the observable, first-order derivative, and second-order derivative, all used together in one random field. GEK predicts better than GP, and AGRF predicts more accurately than GEK: using all available information together in the same random field yields the most accurate surrogate model.

Figure 5. [Damped harmonic oscillator (noiseless)] Comparison of the prediction accuracy of the different methods. See Figure 4 for more explanations.

Figure 6. [Korteweg-De Vries equation (noisy)] Top: the solution at $t = 0.5$ is studied. Bottom: prediction of the observable, first-order derivative, and second-order derivative by AGRF under different noise levels. AGRF performs well even when the noise is as high as 40%; as one might expect, the prediction improves as the noise decreases.

Figure 7. [Korteweg-De Vries equation (noisy)] Comparison of the prediction accuracy under different noise levels. See Figure 6 for more explanations.

Figure 8. [Burgers' equation (noisy)] Top: the solution at $t = 0.5$ is studied. Bottom: prediction of the observable, first-order derivative, and second-order derivative by different AGRF calibrations. No $\delta$: the noiseless formulation is used despite the noise in the data, i.e., $\delta_0 = \delta_1 = \delta_2 = 0$ in Eqn. (87). One $\delta$: the same noise intensity is used for all derivative orders, i.e., $\delta_0 = \delta_1 = \delta_2$ in Eqn. (87). Multiple $\delta$: a different noise intensity is used for each derivative order, exactly as in Eqn. (87). The noiseless formulation overfits; a single noise intensity makes the predictive uncertainty incompatible with the data, since different derivative orders have different scales; the full formulation of Eqn. (87) performs best.

Figure 9. [Burgers' equation (noisy)] Comparison of the prediction accuracy of the different AGRF calibrations. See Figure 8 for more explanations. The relative $L_2$ errors in the "no $\delta$" case are greater than $1.6$ and out of bound.
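The three calibrations compared in Figures 8 and 9 differ only in the noise term added to the joint covariance matrix. In a generic GP-with-derivatives setting (a sketch under assumed notation — the paper's Eqn. (87) is not reproduced here), a per-order noise intensity enters as a block-diagonal addition:

```python
import numpy as np

def noise_block_diag(n_obs, n_d1, n_d2, d0, d1, d2):
    """Block-diagonal noise covariance with one intensity per derivative order.
    d0 = d1 = d2 = 0      -> 'no delta'  (noiseless formulation; overfits noisy data)
    d0 = d1 = d2          -> 'one delta' (single intensity; wrong scale per order)
    distinct d0, d1, d2   -> 'multiple delta' (one intensity per order)"""
    return np.diag(np.concatenate([
        np.full(n_obs, d0 ** 2),  # noise on observable values
        np.full(n_d1, d1 ** 2),   # noise on first-order derivative values
        np.full(n_d2, d2 ** 2),   # noise on second-order derivative values
    ]))

# The noisy posterior then uses K + Sigma in place of the noiseless K, e.g.:
#   mean = k_star @ np.linalg.solve(K + noise_block_diag(...), y)
```

Because the observable and its derivatives live on different scales, letting each block carry its own intensity is what makes the predictive uncertainty consistent with the data.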
