Foundations of Data Science, September 2021, 3(3): 305-330. doi: 10.3934/fods.2020015

Online learning of both state and dynamics using ensemble Kalman filters

Marc Bocquet*, Alban Farchi and Quentin Malartic

CEREA, joint laboratory École des Ponts ParisTech and EDF R&D, Université Paris-Est, Champs-sur-Marne, France

* Corresponding author: Marc Bocquet

Received: June 2020. Revised: August 2020. Early access: September 2020. Published: September 2021.

The reconstruction of the dynamics of an observed physical system as a surrogate model has been brought to the fore by recent advances in machine learning. To deal with partial and noisy observations in that endeavor, machine learning representations of the surrogate model can be used within a Bayesian data assimilation framework. However, these approaches require long time series of observational data, meant to be assimilated all together. This paper investigates the possibility of learning both the dynamics and the state online, i.e. of updating their estimates at any time, in particular when new observations are acquired. The estimation is based on the ensemble Kalman filter (EnKF) family of algorithms, using a rather simple representation of the surrogate model together with state augmentation. We consider the implications of learning dynamics online through (i) a global EnKF, (ii) a local EnKF and (iii) an iterative EnKF, and in each case we discuss the issues and algorithmic solutions. We then demonstrate numerically the efficiency and assess the accuracy of these methods using one-dimensional, one-scale and two-scale chaotic Lorenz models.
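The central device mentioned in the abstract is state augmentation: the surrogate model's parameters are appended to the physical state vector and updated by the same EnKF analysis, through the ensemble cross-covariances between parameters and observed variables. The following is a minimal stochastic-EnKF sketch of that idea on a hypothetical toy model $x_{k+1} = a\,x_k + 1$ with unknown $a$; the function names, the toy model and the parameter-jitter choice are illustrative assumptions, not the paper's surrogate representation or algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_ml_cycle(Z, y, H, R, forecast, param_jitter=0.02):
    """One stochastic-EnKF cycle on the augmented vector z = (x, p).

    Z : (Nz, Ne) ensemble of augmented states (physical state stacked with
        the surrogate-model parameters); y : observation vector;
    H : (Ny, Nz) observation operator, with zero columns on the parameter
        entries since the parameters are never observed directly;
    R : observation-error covariance; forecast : maps one augmented column
        to its forecast (persistence for the parameters); param_jitter :
        additive noise on the parameters that plays the role of inflation
        and keeps their ensemble spread from collapsing.
    """
    Zf = np.column_stack([forecast(z) for z in Z.T])
    Ne = Zf.shape[1]
    A = Zf - Zf.mean(axis=1, keepdims=True)
    Pf = A @ A.T / (Ne - 1)                       # augmented covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
    # Perturbed-observations analysis: the same gain updates x AND p,
    # because Pf carries the parameter/state cross-covariances.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, Ne).T
    Za = Zf + K @ (Y - H @ Zf)
    Za[1:] += param_jitter * rng.standard_normal((Zf.shape[0] - 1, Ne))
    return Za

# Twin experiment: truth x_{k+1} = a x_k + 1 with a = 0.9 unknown.
a_true, Ne = 0.9, 50
step = lambda z: np.array([z[1] * z[0] + 1.0, z[1]])
Z = np.vstack([rng.normal(0.0, 1.0, Ne),          # state members
               rng.normal(0.5, 0.2, Ne)])         # parameter members
H, R = np.array([[1.0, 0.0]]), np.array([[0.01]])
x = 0.0
for _ in range(200):
    x = a_true * x + 1.0                          # truth
    y = np.array([x + rng.normal(0.0, 0.1)])      # noisy observation
    Z = enkf_ml_cycle(Z, y, H, R, step)

a_est = Z[1].mean()                               # drifts toward a_true
```

Because the parameter obeys a persistence model, its spread can only shrink under successive analyses; the small additive jitter is one simple workaround, standing in for the inflation schemes discussed in the paper.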

Citation: Marc Bocquet, Alban Farchi, Quentin Malartic. Online learning of both state and dynamics using ensemble Kalman filters. Foundations of Data Science, 2021, 3 (3) : 305-330. doi: 10.3934/fods.2020015


Figure 1.  Average state RMSE for the EnKF-ML with a hybrid adaptive inflation scheme, applied to the L96 model, for a range of ensemble sizes $ {N_\mathrm{e}} $ and several values of the standard deviation of the initial parameter vector mean. The error bars correspond to the standard deviation over $ {N_\mathrm{exp}} = 100 $ repeated experiments
Figure 2.  Initial evolution of the $ {N_\mathrm{p}} = 18 $ parameters of the surrogate model learned from observations of an L96 model run. The key parameters are the forcing ($ F = 8 $ in the true model), the friction ($ -1 $ in the true model) and the advection coefficients ($ -1, 1 $ in the true model)
Figure 3.  Left panel (a): for both models, comparison of the performance of the EnKF-ML, where the model is unknown, with that of the traditional EnKF, where the model is known. The error bars are obtained from the standard deviation over $ {N_\mathrm{exp}} = 50 $ repeated experiments. Right panel (b): comparison of the Lyapunov spectra of the L96 model and of the surrogate model about an L96 trajectory
Figure 4.  For both models, comparison of the performance of the LEnKF-ML, where the model is unknown, with that of the traditional LEnKF, where the model is known. The error bars are obtained from the standard deviation over $ {N_\mathrm{exp}} = 10 $ repeated experiments
Figure 5.  For both models, comparison of the performance of the LEnKF-ML, where the model is unknown, with that of the traditional LEnKF, where the model is known: left panel (a), as a function of the observation density; right panel (b), as a function of the observation error standard deviation. In all cases, the ensemble size is $ {N_\mathrm{e}} = 24 $ and the RMSE statistics are accumulated over $ {N_\mathrm{exp}} = 10 $ experiments
Figure 6.  Left panel (a): optimal tapering coefficient $ \zeta $ across a range of ensemble sizes $ {N_\mathrm{e}} $ for the LEnKF-ML applied to the L96 model ($ {N_\mathrm{x}} = 40 $). Right panel (b): optimal tapering coefficient $ \zeta $ across a range of model state space dimensions $ {N_\mathrm{x}} $ for the LEnKF-ML applied to the L96 model, with $ {N_\mathrm{e}} = 40 $ and fixed multiplicative inflation and localization length
Figure 7.  Comparison for both models of the performance of the IEnKF-ML, where the model is unknown, with that of the traditional IEnKF, where the model is known, as a function of the time interval between updates $ \Delta t $. The absence of a data point means that at least one of the $ {N_\mathrm{e}} = 10 $ DA runs was divergent
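The captions above refer to the one-scale Lorenz-96 (L96) model, whose true tendency combines an advection term, a friction term with coefficient $-1$ and a forcing $F = 8$: $\mathrm{d}x_n/\mathrm{d}t = (x_{n+1} - x_{n-2})\,x_{n-1} - x_n + F$, with periodic indices. A self-contained sketch of this reference model follows; the RK4 integrator, time step and spin-up length are conventional choices for L96, not settings taken from the paper.

```python
import numpy as np

def l96_tendency(x, F=8.0):
    """One-scale Lorenz-96 tendency: advection, friction and forcing,
    with periodic boundary conditions handled by np.roll."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    """Classical fourth-order Runge-Kutta step."""
    k1 = l96_tendency(x, F)
    k2 = l96_tendency(x + 0.5 * dt * k1, F)
    k3 = l96_tendency(x + 0.5 * dt * k2, F)
    k4 = l96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Spin up a 40-variable trajectory from a slightly perturbed rest state;
# the perturbation grows and the trajectory settles on the attractor.
x = 8.0 * np.ones(40)
x[0] += 0.01
for _ in range(2000):          # 100 model time units at dt = 0.05
    x = rk4_step(x, 0.05)
```

After spin-up, `x` is a chaotic but bounded trajectory, the usual starting point for the twin experiments summarized in the figures.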