Articles | Volume 27, issue 3
https://doi.org/10.5194/npg-27-373-2020
© Author(s) 2020. This work is distributed under
the Creative Commons Attribution 4.0 License.
Data-driven predictions of a multiscale Lorenz 96 chaotic system using machine-learning methods: reservoir computing, artificial neural network, and long short-term memory network
Ashesh Chattopadhyay
Department of Mechanical Engineering, Rice University, Houston, TX, USA
Pedram Hassanzadeh
Department of Mechanical Engineering, Rice University, Houston, TX, USA
Department of Earth, Environmental and Planetary Sciences, Rice University, Houston, TX, USA
Devika Subramanian
Department of Electrical and Computer Engineering, Rice University, Houston, TX, USA
Department of Computer Science, Rice University, Houston, TX, USA
Related authors
Ashesh Chattopadhyay, Mustafa Mustafa, Pedram Hassanzadeh, Eviatar Bach, and Karthik Kashinath
Geosci. Model Dev., 15, 2221–2237, https://doi.org/10.5194/gmd-15-2221-2022, 2022
Short summary
There is growing interest in data-driven weather forecasting, i.e., to predict the weather by using a deep neural network that learns from the evolution of past atmospheric patterns. Here, we propose three components to add to the current data-driven weather forecast models to improve their performance. These components involve a feature that incorporates physics into the neural network, a method to add data assimilation, and an algorithm to use several different time intervals in the forecast.
Related subject area
Subject: Predictability, probabilistic forecasts, data assimilation, inverse problems | Topic: Climate, atmosphere, ocean, hydrology, cryosphere, biosphere | Techniques: Big data and artificial intelligence
Selecting and weighting dynamical models using data-driven approaches
A quest for precipitation attractors in weather radar archives
Robust weather-adaptive post-processing using model output statistics random forests
Guidance on how to improve vertical covariance localization based on a 1000-member ensemble
Weather pattern dynamics over western Europe under climate change: predictability, information entropy and production
Calibrated ensemble forecasts of the height of new snow using quantile regression forests and ensemble model output statistics
Enhancing geophysical flow machine learning performance via scale separation
Training a convolutional neural network to conserve mass in data assimilation
From research to applications – examples of operational ensemble post-processing in France using machine learning
Pierre Le Bras, Florian Sévellec, Pierre Tandeo, Juan Ruiz, and Pierre Ailliot
Nonlin. Processes Geophys., 31, 303–317, https://doi.org/10.5194/npg-31-303-2024, 2024
Short summary
The goal of this paper is to weight several dynamical models in order to improve the representation of a system. It is illustrated using a set of versions of an idealized model describing the Atlantic Meridional Overturning Circulation. The low-cost method is based on data-driven forecasts and enables model performance to be evaluated on the models' dynamics. By taking into account both model performance and model codependency, the derived weights outperform benchmarks in reconstructing a model distribution.
Loris Foresti, Bernat Puigdomènech Treserras, Daniele Nerini, Aitor Atencia, Marco Gabella, Ioannis V. Sideris, Urs Germann, and Isztar Zawadzki
Nonlin. Processes Geophys., 31, 259–286, https://doi.org/10.5194/npg-31-259-2024, 2024
Short summary
We compared two ways of defining the phase space of low-dimensional attractors describing the evolution of radar precipitation fields. The first defines the phase space by the domain-scale statistics of precipitation fields, such as their mean, spatial and temporal correlations. The second uses principal component analysis to account for the spatial distribution of precipitation. To represent different climates, radar archives over the United States and the Swiss Alpine region were used.
Thomas Muschinski, Georg J. Mayr, Achim Zeileis, and Thorsten Simon
Nonlin. Processes Geophys., 30, 503–514, https://doi.org/10.5194/npg-30-503-2023, 2023
Short summary
Statistical post-processing is necessary to generate probabilistic forecasts from physical numerical weather prediction models. To allow for more flexibility, there has been a shift in post-processing away from traditional parametric regression models towards modern machine learning methods. By fusing these two approaches, we developed model output statistics random forests, a new post-processing method that is highly flexible but at the same time also very robust and easy to interpret.
Tobias Necker, David Hinger, Philipp Johannes Griewank, Takemasa Miyoshi, and Martin Weissmann
Nonlin. Processes Geophys., 30, 13–29, https://doi.org/10.5194/npg-30-13-2023, 2023
Short summary
This study investigates vertical localization based on a convection-permitting 1000-member ensemble simulation. We derive an empirical optimal localization (EOL) that minimizes sampling error in 40-member sub-sample correlations assuming 1000-member correlations as truth. The results will provide guidance for localization in convective-scale ensemble data assimilation systems.
Stéphane Vannitsem
Nonlin. Processes Geophys., 30, 1–12, https://doi.org/10.5194/npg-30-1-2023, 2023
Short summary
The impact of climate change on weather pattern dynamics over the North Atlantic is explored through the lens of information theory. These tools allow the predictability of the succession of weather patterns and the irreversible nature of the dynamics to be clarified. It is shown that the predictability is increasing in the observations, while the opposite trend is found in model projections. The irreversibility displays an overall increase in time in both the observations and the model runs.
Guillaume Evin, Matthieu Lafaysse, Maxime Taillardat, and Michaël Zamo
Nonlin. Processes Geophys., 28, 467–480, https://doi.org/10.5194/npg-28-467-2021, 2021
Short summary
Forecasting the height of new snow is essential for avalanche hazard surveys, road and ski resort management, tourism attractiveness, etc. Météo-France operates a probabilistic forecasting system using a numerical weather prediction system and a snowpack model. It provides better forecasts than direct diagnostics but exhibits significant biases. Post-processing methods can be applied to provide automatic forecasting products from this system.
Davide Faranda, Mathieu Vrac, Pascal Yiou, Flavio Maria Emanuele Pons, Adnane Hamid, Giulia Carella, Cedric Ngoungue Langue, Soulivanh Thao, and Valerie Gautard
Nonlin. Processes Geophys., 28, 423–443, https://doi.org/10.5194/npg-28-423-2021, 2021
Short summary
Machine learning approaches are spreading rapidly in the climate sciences. They are of great help in many practical situations where using the underlying equations is difficult because of limited computational power. Here we use a systematic approach to investigate the limitations of the popular echo state network algorithms used to forecast the long-term behaviour of chaotic systems, such as the weather. Our results show that noise and intermittency greatly degrade performance.
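An echo state network of the kind examined above keeps a fixed random recurrent reservoir and trains only a linear readout by ridge regression. A minimal single-variable sketch (reservoir size, spectral radius, and regularization below are illustrative choices, not those used in the study):

```python
import numpy as np

def train_esn(u, n_res=200, rho=0.9, ridge=1e-6, seed=0):
    """Train a minimal echo state network to predict u[t+1] from u[t].

    u: 1-D training series. Returns (W, W_in, W_out).
    """
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, (n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius rho
    W_in = rng.uniform(-0.5, 0.5, n_res)
    r = np.zeros(n_res)
    states = []
    for x in u[:-1]:                         # drive the fixed reservoir with the input
        r = np.tanh(W @ r + W_in * x)
        states.append(r.copy())
    R = np.array(states)                     # (T-1, n_res) reservoir states
    y = u[1:]                                # one-step-ahead targets
    # Ridge-regression readout: W_out = (R^T R + lambda I)^{-1} R^T y
    W_out = np.linalg.solve(R.T @ R + ridge * np.eye(n_res), R.T @ y)
    return W, W_in, W_out

def esn_forecast(W, W_in, W_out, u, n_steps):
    """Synchronize the reservoir on u, then free-run for n_steps by
    feeding each prediction back as the next input."""
    r = np.zeros(W.shape[0])
    for x in u[:-1]:
        r = np.tanh(W @ r + W_in * x)
    x = u[-1]
    out = []
    for _ in range(n_steps):
        r = np.tanh(W @ r + W_in * x)
        x = r @ W_out
        out.append(x)
    return np.array(out)
```

Only `W_out` is learned; the recurrent weights stay random and fixed, which is what makes training a single linear solve rather than backpropagation through time.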
Yvonne Ruckstuhl, Tijana Janjić, and Stephan Rasp
Nonlin. Processes Geophys., 28, 111–119, https://doi.org/10.5194/npg-28-111-2021, 2021
Short summary
The assimilation of observations using standard algorithms can lead to a violation of physical laws (e.g. mass conservation), which is shown to have a detrimental impact on the system's forecast. We use a neural network (NN) to correct this mass violation, using training data generated from expensive algorithms that can constrain such physical properties. We found that, in an idealized set-up, the NN can match the performance of these expensive algorithms at negligible computational costs.
Maxime Taillardat and Olivier Mestre
Nonlin. Processes Geophys., 27, 329–347, https://doi.org/10.5194/npg-27-329-2020, 2020
Short summary
Statistical post-processing of ensemble forecasts is now a well-known procedure for correcting biased and misdispersed ensemble weather predictions. But practical application in European national weather services is in its infancy. Different applications of ensemble post-processing using machine learning at an industrial scale are presented. Forecast quality and value are improved compared to the raw ensemble, but several adjustments are needed to meet operational constraints.
Short summary
The performance of three machine-learning methods for data-driven modeling of a multiscale chaotic Lorenz 96 system is examined. One of the methods is found to predict the future evolution of the chaotic system well using only past observations of the large-scale component of the multiscale state vector. Potential applications to data-driven and data-assisted surrogate modeling of complex dynamical systems, such as weather and climate, are discussed.
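For reference, the large-scale component of the model studied here follows the classic Lorenz 96 form, dX_k/dt = (X_{k+1} - X_{k-2}) X_{k-1} - X_k + F with cyclic indices. A minimal single-level integration sketch (K = 40 variables, forcing F = 8, and the step size are standard textbook choices, not necessarily the paper's multiscale configuration):

```python
import numpy as np

def l96_tendency(x, F=8.0):
    """Single-level Lorenz 96 tendency: dx_k/dt = (x_{k+1} - x_{k-2}) x_{k-1} - x_k + F,
    with cyclic indices handled by np.roll."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def l96_trajectory(x0, dt=0.01, n_steps=2000, F=8.0):
    """Integrate with classical fourth-order Runge-Kutta; returns a (n_steps+1, K) array."""
    x = np.asarray(x0, dtype=float)
    traj = np.empty((n_steps + 1, x.size))
    traj[0] = x
    for n in range(n_steps):
        k1 = l96_tendency(x, F)
        k2 = l96_tendency(x + 0.5 * dt * k1, F)
        k3 = l96_tendency(x + 0.5 * dt * k2, F)
        k4 = l96_tendency(x + dt * k3, F)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[n + 1] = x
    return traj

# Chaos develops from a small perturbation off the unstable fixed point x_k = F.
x0 = 8.0 * np.ones(40)
x0[0] += 0.01
traj = l96_trajectory(x0)
```

Trajectories like `traj` (or its multiscale analogue) are the training and verification data for the data-driven methods compared in the paper.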