Volume 31, issue 3
https://doi.org/10.5194/npg-31-409-2024
Research article | Highlight paper | 19 Sep 2024

Representation learning with unconditional denoising diffusion models for dynamical systems

Tobias Sebastian Finn, Lucas Disson, Alban Farchi, Marc Bocquet, and Charlotte Durand


Interactive discussion

Status: closed

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on egusphere-2023-2261', Sibo Cheng, 12 Mar 2024
    • AC1: 'Reply on RC1', Tobias Finn, 24 May 2024
  • RC2: 'Comment on egusphere-2023-2261', Anonymous Referee #2, 03 Apr 2024
    • AC2: 'Reply on RC2', Tobias Finn, 24 May 2024

Peer review completion

AR: Author's response | RR: Referee report | ED: Editor decision | EF: Editorial file upload
AR by Tobias Finn on behalf of the Authors (20 Jun 2024)
ED: Publish as is (12 Jul 2024) by Ioulia Tchiguirinskaia
AR by Tobias Finn on behalf of the Authors (16 Jul 2024)
Executive editor
This paper tests the ability of artificial intelligence methods, and more specifically deep learning, to remove the Gaussian noise that disturbs the data of a dynamical system. The authors demonstrate this on a highly chaotic model, which serves as a hard test case.
Short summary
We train neural networks as denoising diffusion models for state generation in the Lorenz 1963 system and demonstrate that they learn an internal representation of the system. We make use of this learned representation and the pre-trained model in two downstream tasks: surrogate modelling and ensemble generation. For both tasks, the diffusion model can outperform other, more common approaches. Thus, we see potential for representation learning with diffusion models in dynamical systems.
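
As a rough illustration of the setting described in the summary, the sketch below generates trajectories of the Lorenz 1963 system and applies the forward (noising) step of a standard denoising diffusion model to the sampled states. The integration scheme, spin-up length, number of diffusion steps, and variance schedule are illustrative assumptions, not the configuration used in the paper, and the denoising network itself is only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz 1963 equations with the classical parameters."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(state, dt=0.01):
    """One fourth-order Runge-Kutta integration step."""
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Build a set of system states by integrating past a spin-up period
# (trajectory length and spin-up are assumptions for illustration).
state = np.array([1.0, 1.0, 1.0])
states = []
for step in range(20000):
    state = rk4_step(state)
    if step >= 1000:  # discard spin-up transient
        states.append(state)
states = np.stack(states)

# Standard DDPM forward process:
#   x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,
# with a linear beta schedule (schedule values are illustrative).
T = 1000
betas = np.linspace(1e-4, 2e-2, T)
alpha_bars = np.cumprod(1.0 - betas)

def noise_states(x0, t):
    """Return noised samples x_t and the noise targets eps for diffusion step t."""
    eps = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return x_t, eps

x_t, eps = noise_states(states, t=500)
# A denoising network would be trained to predict eps from (x_t, t);
# its internal features then serve as the learned representation that
# downstream tasks such as surrogate modelling or ensemble generation reuse.
```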