Commit 80f7640

Merge pull request #109 from johannag126/master
update news, publications, team
2 parents 788cccd + bb251fd commit 80f7640

File tree

14 files changed (+104 -3 lines)


content/blog/machinelearning.md

Lines changed: 14 additions & 3 deletions
@@ -8,7 +8,7 @@ heroSubHeading: ''
heroBackground: '/images/retrosupply-jLwVAUtLOAQ-unsplash.jpeg'
---

-Machine learning methodologies, particularly neural networks, are computer algorithms inspired by the structure and function of the human brain. These algorithms are capable of learning from data, identifying patterns, and making predictions or decisions without being explicitly programmed. Neural networks consist of interconnected nodes organized into layers, including input, hidden, and output layers. During training, they adjust the strength of connections between nodes, called weights, based on examples from a dataset, allowing them to learn complex patterns and relationships in the data. Neural networks are widely used in various applications, including image recognition, natural language processing, and climate modeling.
+Machine learning methodologies, particularly neural networks, are computer algorithms inspired by the structure and function of the human brain. These algorithms are capable of learning from data, identifying patterns, and making predictions or decisions without being explicitly programmed. Neural networks consist of interconnected nodes organized into layers, including input, hidden, and output layers (see figure below). During training, they adjust the strength of connections between nodes, called weights, based on examples from a dataset, allowing them to learn complex patterns and relationships in the data. Neural networks are widely used in various applications, including image recognition, natural language processing, and climate modeling.

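The training loop described above, adjusting weights from examples, can be illustrated with a minimal NumPy sketch of a one-hidden-layer network fitted by gradient descent. The dataset and all names here are synthetic illustrations, not code from the project:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn y = x1 * x2, a pattern no single linear layer can represent.
X = rng.uniform(-1, 1, size=(256, 2))
y = (X[:, 0] * X[:, 1])[:, None]

# One hidden layer of 16 tanh units: input -> hidden -> output.
W1 = 0.5 * rng.standard_normal((2, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal((16, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(5000):
    h = np.tanh(X @ W1 + b1)            # hidden-layer activations
    pred = h @ W2 + b2                  # network output
    err = pred - y
    # Backpropagation: gradients of the mean-squared error w.r.t. each weight.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)    # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float((err**2).mean())
```

After training, the connection strengths have been adjusted so the network reproduces the nonlinear input-output relationship far better than at initialization.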
<center>
@@ -18,17 +18,28 @@ Machine learning methodologies, particularly neural networks, are computer algor

<h3 style="text-align: center;">Machine learning for climate modeling</h3>

-Climate models often face challenges in representing complex processes of large-scale simulations. Machine learning presents innovative approaches to confront these obstacles. It provides new methodologies to learn missing model physics and model errors directly from data. Moreover, machine learning holds the potential to act as a feasible substitute for emulating the complete dynamics of models, thus offering an alternative to traditional climate modeling approaches. The next three paragraphs describe three applications of machine learning for climate modeling that are pursued within the M<sup>2</sup>LInES project.
+Climate models often face challenges in representing complex processes and managing the computational demands of large-scale simulations. Machine learning presents innovative approaches to confront these obstacles. It provides new methodologies to learn missing model physics and model errors directly from data. Moreover, machine learning holds the potential to act as a feasible substitute for emulating the complete dynamics of models, thus offering an alternative to traditional climate modeling approaches. The next three paragraphs describe three applications of machine learning for climate modeling that are pursued within the M²LInES project.

<h3 style="text-align: center;"> Learning missing physics (“parameterization learning”)</h3>

One of the key applications of machine learning in climate modeling is in parameterization learning. Parameterizations are used in climate models to represent subgrid-scale processes that are unresolved at the model's grid scale. Machine learning algorithms can be trained on high-resolution simulations to learn these parameterizations directly from data, enabling more accurate representation of complex physical processes such as clouds, precipitation, and turbulence.

+Various approaches are pursued in M²LInES to learn parameterizations of ocean mesoscale eddies.
+
+The training data for mesoscale parameterizations are generated from idealized simulations in the MOM6 ocean model (projects of Everard and Balwada), idealized simulations in the MITgcm ocean model (Zanna & Bolton 2020), idealized simulations in a two-layer quasi-geostrophic (QG) ocean model (Ross et al., 2023; Perezhogin, Zanna & Fernandez-Granda 2023), and coupled climate models (Guillaumin & Zanna 2021; a new project of Perezhogin).
+
+Multiple approaches to representing the parameterization, as a function of coarse-grained input variables, are considered. These include equation-discovery models (Zanna & Bolton 2020; Perezhogin et al., 2024), small local fully connected neural networks (projects of Perezhogin, Everard, and Balwada), convolutional neural networks (CNNs; Guillaumin & Zanna 2021; Zhang et al., 2023; Zhang et al., 2024), and generative models (Perezhogin, Zanna & Fernandez-Granda 2023).
+
+The parameterized physics also differs across projects. In Ross et al. (2023) and Perezhogin, Zanna & Fernandez-Granda (2023), both momentum and buoyancy subgrid forcings are combined into a potential vorticity forcing. In Perezhogin et al. (2024), the momentum fluxes are parameterized. In Balwada's project, the buoyancy subgrid forcing is parameterized. Finally, in Everard's project, buoyancy and momentum effects are combined into a single parameterization using the Eliassen-Palm flux and thickness-weighted averaging.
+
+Five parameterizations (equation discovery, the neural networks of Perezhogin, Everard, and Balwada, and the CNN) are implemented and tested in the MOM6 ocean model in various configurations: the idealized Double Gyre and NeverWorld2 setups, and the realistic global ocean model OM4 at eddy-permitting resolution (¼ degree). Improved representation of the kinetic and available potential energy and of the mean flow was demonstrated in the idealized configurations (Perezhogin et al., 2024). In the global ocean model, the mesoscale parameterization reduces local biases such as the North Atlantic cold bias and enhances the systematic effects of mesoscale eddies, such as northward heat transport and ocean restratification (i.e., interior cooling).
+
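The data-generation step, filtering a high-resolution simulation to produce coarse-grained inputs and subgrid-forcing targets, can be sketched as follows. Random fields stand in for model output and the function names are illustrative, not those of any M²LInES code:

```python
import numpy as np

def coarse_grain(field, factor):
    """Block-average a 2D field by `factor` in each direction."""
    ny, nx = field.shape
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

def subgrid_forcing(u, v, factor):
    """Eddy-flux proxy: bar(u*v) - bar(u)*bar(v), the part lost to coarse-graining."""
    return coarse_grain(u * v, factor) - coarse_grain(u, factor) * coarse_grain(v, factor)

# Synthetic "high-resolution" velocity snapshots on a 64x64 grid.
rng = np.random.default_rng(0)
u = rng.standard_normal((64, 64))
v = rng.standard_normal((64, 64))

# One training pair for a local parameterization: coarse inputs -> subgrid target.
X = np.stack([coarse_grain(u, 4), coarse_grain(v, 4)], axis=-1)
y = subgrid_forcing(u, v, 4)
```

A neural network (local, convolutional, or generative) is then trained to map `X` to `y`; for a uniform field the subgrid term vanishes, as it should.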
<h3 style="text-align: center;">Learning model error</h3>

Machine learning algorithms can also be used to understand and correct biases that are due to the combined error of physics and numerics. One way to learn this combined error is to use analysis increments as a training dataset. Analysis increments represent the adjustments made to a model to bring it closer to observations during the data assimilation process. The information contained in analysis increments therefore allows for the development of correction schemes that improve the reliability and accuracy of model predictions.

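The idea can be sketched with a toy scalar model whose systematic bias is recovered from analysis increments by regression. Everything below is synthetic for illustration; real schemes train on assimilation output from a full model:

```python
import numpy as np

rng = np.random.default_rng(1)

def biased_model(x):
    return 0.9 * x + 0.5   # toy forecast model with a constant bias

def true_dynamics(x):
    return 0.9 * x         # unbiased "truth"

x = rng.standard_normal(500)
forecast = biased_model(x)
analysis = true_dynamics(x)          # stand-in for the assimilated (corrected) state
increments = analysis - forecast     # analysis increments: nudges toward observations

# Learn the correction as a linear function of the forecast state.
A = np.c_[forecast, np.ones_like(forecast)]
coef, *_ = np.linalg.lstsq(A, increments, rcond=None)
corrected = forecast + A @ coef
```

The regression recovers the -0.5 bias, and applying the learned correction at each step removes the systematic drift.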
<h3 style="text-align: center;">Emulation of the full model dynamics</h3>

-Another important application of machine learning in climate modeling is the development of emulators. Climate model emulators are surrogate models that mimic the behavior of complex climate models. By capturing the essential features and relationships within the original models, emulators provide a computationally efficient alternative for exploring climate model outputs.
+Another important application of machine learning in climate modeling is the development of emulators. Climate model emulators are surrogate models that mimic the behavior of complex climate models. These emulators are trained on data generated by climate models to approximate their responses to different inputs. By capturing the essential features and relationships within the original models, emulators provide a computationally efficient alternative for exploring climate model outputs.
+
+Samudra (Dheeshjith et al., 2024) is a global ocean emulator that is skillful at emulating the contemporary ocean, building on our earlier regional surface emulators for climate change (e.g., Subel and Zanna, 2024; Dheeshjith et al., 2024). Without further modification, Samudra could be used in studies requiring large ensembles (e.g., extreme events) or to enhance and accelerate operational applications (e.g., data assimilation). Samudra is more than a proof of concept for affordable emulation of expensive ocean circulation models and could be used off the shelf for many applications. It will allow us to accelerate climate modeling and research.
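The emulation recipe, learning a map from one model state to the next and rolling it out autoregressively, reduces in the simplest linear case to the sketch below (a toy two-variable system, not Samudra itself):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "climate model": damped rotation x_{t+1} = M @ x_t.
M = np.array([[0.95, 0.05],
              [-0.05, 0.95]])
traj = [rng.standard_normal(2)]
for _ in range(200):
    traj.append(M @ traj[-1])
traj = np.asarray(traj)

# Fit the emulator x_{t+1} ~ x_t @ B by least squares on the trajectory.
X, Y = traj[:-1], traj[1:]
B, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Autoregressive rollout from the initial state only.
x = traj[0]
for _ in range(len(traj) - 1):
    x = x @ B
```

A deep emulator replaces the linear map `B` with a neural network, but the train-then-roll-out structure is the same.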

content/news/2503Brettin.md

Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
---
date: 2025-03-04T09:29:16+10:00
title: "Learning Propagators for Sea Surface Height Forecasts Using Koopman Autoencoders"
heroHeading: ''
heroSubHeading: 'Learning Propagators for Sea Surface Height Forecasts Using Koopman Autoencoders'
heroBackground: ''
thumbnail: 'images/news/Brettin2024.png'
images: ['images/news/2Brettin2024.png']
link: 'https://doi.org/10.1029/2024GL112835'
---
Sea surface height forecasts are influenced by many uncertainties. Traditional statistical methods help make predictions but often rely on assumptions that don’t fully capture the climate system's complexity. In this [paper](https://doi.org/10.1029/2024GL112835), **Andrew Brettin** and co-authors develop a machine learning model that learns a simplified representation of the climate system, **improving sea surface height predictions**. Their approach **outperforms methods that separate data compression and prediction**. Additionally, the model highlights key regions where better sea level representation can enhance regional forecasts.
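The Koopman-autoencoder recipe, encode the state, advance it with a single linear propagator, decode, can be sketched with a linear (PCA-based) stand-in for the learned encoder. The "sea-surface-height" data here are synthetic, and this is not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "SSH" snapshots: two oscillating spatial patterns, shape (time, space).
t = np.arange(300)
modes = rng.standard_normal((2, 50))
data = np.outer(np.sin(0.1 * t), modes[0]) + np.outer(np.cos(0.1 * t), modes[1])

# Encoder/decoder: leading PCA modes as a linear stand-in for the autoencoder.
U, s, Vt = np.linalg.svd(data, full_matrices=False)
enc = Vt[:2]                  # encode: z = x @ enc.T ; decode: x = z @ enc
Z = data @ enc.T

# Koopman propagator: one linear map advancing the latent state a single step.
K, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)

# Forecast: encode the initial state, propagate in latent space, decode.
z = Z[0]
for _ in range(len(t) - 1):
    z = z @ K
forecast = z @ enc
```

Because the oscillation is linear in the latent coordinates, repeated application of `K` reproduces the final snapshot; the paper's contribution is learning the nonlinear encoder jointly with the propagator.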

content/news/2503Connolly.md

Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
---
date: 2025-03-04T09:29:16+10:00
title: "Deep Learning Turbulence Closures Generalize Best With Physics-based Methods"
heroHeading: ''
heroSubHeading: 'Deep Learning Turbulence Closures Generalize Best With Physics-based Methods'
heroBackground: ''
thumbnail: 'images/news/2503Connolly.gif'
images: ['images/news/2503Connolly.gif']
link: 'https://doi.org/10.22541/essoar.173869578.80400701/v1'
---
Representing atmospheric turbulence in climate models requires estimating unresolved small-scale forces. This [study](https://doi.org/10.22541/essoar.173869578.80400701/v1), led by **Alex Connolly**, uses a deep neural network (DNN) to improve turbulence modeling by predicting these forces in large-eddy simulations (LES) of the atmospheric boundary layer. The DNN is trained on high-resolution data and tested across different conditions. Results show that **models using physics-based scaling perform better than those relying only on statistical normalization**. Embedding physical symmetries into the model further improves accuracy. This research **highlights the importance of physics-informed machine learning for improving turbulence representation in Earth system models for more reliable climate projections**.
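The contrast between statistical normalization and physics-based scaling can be illustrated on toy data from two boundary-layer "regimes" with different friction velocities (all values invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

# Two regimes with different friction velocity u*: train on one, test on the other.
u_star_train, u_star_test = 0.3, 0.6
w_train = u_star_train * rng.standard_normal(1000)  # vertical-velocity fluctuations
w_test = u_star_test * rng.standard_normal(1000)

# Statistical normalization: z-score the test data with *training* statistics.
z_stat = (w_test - w_train.mean()) / w_train.std()

# Physics-based scaling: nondimensionalize each regime by its own u*.
z_phys = w_test / u_star_test

# z_phys stays O(1) out of sample; z_stat does not, so a network trained on
# z-scored inputs sees out-of-distribution magnitudes in the new regime.
```

This is the generalization failure mode the study attributes to purely statistical normalization: physically scaled inputs keep unseen regimes within the training distribution.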

content/publications/_index.md

Lines changed: 12 additions & 0 deletions
@@ -13,6 +13,18 @@ You can also check all our publications on our **[Google Scholar profile](https:

<img src="/images/newlogo.png" style="width: 1.5vw; height: 1.5vw; vertical-align: middle;" alt="DOI icon"> M²LInES funded research
+### 2025
+<div style="display: flex; align-items: center;">
+<div style="width: 100px; height: 100px; overflow: hidden; margin-right: 10px;">
+<img src="/images/news/2503Connolly.gif" style="width: 100px; height: 100px;">
+</div>
+<p>
+<img src="/images/newlogo.png" style="width: 1.5vw; height: 1.5vw; vertical-align: middle;" alt="DOI icon">
+<strong>Alex Connolly, Yu Cheng, Robin Walters, Rui Wang, Rose Yu, Pierre Gentine</strong><br>
+<a href="https://doi.org/10.22541/essoar.173869578.80400701/v1" target="_blank"><strong>Deep Learning Turbulence Closures Generalize Best With Physics-based Methods</strong></a><br>
+<i>ESSOAr</i> <strong>DOI</strong>: 10.22541/essoar.173869578.80400701/v1
+</p>
+</div>

### 2024
<div style="display: flex; align-items: center;">

content/team/LinusVogt.md

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
title: "Linus Vogt"
draft: false
image: "/images/team/Linus.png"
jobtitle: "Affiliate"
promoted: true
weight: 59
Website: https://linusvogt.github.io/
Position: Ocean tracer uptake
tags: [Ocean, Coupled Physics]
---


NYU

content/team/MattPudig.md

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
title: "Matt Pudig"
draft: false
image: "/images/team/MattP.jpg"
jobtitle: "Affiliate"
promoted: true
weight: 57
Website: https://mpudig.github.io/
Position: Geophysical turbulence
tags: [Ocean, Climate Model Development]
---


NYU

content/team/MatthieuBlanke.md

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
title: "Matthieu Blanke"
draft: false
image: "/images/team/MatthieuB.jpg"
jobtitle: "Affiliate"
promoted: true
weight: 56
Website: https://mb-29.github.io/
Position:
tags: [Atmosphere, Machine Learning, Data Assimilation]
---


NYU

content/team/QiLiu.md

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
title: "Qi Liu"
draft: false
image: "/images/team/Qi-Liu.jpeg"
jobtitle: "Undergraduate"
promoted: true
weight: 58
Website:
Position:
tags: [Atmosphere, Machine Learning]
---


NYU

static/images/news/Brettin2024.png

423 KB
