Master's Thesis Defense by Sophia Wilson
Title: Quantifying the Reduction in Carbon Footprint of Physics-Informed Machine Learning
Abstract:
Machine learning (ML) models often rely on large datasets and over-parameterized architectures, resulting in high energy consumption and significant greenhouse gas emissions. This thesis explores how physics-informed machine learning (PIML) can reduce energy usage while maintaining or improving predictive performance.
PIML integrates domain knowledge such as physical laws or principles into ML models to enhance their accuracy and physical consistency. This work examines PIML at the component level through periodic activation functions, periodic padding, customized output layers, and physics-informed loss functions.
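For illustration only, below is a minimal sketch of two such components: a periodic (sine) activation and a physics-informed loss that adds a soft incompressibility penalty to the usual data term. The tiny network, the penalty weight lam, and all names in the snippet are assumptions made for the example, not the thesis implementation.

import torch
import torch.nn as nn

class Sine(nn.Module):
    # Periodic activation: useful when the underlying signal is periodic.
    def forward(self, x):
        return torch.sin(x)

# Tiny MLP mapping 2D coordinates to a 2D velocity field (u, v) -- an illustrative stand-in.
model = nn.Sequential(nn.Linear(2, 64), Sine(), nn.Linear(64, 64), Sine(), nn.Linear(64, 2))

def physics_informed_loss(xy, target, lam=0.1):
    # Data term plus a soft penalty on the divergence du/dx + dv/dy (incompressibility).
    xy = xy.requires_grad_(True)
    uv = model(xy)
    data_loss = torch.mean((uv - target) ** 2)
    du = torch.autograd.grad(uv[:, 0].sum(), xy, create_graph=True)[0][:, 0]
    dv = torch.autograd.grad(uv[:, 1].sum(), xy, create_graph=True)[0][:, 1]
    return data_loss + lam * torch.mean((du + dv) ** 2)

loss = physics_informed_loss(torch.rand(128, 2), torch.zeros(128, 2))
loss.backward()  # gradients flow through both the data term and the physics term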
At the architectural level, we investigate Flow Matching (FM) for spatiotemporal forecasting, motivated by its time-continuous and operator-learning properties that align with the evolution of physical systems. We propose a novel application of FM, implemented directly in low-resolution data space to avoid the training cost of an autoencoder. The FM framework is evaluated on a 2D incompressible shear flow simulation governed by the Navier–Stokes equations. Our model demonstrates strong one-step forecasting performance, stable rollout behaviour, and moderate energy consumption compared to baselines. It is, however, inherently limited by its fixed encoding scheme and low-resolution input.
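As a rough sketch of the Flow Matching idea applied directly in low-resolution data space (no autoencoder), the snippet below regresses a velocity network onto the straight-line velocity between Gaussian noise and the true next frame, conditioned on the previous frame. The network size, the 32x32 field shape, and the linear interpolation paths are assumptions for the example, not the thesis setup.

import torch
import torch.nn as nn

D = 32 * 32  # flattened low-resolution field, e.g. a 32x32 vorticity grid (assumption)
# Velocity network v(x_t, t, prev_frame): input = noisy state + conditioning frame + time.
v_net = nn.Sequential(nn.Linear(2 * D + 1, 256), nn.SiLU(), nn.Linear(256, D))
opt = torch.optim.Adam(v_net.parameters(), lr=1e-3)

def flow_matching_step(prev_frame, next_frame):
    x1 = next_frame                    # data endpoint (true next frame)
    x0 = torch.randn_like(x1)          # noise endpoint
    t = torch.rand(x1.shape[0], 1)     # random time in [0, 1]
    x_t = (1 - t) * x0 + t * x1        # point on the linear path between x0 and x1
    target_v = x1 - x0                 # constant velocity along that path
    pred_v = v_net(torch.cat([x_t, prev_frame, t], dim=-1))
    loss = torch.mean((pred_v - target_v) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

flow_matching_step(torch.randn(8, D), torch.randn(8, D))  # one training step on a placeholder batch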
Weather forecasting and climate modelling are identified as particularly promising domains for PIML, given their reliance on physical laws, abundance of data, and high computational demands. We show that ML models incorporating physics-based inductive biases can effectively capture the complex, non-linear dynamics of shear flow.
This work contributes to the growing field of energy-efficient ML by demonstrating how physically grounded design choices lead to models with lower energy consumption. We hope it encourages further research into PIML methods for weather and climate applications, and more broadly into the development of ML models with lower carbon footprints.
Supervisors: Jens Hesselbjerg Christensen and Raghavendra Selvan (Sustainable ML, DIKU)
Censor: Peter L. Langen (Aarhus Universitet)