A transformer neural network for predicting near-surface temperature

Research output: Contribution to journal › Journal article › Research › peer-reviewed

Standard

A transformer neural network for predicting near-surface temperature. / Alerskans, Emy; Nyborg, Joachim; Birk, Morten; Kaas, Eigil.

In: Meteorological Applications, Vol. 29, No. 5, 2098, 09.2022.


Harvard

Alerskans, E, Nyborg, J, Birk, M & Kaas, E 2022, 'A transformer neural network for predicting near-surface temperature', Meteorological Applications, vol. 29, no. 5, 2098. https://doi.org/10.1002/met.2098

APA

Alerskans, E., Nyborg, J., Birk, M., & Kaas, E. (2022). A transformer neural network for predicting near-surface temperature. Meteorological Applications, 29(5), [2098]. https://doi.org/10.1002/met.2098

Vancouver

Alerskans E, Nyborg J, Birk M, Kaas E. A transformer neural network for predicting near-surface temperature. Meteorological Applications. 2022 Sep;29(5):2098. https://doi.org/10.1002/met.2098

Author

Alerskans, Emy ; Nyborg, Joachim ; Birk, Morten ; Kaas, Eigil. / A transformer neural network for predicting near-surface temperature. In: Meteorological Applications. 2022 ; Vol. 29, No. 5.

Bibtex

@article{ad77919e83cf4e618fc1181274d9aa9b,
title = "A transformer neural network for predicting near-surface temperature",
abstract = "A new method based on the Transformer model is proposed for post-processing of numerical weather prediction (NWP) forecasts of 2 m air temperature. The Transformer is a machine learning (ML) model based on self-attention, which extracts information about which inputs are most important for the prediction. It is trained using time series input from NWP variables and crowd-sourced 2 m air temperature observations from more than 1000 private weather stations (PWSs). The performance of the new post-processing model is evaluated using both observational data from PWSs and completely independent observations from the Danish Meteorological Institute (DMI) network of surface synoptic observations (SYNOP) stations. The performance of the Transformer model is compared against the raw NWP forecast, as well as against two benchmark post-processing models: a linear regression (LR) model and a neural network (NN). The results evaluated using PWS observations show an improvement in the 2 m temperature forecasts with respect to both bias and standard deviation (STD) for all three post-processing models, with the Transformer model showing the largest improvement. The raw NWP forecast, LR, NN and Transformer model have a bias and STD of 0.34 and 1.96 degrees C, 0.03 and 1.63 degrees C, 0.10 and 1.53 degrees C and 0.02 and 1.13 degrees C, respectively. The corresponding results using DMI SYNOP stations also show improved forecasts, where the Transformer model performs better than both the raw NWP forecast and the two benchmark models. However, a dependence on distance to the coast and cold temperatures is observed.",
keywords = "machine learning, NWP, post-processing, private weather stations, NUMERICAL WEATHER FORECASTS, MODEL OUTPUT STATISTICS, ENSEMBLE FORECASTS, KALMAN FILTER, WIND-SPEED, MOS, SYSTEM, TERRAIN, REGION",
author = "Emy Alerskans and Joachim Nyborg and Morten Birk and Eigil Kaas",
year = "2022",
month = sep,
doi = "10.1002/met.2098",
language = "English",
volume = "29",
journal = "Meteorological Applications",
issn = "1350-4827",
publisher = "John Wiley \& Sons Ltd",
number = "5",
pages = "2098",
}
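The bias and standard deviation (STD) figures quoted in the abstract are the mean and spread of the forecast-minus-observation error. A minimal sketch of that computation (the function name and the toy values are illustrative, not data from the paper):

```python
import numpy as np

def bias_and_std(forecast, observed):
    """Return (bias, STD) of the error between forecast and observed
    2 m temperature, where error = forecast - observation."""
    errors = np.asarray(forecast, dtype=float) - np.asarray(observed, dtype=float)
    return errors.mean(), errors.std(ddof=0)

# Toy paired series in kelvin (illustrative values only)
fc = [271.2, 272.0, 270.5, 273.1]
ob = [271.0, 271.5, 270.9, 272.8]
bias, std = bias_and_std(fc, ob)
```

A post-processing model improves on the raw NWP forecast when it shifts the bias toward zero and shrinks the STD, which is the comparison the abstract reports for the LR, NN, and Transformer models.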

RIS

TY - JOUR

T1 - A transformer neural network for predicting near-surface temperature

AU - Alerskans, Emy

AU - Nyborg, Joachim

AU - Birk, Morten

AU - Kaas, Eigil

PY - 2022/9

Y1 - 2022/9

N2 - A new method based on the Transformer model is proposed for post-processing of numerical weather prediction (NWP) forecasts of 2 m air temperature. The Transformer is a machine learning (ML) model based on self-attention, which extracts information about which inputs are most important for the prediction. It is trained using time series input from NWP variables and crowd-sourced 2 m air temperature observations from more than 1000 private weather stations (PWSs). The performance of the new post-processing model is evaluated using both observational data from PWSs and completely independent observations from the Danish Meteorological Institute (DMI) network of surface synoptic observations (SYNOP) stations. The performance of the Transformer model is compared against the raw NWP forecast, as well as against two benchmark post-processing models: a linear regression (LR) model and a neural network (NN). The results evaluated using PWS observations show an improvement in the 2 m temperature forecasts with respect to both bias and standard deviation (STD) for all three post-processing models, with the Transformer model showing the largest improvement. The raw NWP forecast, LR, NN and Transformer model have a bias and STD of 0.34 and 1.96 degrees C, 0.03 and 1.63 degrees C, 0.10 and 1.53 degrees C and 0.02 and 1.13 degrees C, respectively. The corresponding results using DMI SYNOP stations also show improved forecasts, where the Transformer model performs better than both the raw NWP forecast and the two benchmark models. However, a dependence on distance to the coast and cold temperatures is observed.

AB - A new method based on the Transformer model is proposed for post-processing of numerical weather prediction (NWP) forecasts of 2 m air temperature. The Transformer is a machine learning (ML) model based on self-attention, which extracts information about which inputs are most important for the prediction. It is trained using time series input from NWP variables and crowd-sourced 2 m air temperature observations from more than 1000 private weather stations (PWSs). The performance of the new post-processing model is evaluated using both observational data from PWSs and completely independent observations from the Danish Meteorological Institute (DMI) network of surface synoptic observations (SYNOP) stations. The performance of the Transformer model is compared against the raw NWP forecast, as well as against two benchmark post-processing models: a linear regression (LR) model and a neural network (NN). The results evaluated using PWS observations show an improvement in the 2 m temperature forecasts with respect to both bias and standard deviation (STD) for all three post-processing models, with the Transformer model showing the largest improvement. The raw NWP forecast, LR, NN and Transformer model have a bias and STD of 0.34 and 1.96 degrees C, 0.03 and 1.63 degrees C, 0.10 and 1.53 degrees C and 0.02 and 1.13 degrees C, respectively. The corresponding results using DMI SYNOP stations also show improved forecasts, where the Transformer model performs better than both the raw NWP forecast and the two benchmark models. However, a dependence on distance to the coast and cold temperatures is observed.

KW - machine learning

KW - NWP

KW - post-processing

KW - private weather stations

KW - NUMERICAL WEATHER FORECASTS

KW - MODEL OUTPUT STATISTICS

KW - ENSEMBLE FORECASTS

KW - KALMAN FILTER

KW - WIND-SPEED

KW - MOS

KW - SYSTEM

KW - TERRAIN

KW - REGION

U2 - 10.1002/met.2098

DO - 10.1002/met.2098

M3 - Journal article

VL - 29

JO - Meteorological Applications

JF - Meteorological Applications

SN - 1350-4827

IS - 5

M1 - 2098

ER -

ID: 323614609