This is a Preprint and has not been peer reviewed. The published version of this Preprint is available: https://doi.org/10.2166/wst.2024.110. This is version 1 of this Preprint.
Abstract
In this paper, we address the critical task of 24-hour streamflow forecasting using advanced deep-learning models, with a primary focus on the Transformer architecture, which has seen limited application in this task. We compare the performance of five models: Persistence, LSTM, Seq2Seq, GRU, and Transformer, across four distinct regions. The evaluation is based on three performance metrics: Nash-Sutcliffe Efficiency (NSE), Pearson's r, and Normalized Root Mean Square Error (NRMSE). Additionally, we investigate the impact of two data-extension methods, zero-padding and persistence, on the models' predictive capabilities. Our findings highlight the Transformer's superiority in capturing complex temporal dependencies and patterns in the streamflow data, outperforming all other models in both accuracy and reliability. The study's insights emphasize the value of leveraging advanced deep-learning techniques, such as the Transformer, in hydrological modeling and streamflow forecasting for effective water-resource management and flood prediction.
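The three evaluation metrics named above can be sketched as follows. This is a minimal illustration, not code from the paper; note that NRMSE has several normalization conventions (by the observed range, mean, or standard deviation), and the choice here of normalizing by the observed range is an assumption.

```python
import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe Efficiency: 1 - (sum of squared errors) /
    # (sum of squared deviations of observations from their mean).
    # 1 is a perfect fit; 0 matches a mean-only prediction.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pearson_r(obs, sim):
    # Pearson's linear correlation coefficient between observed
    # and simulated streamflow.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.corrcoef(obs, sim)[0, 1]

def nrmse(obs, sim):
    # RMSE normalized by the observed range (one common convention;
    # the paper may use a different normalization).
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    rmse = np.sqrt(np.mean((obs - sim) ** 2))
    return rmse / (obs.max() - obs.min())
```

A perfect forecast gives NSE = 1, r = 1, and NRMSE = 0, while a forecast equal to the observed mean gives NSE = 0; the Persistence baseline (tomorrow equals today) is the conventional floor that the deep-learning models must beat.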
DOI
https://doi.org/10.31223/X5DM4V
Subjects
Environmental Engineering, Hydraulic Engineering
Keywords
Rainfall-runoff modeling, deep learning, flood forecasting, Transformers, streamflow forecasting
Dates
Published: 2023-09-12 20:05