Attention-Based Deep Learning for Runoff Forecasting: Evaluating the Temporal Fusion Transformer Against Traditional Machine Learning Models

This is a Preprint and has not been peer reviewed. This is version 1 of this Preprint.


Authors

Gunjan Kumar Mishra

Abstract

Reliable runoff forecasting is critical for water management and flood preparedness in Nepal's steep, data-scarce catchments. Traditional models such as SWAT provide process insights but demand extensive calibration and detailed inputs often unavailable in such regions. Recent advances in attention-based deep learning offer new opportunities to capture temporal dependencies with improved interpretability. This study evaluates the Temporal Fusion Transformer (TFT) for monthly runoff prediction using 40 years (1980–2020) of hydrometeorological data from Nepal, benchmarked against Random Forest (RF) and Long Short-Term Memory (LSTM) networks. Results show that RF underestimates peaks, LSTM captures seasonality but falters under monsoon extremes, while TFT consistently achieves superior accuracy (RMSE = 22.5, R² = 0.88). Attention weights further reveal precipitation and antecedent runoff as dominant drivers, reinforcing hydrological understanding. These findings highlight attention-based architectures as accurate and interpretable tools for operational flood forecasting and climate-resilient water management.
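For reference, the two evaluation metrics cited in the abstract (RMSE and R²) can be computed as in the minimal sketch below. The arrays of monthly runoff values are hypothetical placeholders for illustration only and are not the study's data or evaluation code.

import numpy as np

def rmse(observed, predicted):
    # Root-mean-square error between observed and simulated runoff
    return float(np.sqrt(np.mean((observed - predicted) ** 2)))

def r_squared(observed, predicted):
    # Coefficient of determination relative to the mean of the observations
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - np.mean(observed)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical monthly runoff series (observed vs. one model's predictions)
obs = np.array([120.0, 95.0, 60.0, 45.0, 80.0, 310.0, 520.0, 480.0, 260.0, 140.0, 100.0, 85.0])
sim = np.array([118.0, 99.0, 64.0, 50.0, 75.0, 290.0, 495.0, 470.0, 255.0, 150.0, 105.0, 88.0])

print(f"RMSE = {rmse(obs, sim):.1f}, R² = {r_squared(obs, sim):.2f}")

The same two functions would be applied to each benchmarked model (RF, LSTM, TFT) on a common held-out period to make the reported scores directly comparable.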

DOI

https://doi.org/10.31223/X55X7X

Subjects

Engineering, Life Sciences

Keywords

hydrology, machine learning, Rain Runoff, SWAT, Transformers

Dates

Published: 2025-11-20 11:34

Last Updated: 2025-11-20 11:34

License

No Creative Commons license