Naghashi, Vahid; Boukadoum, Mounir; Diallo, Abdoulaye Banire
A multiscale model for multivariate time series forecasting (Journal Article)
In: Scientific Reports, vol. 15, no. 1, pp. 1565, 2025, ISSN: 2045-2322.
Tags: Data mining, Machine learning
@article{naghashi_multiscale_2025,
title = {A multiscale model for multivariate time series forecasting},
author = {Vahid Naghashi and Mounir Boukadoum and Abdoulaye Banire Diallo},
url = {https://www.nature.com/articles/s41598-024-82417-4},
doi = {10.1038/s41598-024-82417-4},
issn = {2045-2322},
year = {2025},
date = {2025-01-01},
urldate = {2025-03-06},
journal = {Scientific Reports},
volume = {15},
number = {1},
pages = {1565},
abstract = {Transformer-based models for time-series forecasting have shown promising performance, and over the past few years different Transformer variants have been proposed in the time-series forecasting domain. However, most existing methods represent the time series at a single scale, making it challenging to capture various time granularities, or they ignore inter-series correlations, which may lead to inaccurate forecasts. In this paper, we address these shortcomings and propose a Transformer-based model that integrates multi-scale patch-wise temporal modeling with channel-wise representation. In the multi-scale temporal part, the input time series is divided into patches of different resolutions to capture temporal correlations at various scales. The channel-wise encoder, which follows the temporal encoder, models the relations among the input series to capture the intricate interactions between them. We further design a multi-step linear decoder to generate the final predictions while reducing over-fitting and noise effects. Extensive experiments on seven real-world datasets indicate that our model (MultiPatchFormer) achieves state-of-the-art results, surpassing current baseline models on error metrics, and shows stronger generalizability.},
keywords = {Data mining, Machine learning},
pubstate = {published},
tppubtype = {article}
}
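The core mechanism the abstract describes is multi-scale patching: slicing each input channel into patches at several resolutions before temporal attention is applied. The paper's implementation is not reproduced here; the PyTorch sketch below is only a minimal illustration of that patching step, and the patch sizes, embedding width, and module name are assumptions chosen for the example, not taken from the paper.

import torch
import torch.nn as nn

class MultiScalePatchEmbedding(nn.Module):
    # Illustrative sketch, not the authors' code: splits each channel of a
    # multivariate series into non-overlapping patches at several resolutions
    # and projects every patch to a shared embedding dimension, using one
    # linear projection per scale. All sizes below are assumed defaults.
    def __init__(self, patch_sizes=(8, 16, 32), d_model=64):
        super().__init__()
        self.patch_sizes = patch_sizes
        self.projections = nn.ModuleList(
            nn.Linear(p, d_model) for p in patch_sizes
        )

    def forward(self, x):
        # x: (batch, channels, seq_len); patching is channel-independent.
        tokens_per_scale = []
        for p, proj in zip(self.patch_sizes, self.projections):
            patches = x.unfold(-1, p, p)             # (batch, channels, n_patches, p)
            tokens_per_scale.append(proj(patches))   # (batch, channels, n_patches, d_model)
        return tokens_per_scale

# Example: a batch of 2 series with 7 channels and a lookback of 96 steps.
emb = MultiScalePatchEmbedding(patch_sizes=(8, 16, 32), d_model=64)
scales = emb(torch.randn(2, 7, 96))
print([t.shape for t in scales])
# -> shapes (2, 7, 12, 64), (2, 7, 6, 64), (2, 7, 3, 64)

In a full model along the lines the abstract sketches, each scale's token sequence would feed a temporal encoder, followed by a channel-wise encoder over the series dimension and a linear decoder producing the multi-step forecast.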