Transformers architectures for time series forecasting

Policarpi, Andrea (2022) Transformers architectures for time series forecasting. [Laurea magistrale], Università di Bologna, Corso di Studio in Artificial intelligence [LM-DM270]
Full-text PDF available under license: Creative Commons Attribution - NonCommercial - NoDerivatives 4.0 (CC BY-NC-ND 4.0).

Abstract

Time series forecasting is an important task with countless applications, ranging from anomaly detection to healthcare problems. Predicting the future values of a given time series is a non-trivial operation whose complexity depends heavily on the quantity and quality of the available data. Historically, the problem has been addressed with statistical models and simple deep learning architectures such as CNNs and RNNs; recently, many Transformer-based models have also been used, with excellent results. This thesis evaluates the performance of two Transformer-based models, namely a TransformerT2V and an Informer, on time series forecasting problems, and compares them with non-Transformer architectures. A second contribution is the exploration of the Informer's ProbSparse attention mechanism and the suggestion of improvements to increase the model's performance.
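
The following is a minimal NumPy sketch of the query-selection idea behind the ProbSparse mechanism mentioned above, not the thesis's own implementation: the Informer scores each query by how far its attention distribution is from uniform (the maximum minus the mean of its scaled dot products with the keys) and keeps only the top u = c * ln(L_Q) queries. The function names and the constant c are illustrative assumptions, and the real Informer estimates the score on a random subset of keys instead of the full score matrix.

import numpy as np

def probsparse_query_scores(Q, K):
    # Sparsity measurement M(q_i, K) = max_j(q_i.k_j / sqrt(d)) - mean_j(q_i.k_j / sqrt(d)).
    # High-scoring queries have attention far from uniform and are kept;
    # the remaining queries are approximated by a mean of the values.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # (L_Q, L_K) scaled dot products
    return scores.max(axis=-1) - scores.mean(axis=-1)

def select_top_queries(Q, K, c=5):
    # Keep only the u = c * ln(L_Q) most informative queries (c is a hypothetical sampling factor).
    L_Q = Q.shape[0]
    u = min(L_Q, max(1, int(c * np.log(L_Q))))
    M = probsparse_query_scores(Q, K)
    return np.argsort(M)[-u:]               # indices of the selected queries

# Toy usage with random queries and keys of length 96 and dimension 64.
rng = np.random.default_rng(0)
Q = rng.standard_normal((96, 64))
K = rng.standard_normal((96, 64))
print(select_top_queries(Q, K))

Note that this sketch computes the full L_Q x L_K score matrix for clarity; the point of ProbSparse attention in the Informer is precisely to avoid that quadratic cost by sampling keys and attending only with the selected queries.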

Document type: Thesis (Laurea magistrale / Master's degree)
Thesis author: Policarpi, Andrea
Degree programme: Artificial Intelligence [LM-DM270]
Degree programme regulation: DM270
Keywords: time series forecasting, time series, forecasting, deep learning, transformer, informer, cnn, lstm, attention, probsparse
Date of thesis defence: 4 February 2022