Introduction
Time series forecasting is an important task with applications in many contexts, such as predictive maintenance, medicine, and finance. A time series forecasting model aims to predict future values of a target variable given the information available at a certain time step [1]. Formally, a time series can be modeled as a realization of a stochastic process. The available information can include, for example, measurements from weather stations for climate modeling or sensor data from mechanical processes. Deep learning models have been used to tackle the time series forecasting problem, as they are able to learn complex data representations. More recently, Transformer-based architectures have been proposed as a solution to time series forecasting. Transformers are networks based on attention mechanisms and represent the state of the art in sequence-modeling tasks such as machine translation and language understanding [2]. Variants of these architectures have been proposed as models for time series forecasting [3, 4].
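As a concrete illustration of the problem formulation above, the sketch below turns a univariate series into (past window, future target) pairs for supervised forecasting. The helper name `make_windows` and the synthetic sine series are illustrative choices, not part of any cited work.

```python
import numpy as np

def make_windows(series, history, horizon=1):
    """Split a univariate series into (input window, future target) pairs:
    each input is the information available at time t, each target the
    future values to be predicted (illustrative helper)."""
    inputs, targets = [], []
    for t in range(history, len(series) - horizon + 1):
        inputs.append(series[t - history:t])   # past observations up to time t
        targets.append(series[t:t + horizon])  # future values to forecast
    return np.array(inputs), np.array(targets)

# Example: a smooth sine wave standing in for sensor measurements
series = np.sin(np.linspace(0, 8 * np.pi, 200))
X, y = make_windows(series, history=24, horizon=1)
print(X.shape, y.shape)  # (176, 24) (176, 1)
```

A forecasting model is then trained to map each row of `X` to the corresponding row of `y`.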
The goal of this thesis is to study recent advances in time series forecasting, with particular emphasis on Transformer architectures, and to apply these models to a real-world problem.

Planned Activities
- Initial research on deep learning methods for time series forecasting;
- Research on Transformers and their application to time series forecasting;
- Implementation of state-of-the-art time series forecasting methods;
- Application of these models to a real problem.
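To give a feel for the implementation work involved, the following is a minimal, hypothetical sketch of an encoder-only Transformer forecaster in PyTorch; it is not the architecture of any specific cited paper, and positional encodings are omitted for brevity.

```python
import torch
import torch.nn as nn

class TransformerForecaster(nn.Module):
    """Minimal encoder-only Transformer mapping a window of past values
    to a one-step-ahead prediction (illustrative sketch; positional
    encodings omitted for brevity)."""
    def __init__(self, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)  # scalar value -> d_model features
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)   # read out the forecast

    def forward(self, x):                   # x: (batch, time, 1)
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1, :])       # predict from the last position

model = TransformerForecaster()
window = torch.randn(8, 24, 1)              # batch of 8 histories, 24 steps each
print(model(window).shape)                  # torch.Size([8, 1])
```

Training such a model amounts to minimizing a regression loss (e.g. MSE) between its output and the future target values.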
Who we’re looking for
Students who are about to obtain their Master's degree in computer science, computer engineering, mechatronic engineering, mathematical engineering, mathematics, physics, or informatics.
Required skills:
- Proficiency in at least one programming language (Python, Lua, MATLAB, C++, Java), with Python preferred;
- Basic knowledge of machine learning and deep learning algorithms.
Duration of this project: 6-8 months
Check these links before moving on
- Time Series Forecasting With Deep Learning: A Survey (https://arxiv.org/pdf/2004.13408v1.pdf)
- Attention is all you need (https://arxiv.org/pdf/1706.03762.pdf)
- Deep Transformer Models for Time Series Forecasting: The Influenza Prevalence Case (https://arxiv.org/pdf/2001.08317.pdf)
- Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting (https://arxiv.org/pdf/1907.00235.pdf)
Contact Us
Directly by email to: [email protected]
By LinkedIn: linkedin.com/in/cannavò-sonia-66a95467