iTransformer: Rethinking Transformer Architecture for Enhanced Time Series Forecasting


The Transformer model, highly successful in natural language processing and computer vision, is now making inroads into time series forecasting. However, researchers have recently questioned its effectiveness in this domain, with simpler linear models sometimes matching or beating Transformer-based predictors. Researchers from Tsinghua University propose an improved model, the iTransformer, which inverts the conventional tokenization: instead of embedding all variates observed at one time step as a token, it embeds each variate's entire series as a token, so that self-attention captures multivariate correlations. The iTransformer achieves strong results on real-world forecasting benchmarks, potentially paving the way for future advances in Transformer-based predictors.
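To make the "inverted" tokenization concrete, here is a minimal NumPy sketch contrasting the two embedding directions. The shapes, weight matrices, and the single-head attention helper are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: T time steps, N variates, d_model embedding width.
T, N, d_model = 96, 7, 16
series = rng.standard_normal((T, N))   # one multivariate series

# Conventional Transformer tokenization: one token per TIME STEP,
# embedding the N variate values observed at that step.
W_temporal = rng.standard_normal((N, d_model))
temporal_tokens = series @ W_temporal            # (T, d_model): T tokens

# iTransformer's inverted tokenization: one token per VARIATE,
# embedding that variate's entire series of T points.
W_inverted = rng.standard_normal((T, d_model))
variate_tokens = series.T @ W_inverted           # (N, d_model): N tokens

def self_attention(x):
    """Single-head scaled dot-product self-attention (sketch:
    identity Q/K/V projections for brevity)."""
    scores = x @ x.T / np.sqrt(x.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

# Attention over variate tokens mixes information ACROSS variates,
# which is how the inverted design captures multivariate correlations.
out = self_attention(variate_tokens)             # (N, d_model)
```

The key point is the transpose: attention operates over N variate tokens rather than T temporal tokens, so each attention weight relates one series to another.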

Read more at MarkTechPost…