Transformer Based Time-Series Forecasting for Stock
ArXiv ID: 2502.09625
Authors: Unknown
Abstract
To the naked eye, stock prices appear chaotic, dynamic, and unpredictable. Indeed, forecasting them is one of the most difficult tasks that hundreds of millions of retail and professional traders around the world attempt every second, even before the market opens. With recent advances in machine learning and the volume of data that markets have generated over the years, applying machine learning techniques such as deep neural networks is unavoidable. In this work, we model the task as a multivariate forecasting problem rather than a naive autoregression problem. The multivariate analysis is performed via the attention mechanism, using a modified version of the Transformer that we created, called “Stockformer”.
Keywords: Transformer Architecture, Attention Mechanism, Multivariate Forecasting, Deep Learning, Stock Price Prediction, Equities
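The excerpt gives no implementation details for Stockformer, but the core mechanism it names, attention over a multivariate time-series window, can be illustrated. The following is a minimal NumPy sketch of scaled dot-product self-attention applied to a window of per-step feature vectors (e.g., OHLCV plus external factors); the shapes and weight matrices are illustrative assumptions, not the authors' architecture:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a (T, d) time-series window."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # (T, T) pairwise affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # (T, d) context vectors

rng = np.random.default_rng(0)
T, d = 16, 8  # 16 time steps, 8 features per step (hypothetical)
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (16, 8)
```

Each output row is a weighted mixture of all time steps' value vectors, which is what lets a Transformer relate any two points in the window regardless of their distance.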
Complexity vs Empirical Score
- Math Complexity: 7.0/10
- Empirical Rigor: 4.0/10
- Quadrant: Lab Rats
- Why: The paper uses an advanced deep learning architecture (Transformer/attention) and multivariate time-series modeling, indicating high math complexity. However, the excerpt focuses on problem formulation, data sourcing, and conceptual design without showing actual implementation code, backtest results, or statistical metrics, placing it in the theoretical ‘Lab Rats’ quadrant.
```mermaid
flowchart TD
    A["Research Goal<br>Forecast Stock Prices<br>Using Multivariate Analysis"] --> B["Data Collection<br>Market Data & External Factors"]
    B --> C["Data Preprocessing<br>Normalization & Feature Engineering"]
    C --> D["Model Architecture<br>Custom 'Stockformer' Transformer"]
    D --> E["Attention Mechanism<br>Multivariate Time-Series Processing"]
    E --> F["Training & Validation<br>Deep Learning Neural Networks"]
    F --> G["Key Findings<br>Transformer models effectively<br>capture chaotic market patterns"]
```
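The preprocessing step in the flowchart (normalization and feature engineering ahead of supervised training) can be sketched as follows. The lookback/horizon lengths, the per-feature z-scoring, and the choice of the first column as the forecast target are illustrative assumptions, not settings reported in the paper:

```python
import numpy as np

def make_windows(series, lookback, horizon):
    """Slice a (N, d) multivariate series into (lookback -> horizon) training pairs."""
    X, y = [], []
    for t in range(len(series) - lookback - horizon + 1):
        X.append(series[t : t + lookback])                           # past window, all features
        y.append(series[t + lookback : t + lookback + horizon, 0])   # future values of target feature
    return np.stack(X), np.stack(y)

# Synthetic random-walk "prices" with 5 features, standing in for real market data
prices = np.cumsum(np.random.default_rng(1).standard_normal((200, 5)), axis=0)
# Per-feature z-score normalization (fit on the whole series here for brevity;
# in practice the statistics should come from the training split only)
prices = (prices - prices.mean(axis=0)) / prices.std(axis=0)
X, y = make_windows(prices, lookback=30, horizon=5)
print(X.shape, y.shape)  # (166, 30, 5) (166, 5)
```

Each `(lookback, d)` window would then be fed to the attention-based model, with the `horizon`-length target vector as the regression label.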