Boosting the Accuracy of Stock Market Prediction via Multi-Layer Hybrid MTL Structure
ArXiv ID: 2501.09760
Authors: Unknown
Abstract
Accurate stock market prediction provides great opportunities for informed decision-making, yet existing methods struggle with financial data's non-linear, high-dimensional, and volatile characteristics. Advanced predictive models are needed to address these complexities effectively. This paper proposes a novel multi-layer hybrid multi-task learning (MTL) framework aimed at more efficient stock market prediction. It combines a Transformer encoder to extract complex correspondences between input features, a Bidirectional Gated Recurrent Unit (BiGRU) to capture long-term temporal relationships, and a Kolmogorov-Arnold Network (KAN) to enhance the learning process. Experimental evaluations indicate that the proposed structure achieves strong performance compared with competitive networks, with an MAE as low as 1.078, a MAPE as low as 0.012, and an R^2 as high as 0.98.
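For reference, the three metrics quoted in the abstract can be computed as below. This is a minimal sketch in plain Python; the function names are illustrative and not taken from the paper's code.

```python
# Evaluation metrics reported in the abstract: MAE, MAPE (as a fraction,
# e.g. 0.012 = 1.2%), and the coefficient of determination R^2.

def mae(y_true, y_pred):
    """Mean Absolute Error: average of |y - y_hat|."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, as a fraction of the true value."""
    return sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """R^2 = 1 - SS_res / SS_tot."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Tiny illustrative check on made-up values (not the paper's data):
print(mae([100, 110, 120], [101, 109, 121]))  # 1.0
```

Note that a MAPE of 0.012 corresponds to an average error of about 1.2% of the true price level.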
Keywords: multi-task learning, transformer encoder, Kolmogorov-Arnold Networks (KAN), time-series forecasting, deep learning, equities
Complexity vs Empirical Score
- Math Complexity: 8.5/10
- Empirical Rigor: 2.0/10
- Quadrant: Lab Rats
- Why: The paper employs advanced math including Transformer encoders, BiGRU, and Kolmogorov-Arnold Networks with complex formulations, but lacks concrete backtesting details, code, or robust data implementation, focusing instead on reported metrics without addressing real-world deployment challenges.
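The first stage named above, the Transformer encoder, rests on scaled dot-product self-attention over the input time steps. A minimal NumPy sketch follows; all shapes, weight names, and the single-head setup are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Single-head scaled dot-product self-attention over a sequence of
# feature vectors, the core operation of a Transformer encoder layer.

def self_attention(x, wq, wk, wv):
    """x: (seq_len, d_model); wq, wk, wv: (d_model, d_k)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (seq_len, seq_len)
    # Row-wise softmax (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                               # (seq_len, d_k)

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 8))   # 16 time steps, 8 input features (toy sizes)
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)  # (16, 8)
```

In the hybrid pipeline, the attended features would then feed the BiGRU stage for temporal modeling.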
```mermaid
flowchart TD
A["Research Goal: Accurate Stock Market Prediction"] --> B["Input Data: Financial Time-Series"]
B --> C["Method: Multi-Layer Hybrid MTL Structure"]
C --> D["Transformer Encoder: Extract Feature Correlations"]
D --> E["BiGRU: Capture Temporal Dependencies"]
E --> F["KAN: Enhance Non-linear Learning"]
F --> G{"Outcomes & Metrics"}
G --> H["MAE: 1.078"]
G --> I["MAPE: 0.012"]
G --> J["R^2: 0.98"]
```
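The final KAN stage of the pipeline follows the Kolmogorov-Arnold idea of learning a univariate function on each input-output edge and summing the results. The toy layer below sketches this shape in NumPy, using a small polynomial basis in place of the B-splines typically used in KAN implementations; all class and parameter names are illustrative, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_basis(x, degree=3):
    """Stack powers x^1..x^degree along a new last axis."""
    return np.stack([x ** k for k in range(1, degree + 1)], axis=-1)

class ToyKANLayer:
    """KAN-style layer: one learnable univariate function per edge."""

    def __init__(self, in_dim, out_dim, degree=3):
        # One coefficient vector per (input, output) edge.
        self.coef = rng.normal(scale=0.1, size=(in_dim, out_dim, degree))
        self.degree = degree

    def forward(self, x):
        # x: (batch, in_dim)
        basis = poly_basis(x, self.degree)         # (batch, in_dim, degree)
        # phi[b, i, o] = sum_k basis[b, i, k] * coef[i, o, k]
        phi = np.einsum("bik,iok->bio", basis, self.coef)
        # Sum the per-edge univariate functions over the inputs.
        return phi.sum(axis=1)                     # (batch, out_dim)

layer = ToyKANLayer(in_dim=4, out_dim=2)
out = layer.forward(rng.normal(size=(8, 4)))
print(out.shape)  # (8, 2)
```

Real KAN layers make the univariate functions richer (splines with trainable knots) and stack several such layers; this sketch only shows the sum-of-univariate-functions structure that gives the final stage its non-linear expressiveness.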