TLOB: A Novel Transformer Model with Dual Attention for Price Trend Prediction with Limit Order Book Data

ArXiv ID: 2502.15757 (https://arxiv.org/abs/2502.15757)

Authors: Unknown

Abstract

Price Trend Prediction (PTP) based on Limit Order Book (LOB) data is a fundamental challenge in financial markets. Despite advances in deep learning, existing models fail to generalize across different market conditions and assets. Surprisingly, by adapting a simple MLP-based architecture to LOB data, we surpass SoTA performance, challenging the necessity of complex architectures. Unlike past work, which exhibits robustness issues, we propose TLOB, a transformer-based model that uses a dual attention mechanism to capture spatial and temporal dependencies in LOB data. This allows it to focus adaptively on the market microstructure, making it particularly effective for longer-horizon predictions and volatile market conditions. We also introduce a new labeling method that improves on previous ones by removing the horizon bias. We evaluate TLOB’s effectiveness across four horizons on the established FI-2010 benchmark, a NASDAQ dataset, and a Bitcoin dataset; TLOB outperforms SoTA methods on every dataset and horizon. Additionally, we empirically show that stock price predictability has declined over time (-6.68 in F1-score), highlighting growing market efficiency. Because predictability must be considered in relation to transaction costs, we also experimented with defining trends using the average spread, which reflects the primary transaction cost. The resulting performance deterioration underscores the difficulty of translating trend classification into profitable trading strategies. We argue that our work provides new insights into the evolving landscape of stock price trend prediction and sets a strong foundation for future advancements in financial AI. We release the code at https://github.com/LeonardoBerti00/TLOB.
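The dual attention mechanism described in the abstract alternates attention over two axes of a LOB window: temporal (across consecutive snapshots) and spatial (across price levels/features). Below is a minimal PyTorch sketch of one such block; the class name `DualAttentionBlock`, the layer sizes, and the residual layout are illustrative assumptions, not the authors' exact architecture (see the released repository for that).

```python
# Minimal sketch of a dual-attention block in the spirit of TLOB (assumed
# layout, not the paper's exact implementation).
import torch
import torch.nn as nn

class DualAttentionBlock(nn.Module):
    """Self-attention along the temporal axis (across LOB snapshots),
    then along the spatial axis (across LOB features/levels)."""

    def __init__(self, num_features: int, seq_len: int, num_heads: int = 4):
        super().__init__()
        # Temporal attention: tokens are time steps, embedding dim = LOB features.
        self.temporal_attn = nn.MultiheadAttention(num_features, num_heads, batch_first=True)
        self.norm_t = nn.LayerNorm(num_features)
        # Spatial attention: tokens are LOB features, embedding dim = window length.
        self.spatial_attn = nn.MultiheadAttention(seq_len, num_heads, batch_first=True)
        self.norm_s = nn.LayerNorm(seq_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, num_features) -- a window of LOB snapshots.
        t, _ = self.temporal_attn(x, x, x)
        x = self.norm_t(x + t)                 # residual + norm over time steps
        xs = x.transpose(1, 2)                 # (batch, num_features, seq_len)
        s, _ = self.spatial_attn(xs, xs, xs)
        xs = self.norm_s(xs + s)               # residual + norm over features
        return xs.transpose(1, 2)              # back to (batch, seq_len, num_features)

# Example: a standard 10-level LOB (bid/ask price and size per level = 40
# features) over a 128-snapshot window. Dimensions are illustrative.
block = DualAttentionBlock(num_features=40, seq_len=128)
out = block(torch.randn(8, 128, 40))
```

Attending over both axes is what lets the model weight individual order-book levels (spatial) as well as recent history (temporal), rather than flattening the book into a single feature vector.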

Keywords: Price Trend Prediction (PTP), Limit Order Book (LOB), transformer architecture, dual attention mechanism, stocks
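The abstract also highlights a new labeling scheme that removes horizon bias, and an experiment that sets the trend threshold from the average spread. The routine below is a generic smoothed-labeling sketch of those ideas, assuming NumPy; `trend_labels`, its parameters, and the windowing are hypothetical and not the paper's exact definition.

```python
# Hedged sketch of horizon-based trend labeling; the paper's actual
# bias-free scheme may smooth or anchor the comparison differently.
import numpy as np

def trend_labels(mid: np.ndarray, horizon: int, smooth: int, theta: float) -> np.ndarray:
    """Label each step as up (+1), stationary (0), or down (-1) by comparing
    a smoothed future mid-price to the current mid-price.

    mid:     1-D array of mid-prices
    horizon: prediction horizon h (in events/snapshots)
    smooth:  size w of the averaging window starting at t + h
    theta:   relative-change threshold; can be set from the average spread
             to reflect the primary transaction cost, as the abstract suggests
    """
    n = len(mid)
    labels = np.zeros(n, dtype=int)
    for t in range(n - horizon - smooth):
        future_avg = mid[t + horizon : t + horizon + smooth].mean()
        r = (future_avg - mid[t]) / mid[t]   # smoothed relative change
        if r > theta:
            labels[t] = 1
        elif r < -theta:
            labels[t] = -1
    # Trailing steps without a full future window stay labeled 0.
    return labels
```

Setting `theta` to roughly the average relative spread means a move only counts as a trend when it would clear the bid-ask cost, which is the spread-based definition whose weaker results the abstract reports.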

Complexity vs Empirical Score

  • Math Complexity: 6.0/10
  • Empirical Rigor: 7.5/10
  • Quadrant: Holy Grail
  • Why: The paper introduces a novel transformer architecture with dual attention mechanisms, indicating moderate-to-high mathematical sophistication. It demonstrates strong empirical rigor by evaluating on multiple real-world datasets (FI-2010, NASDAQ, Bitcoin), providing specific performance metrics (F1-score improvements), and conducting ablation studies, with code released for reproducibility.

```mermaid
flowchart TD
    A["Research Goal: Price Trend Prediction<br>from Limit Order Book Data"] --> B["Methodology: TLOB Model<br>Dual Attention Mechanism"]
    B --> C["Input Data: FI-2010, NASDAQ, Bitcoin<br>LOB Data with New Labeling"]
    C --> D["Computational Process:<br>Spatial & Temporal Attention"]
    D --> E["Key Findings: SoTA Performance<br>Across All Horizons & Datasets"]
    E --> F["Additional Insight:<br>Declining Predictability<br>vs Transaction Costs"]
```