TradExpert: Revolutionizing Trading with Mixture of Expert LLMs

ArXiv ID: 2411.00782

Authors: Unknown

Abstract

The integration of Artificial Intelligence (AI) into the financial domain has opened new avenues for quantitative trading, particularly through the use of Large Language Models (LLMs). However, effectively synthesizing insights from diverse data sources and integrating both structured and unstructured data remains a challenge. This paper presents TradExpert, a novel framework that employs a mixture-of-experts (MoE) approach: four specialized LLMs each analyze a distinct source of financial data, including news articles, market data, alpha factors, and fundamental data. The insights of these expert LLMs are then synthesized by a General Expert LLM to make a final prediction or decision. With specific prompts, TradExpert can be switched between a prediction mode and a ranking mode, for stock movement prediction and quantitative stock trading, respectively. In addition to existing benchmarks, we also release a large-scale financial dataset to comprehensively evaluate TradExpert's effectiveness. Our experimental results demonstrate TradExpert's superior performance across all trading scenarios.
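The architecture described above can be sketched in miniature: four specialist experts each score one data source, and a general expert merges their outputs, with a mode switch between prediction and ranking. Everything here is an illustrative assumption (the function names, the toy heuristics, and the simple averaging rule); the paper's actual experts are fine-tuned LLMs driven by prompts, not hand-written scoring functions.

```python
# Minimal sketch of a TradExpert-style MoE pipeline. All heuristics
# below are hypothetical stand-ins for the paper's expert LLMs.

def news_expert(news: str) -> float:
    # Placeholder sentiment score in [-1, 1]; a real system would
    # prompt an LLM specialized on financial news.
    positive = ("beat", "surge", "upgrade")
    negative = ("miss", "plunge", "downgrade")
    text = news.lower()
    score = sum(w in text for w in positive) - sum(w in text for w in negative)
    return max(-1.0, min(1.0, float(score)))

def market_expert(returns: list[float]) -> float:
    # Momentum proxy: sign of recent cumulative return.
    total = sum(returns)
    return 1.0 if total > 0 else -1.0 if total < 0 else 0.0

def alpha_expert(factors: dict[str, float]) -> float:
    # Average of precomputed alpha-factor signals.
    return sum(factors.values()) / len(factors) if factors else 0.0

def fundamental_expert(pe_ratio: float, sector_pe: float) -> float:
    # Cheap relative to sector -> bullish; expensive -> bearish.
    return 1.0 if pe_ratio < sector_pe else -1.0

def general_expert(signals: list[float], mode: str = "prediction"):
    # Synthesize expert views. "prediction" yields an up/down call;
    # "ranking" yields a score usable to rank stocks cross-sectionally.
    score = sum(signals) / len(signals)
    if mode == "prediction":
        return "up" if score > 0 else "down"
    return score

signals = [
    news_expert("Company beats estimates, shares surge"),
    market_expert([0.01, -0.002, 0.015]),
    alpha_expert({"value": 0.2, "momentum": 0.6}),
    fundamental_expert(pe_ratio=12.0, sector_pe=18.0),
]
print(general_expert(signals, mode="prediction"))  # -> up
```

The key design point carried over from the paper is the separation of concerns: each expert only sees its own data modality, and only the general expert combines them.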

Keywords: Large Language Models (LLMs), Mixture of Experts (MoE), Quantitative Trading

Complexity vs Empirical Score

  • Math Complexity: 3.5/10
  • Empirical Rigor: 7.5/10
  • Quadrant: Street Traders
  • Why: The paper primarily focuses on engineering a complex LLM-based framework with prompting and fine-tuning, which is architecturally heavy but mathematically lightweight; however, it demonstrates high empirical rigor by releasing a large-scale dataset, reporting specific backtesting metrics (Annualized Return, Sharpe Ratio, etc.), and conducting ablation studies.
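The backtesting metrics named above (Annualized Return, Sharpe Ratio) have standard definitions; a minimal sketch of those standard formulas follows, assuming 252 trading days per year. These are the generic textbook definitions, not necessarily the paper's exact backtest conventions.

```python
import math

def annualized_return(daily_returns: list[float], periods: int = 252) -> float:
    # Geometric compounding of daily returns, annualized over
    # ~252 trading days per year.
    growth = 1.0
    for r in daily_returns:
        growth *= 1.0 + r
    return growth ** (periods / len(daily_returns)) - 1.0

def sharpe_ratio(daily_returns: list[float],
                 risk_free: float = 0.0,
                 periods: int = 252) -> float:
    # Annualized Sharpe: mean daily excess return over sample
    # standard deviation, scaled by sqrt(252).
    n = len(daily_returns)
    mean = sum(daily_returns) / n
    var = sum((r - mean) ** 2 for r in daily_returns) / (n - 1)
    return (mean - risk_free) / math.sqrt(var) * math.sqrt(periods)

rets = [0.001, -0.002, 0.0015, 0.0005, 0.002]
print(round(annualized_return(rets), 4))
print(round(sharpe_ratio(rets), 2))
```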
```mermaid
flowchart TD
  Goal["Research Goal: Improve quantitative trading<br>using LLMs on structured/unstructured data"] --> Data["Data Inputs:<br>News, Market Data, Alpha Factors, Fundamentals"]
  Data --> Strategy["Methodology: MoE Framework<br>4 Expert LLMs + 1 General Expert"]
  Strategy --> Mode["Switch Modes via Prompting:<br>Prediction Mode or Ranking Mode"]
  Mode --> Process["Computational Process:<br>Experts analyze specific data, General Expert synthesizes"]
  Process --> Outcome["Key Findings: TradExpert shows<br>superior performance across trading scenarios"]
```