Quantum Adaptive Self-Attention for Financial Rebalancing: An Empirical Study on Automated Market Makers in Decentralized Finance

arXiv ID: 2509.16955

Authors: Chi-Sheng Chen, Aidan Hung-Wen Tsai

Abstract

We formulate automated market maker (AMM) rebalancing as a binary detection problem and study a hybrid quantum–classical self-attention block, Quantum Adaptive Self-Attention (QASA). QASA constructs quantum queries/keys/values via variational quantum circuits (VQCs) and applies standard softmax attention over Pauli-Z expectation vectors, yielding a drop-in attention module for financial time-series decision making. Using daily data for BTCUSDC over Jan 2024–Jan 2025 with a 70/15/15 time-series split, we compare QASA against classical ensembles, a transformer, and pure quantum baselines under Return, Sharpe, and Max Drawdown. The QASA-Sequence variant attains the best single-model risk-adjusted performance (13.99% return; Sharpe 1.76), while hybrid models average 11.2% return (vs. 9.8% classical; 4.4% pure quantum), indicating a favorable performance–stability–cost trade-off.
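To make the attention mechanism concrete, here is a minimal NumPy sketch of softmax attention over Pauli-Z expectation vectors. The "circuit" is a toy one-qubit-per-feature angle encoding (RY(x) followed by a trainable RY(θ), whose Z expectation on |0⟩ is cos(x + θ)), not the paper's actual VQC ansatz; all function names and the encoding scheme are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def vqc_expectations(features, thetas):
    """Toy per-qubit 'VQC': angle-encode each feature with RY(f), apply a
    trainable RY(theta), and measure Pauli-Z. Since RY(f)RY(theta)|0> has
    <Z> = cos(f + theta), the output is one expectation value per qubit."""
    return np.cos(features + thetas)

def qasa_attention(X, theta_q, theta_k, theta_v):
    """X: (T, d) window of features; theta_*: (d,) variational angles.
    Queries/keys/values are quantum expectation vectors; the attention
    itself is the standard classical scaled-dot-product softmax."""
    Q = np.stack([vqc_expectations(x, theta_q) for x in X])  # (T, d)
    K = np.stack([vqc_expectations(x, theta_k) for x in X])  # (T, d)
    V = np.stack([vqc_expectations(x, theta_v) for x in X])  # (T, d)
    scores = softmax(Q @ K.T / np.sqrt(X.shape[1]))          # (T, T)
    return scores @ V                                        # attended features
```

Because each output row is a convex combination of expectation vectors, all attended values stay inside [-1, 1], which is one reason this composes cleanly as a drop-in module.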

Keywords: Quantum Adaptive Self-Attention (QASA), Variational Quantum Circuits (VQCs), Automated Market Maker (AMM) Rebalancing, Pauli-Z Expectation Vectors, Attention Mechanisms, Cryptocurrency

Complexity vs Empirical Score

  • Math Complexity: 8.5/10
  • Empirical Rigor: 9.0/10
  • Quadrant: Holy Grail
  • Why: The paper employs advanced mathematical formalisms including variational quantum circuits, Pauli-Z expectation measurements, and extensive feature engineering with specific encodings, indicating high mathematical density. It also demonstrates high empirical rigor with a defined backtest on real BTC/USDC data (Jan 2024–Jan 2025), a time-series split, and quantitative performance metrics (return, Sharpe, drawdown).
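The variational-circuit training mentioned above is typically done with the parameter-shift rule, which gives exact gradients of circuit expectations from two shifted evaluations. Below is a hedged single-parameter sketch using the same toy cos(x + θ) expectation as a stand-in for the paper's circuit; the helper names and the squared-error objective are assumptions for illustration.

```python
import numpy as np

def expectation(theta, x):
    """<Z> on |0> after RY(x) then a trainable RY(theta): cos(x + theta)."""
    return np.cos(x + theta)

def parameter_shift_grad(theta, x, shift=np.pi / 2):
    """Exact d<Z>/d theta via the parameter-shift rule: evaluate the circuit
    at theta +/- pi/2 and halve the difference. For cos(x + theta) this
    recovers -sin(x + theta) exactly."""
    return (expectation(theta + shift, x) - expectation(theta - shift, x)) / 2.0

def train_step(theta, x, target, lr=0.1):
    """One gradient-descent step on the squared error (pred - target)^2."""
    err = expectation(theta, x) - target
    return theta - lr * 2.0 * err * parameter_shift_grad(theta, x)
```

The same two-evaluation recipe extends parameter-by-parameter to multi-qubit circuits, which is why it is the standard hardware-compatible way to optimize VQCs.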
```mermaid
flowchart TD
    A["Research Goal<br>Formulate AMM Rebalancing as Binary Detection<br>Study Quantum Adaptive Self-Attention (QASA)"] --> B
    B --> C
    C --> D
    D --> E

    subgraph B ["Data & Baselines"]
        B1["Asset: BTCUSDC<br>Period: Jan 2024 - Jan 2025"]
        B2["Split: 70/15/15 Time-Series"]
        B3["Baselines: Classical Ensembles, Transformer, Pure Quantum"]
    end

    subgraph C ["QASA Methodology"]
        C1["VQC Construction<br>Quantum Queries/Keys/Values"]
        C2["Pauli-Z Expectation Vectors"]
        C3["Softmax Attention Mechanism"]
    end

    subgraph D ["Computation: Hybrid Quantum-Classical Execution"]
        D1["Training: Variational Optimization"]
        D2["Inference: Attention Scoring & Prediction"]
    end

    subgraph E ["Key Findings & Outcomes"]
        E1["Best Single Model: QASA-Sequence<br>Return: 13.99%, Sharpe: 1.76"]
        E2["Hybrid Avg Return: 11.2%<br>vs Classical: 9.8%, Pure Quantum: 4.4%"]
        E3["Trade-off: Favorable Performance-Stability-Cost"]
    end
```