
Optimal Signal Extraction from Order Flow: A Matched Filter Perspective on Normalization and Market Microstructure

ArXiv ID: 2512.18648 · Authors: Sungwoo Kang

Abstract: We demonstrate that the choice of normalization for order flow intensity is fundamental to signal extraction in finance, not merely a technical detail. Through theoretical modeling, Monte Carlo simulation, and empirical validation using Korean market data, we prove that market capitalization normalization acts as a "matched filter" for informed trading signals, achieving 1.32–1.97× higher correlation with future returns than traditional trading-value normalization. The key insight is that informed traders scale positions by firm value (market capitalization), while noise traders respond to daily liquidity (trading volume), creating heteroskedastic corruption when order flow is normalized by trading volume. By reframing the normalization problem in signal processing terms, we show that dividing order flow by market capitalization preserves the information signal, whereas traditional volume normalization multiplies the signal by inverse turnover, a highly volatile quantity. Our theoretical predictions are robust across parameter specifications and validated by empirical evidence showing a 482% improvement in explanatory power. These findings have immediate implications for high-frequency trading algorithms, risk factor construction, and information-based trading strategies. ...
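The mechanism the abstract describes can be sketched with a toy simulation (all numbers below are invented for illustration, not the paper's model): informed flow is assumed to scale with market capitalization and noise flow with trading value, so dividing by market cap recovers the latent signal while dividing by trading value multiplies it by inverse turnover.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical cross-section: firm value and daily liquidity (volatile turnover)
market_cap = np.exp(rng.normal(10, 1, n))
trading_value = market_cap * rng.lognormal(-4, 1, n)

signal = rng.normal(0, 0.01, n)  # latent information
# Informed flow scales with market cap; noise flow scales with trading value
order_flow = signal * market_cap + rng.normal(0, 0.02, n) * trading_value
future_ret = signal + rng.normal(0, 0.01, n)

ofi_mcap = order_flow / market_cap      # "matched filter" normalization
ofi_tval = order_flow / trading_value   # traditional normalization (signal / turnover)

corr_mcap = np.corrcoef(ofi_mcap, future_ret)[0, 1]
corr_tval = np.corrcoef(ofi_tval, future_ret)[0, 1]
print(f"mcap-normalized corr: {corr_mcap:.2f}  tval-normalized corr: {corr_tval:.2f}")
```

Under these assumptions the market-cap normalization shows the higher correlation with future returns, because the trading-value version carries the extra inverse-turnover factor.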

December 21, 2025 · 2 min · Research Team

Increase Alpha: Performance and Risk of an AI-Driven Trading Framework

ArXiv ID: 2509.16707 · Authors: Sid Ghatak, Arman Khaledian, Navid Parvini, Nariman Khaledian

Abstract: There are inefficiencies in financial markets, with unexploited patterns in price, volume, and cross-sectional relationships. While many approaches use large-scale transformers, we take a domain-focused path: feed-forward and recurrent networks with curated features to capture subtle regularities in noisy financial data. This smaller-footprint design is computationally lean and reliable under low signal-to-noise conditions, which is crucial for daily production at scale. At Increase Alpha, we built a deep-learning framework that maps over 800 U.S. equities into daily directional signals with minimal computational overhead. The purpose of this paper is twofold. First, we give a general overview of the predictive model without disclosing its core underlying concepts. Second, we evaluate its real-time performance through transparent, industry-standard metrics. Forecast accuracy is benchmarked against both naive baselines and macro indicators. Performance is summarized via cumulative returns, annualized Sharpe ratio, and maximum drawdown. The best portfolio combination using our signals provides a low-risk, continuous stream of returns with a Sharpe ratio above 2.5, a maximum drawdown of around 3%, and near-zero correlation with the S&P 500 benchmark. We also compare the model's performance across different market regimes, such as the volatile movements of the U.S. equity market at the beginning of 2025. Our analysis showcases the robustness of the model and its notably stable performance during these volatile periods. Collectively, these findings show that market inefficiencies can be systematically harvested with modest computational overhead if the right variables are considered.
This report emphasizes the potential of traditional deep learning frameworks for generating an AI-driven edge in financial markets. ...
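The metrics the paper reports are industry standards. A minimal sketch of how annualized Sharpe ratio and maximum drawdown are computed from a daily return series (hypothetical data; risk-free rate assumed zero):

```python
import numpy as np

def sharpe_ratio(daily_returns, periods=252):
    """Annualized Sharpe ratio, assuming a zero risk-free rate."""
    r = np.asarray(daily_returns, dtype=float)
    return np.sqrt(periods) * r.mean() / r.std(ddof=1)

def max_drawdown(daily_returns):
    """Largest peak-to-trough decline of the cumulative equity curve."""
    equity = np.cumprod(1.0 + np.asarray(daily_returns, dtype=float))
    peak = np.maximum.accumulate(equity)
    return (equity / peak - 1.0).min()

# One year of hypothetical daily returns
rng = np.random.default_rng(1)
rets = rng.normal(0.0008, 0.005, 252)
print(f"Sharpe: {sharpe_ratio(rets):.2f}  MaxDD: {max_drawdown(rets):.3f}")
```

`max_drawdown` returns a negative number (e.g. a 3% drawdown is `-0.03`), which is why the minimum over the equity/peak ratio is taken.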

September 20, 2025 · 3 min · Research Team

A new measure of risk using Fourier analysis

ArXiv ID: 2408.10279 · Authors: Unknown

Abstract: We use Fourier analysis to assess risk in financial products, analyzing price changes of, e.g., stocks. Via Fourier analysis we scrutinize quantitatively whether the frequency of change is higher than a change in (conserved) company value would allow. If so, it is a clear indicator of speculation and, with it, risk. The method, or rather its application, is fairly new; however, previous attempts contained severe flaws, making the results (not the method) doubtful. We corrected these mistakes by, e.g., using Fourier transformation instead of discrete Fourier analysis. Our analysis is reliable across the entire frequency band, even for frequencies of 1/1d or higher if prices are recorded accordingly. For the stocks scrutinized we found that prices change disproportionately within one week, which clearly indicates speculation. An interesting extension would be to apply the method to cryptocurrencies, as these currencies have no conserved value, which makes normal considerations of volatility difficult. ...
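The basic procedure, transforming a series of price changes and asking how much variance sits at sub-weekly frequencies, can be sketched as follows. This is a generic FFT power spectrum on synthetic prices, not the authors' corrected method:

```python
import numpy as np

# Hypothetical daily closing prices (geometric random walk)
rng = np.random.default_rng(2)
prices = 100 * np.cumprod(1 + rng.normal(0, 0.01, 512))
changes = np.diff(np.log(prices))

# Power spectrum of log price changes; frequencies in cycles per trading day
freqs = np.fft.rfftfreq(changes.size, d=1.0)
power = np.abs(np.fft.rfft(changes)) ** 2 / changes.size

# Share of variance at periods shorter than one week (> 1/5 cycles per day)
weekly_share = power[freqs > 1 / 5].sum() / power[1:].sum()
print(f"sub-weekly variance share: {weekly_share:.2f}")
```

For a pure random walk the spectrum of changes is flat, so the sub-weekly share is roughly the width of that band; a markedly higher share in real data is the kind of excess the abstract flags as speculation.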

August 18, 2024 · 2 min · Research Team

Momentum Turning Points

ArXiv ID: ssrn-3489539 · Authors: Unknown

Abstract: Turning points are the Achilles' heel of time-series momentum portfolios. Slow signals fail to react quickly to changes in trend while fast signals are often fa…

Keywords: time-series momentum, portfolio optimization, trend following, signal processing, Quantitative Equity

Complexity vs Empirical Score
Math Complexity: 7.0/10 · Empirical Rigor: 8.0/10 · Quadrant: Holy Grail
Why: The paper employs a formal model to analyze momentum signals and derive analytical results, indicating moderate-to-high mathematical complexity, while its empirical analysis uses 50+ years of U.S. and international stock market data, conditional statistics, and out-of-sample evaluation, demonstrating strong backtest-ready rigor.

```mermaid
flowchart TD
    A["Research Goal: Optimize Time-Series Momentum<br>to Mitigate Turning Point Vulnerabilities"] --> B["Data & Inputs"]
    B --> C["Methodology: Signal Processing Framework"]
    B --> D["Asset Class: Global Futures<br>Period: 1985-2020"]
    B --> E["Signal Construction:<br>Fast vs Slow Moving Averages"]
    C --> F["Process: Change-Point Detection<br>Bayesian Online Changepoint Detection"]
    C --> G["Process: Regime Switching<br>Adaptive Momentum Weights"]
    F --> H["Outcome: Reduced Drawdowns<br>at Trend Reversals"]
    G --> H
    H --> I["Key Findings: 1) Signal momentum and<br>volatility are negatively correlated 2) Fast signals<br>capture trend starts; Slow signals reduce noise<br>3) Adaptive regime-switching outperforms static<br>portfolios by 4-6% annual return"]
```
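The fast-versus-slow signal construction in the diagram can be illustrated with a simple sketch on synthetic prices. The `ema` helper and the span choices are illustrative, and a static 50/50 blend stands in for the paper's adaptive regime-switching weights:

```python
import numpy as np

def ema(x, span):
    """Exponential moving average with a pandas-style span parameter."""
    alpha = 2.0 / (span + 1.0)
    out = np.empty(len(x), dtype=float)
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
    return out

# Hypothetical daily prices with a mild upward drift
rng = np.random.default_rng(3)
prices = 100 * np.cumprod(1 + rng.normal(0.0003, 0.01, 500))

fast = np.sign(ema(prices, 20) - ema(prices, 50))   # reacts quickly at turning points
slow = np.sign(ema(prices, 50) - ema(prices, 200))  # filters noise in steady trends
blended = 0.5 * fast + 0.5 * slow                   # static blend of the two signals
print(f"current position: {blended[-1]:+.1f}")
```

Each signal is a position in {-1, 0, +1}, so the blend lives in [-1, +1]; the paper's contribution is making the blend weights respond to detected regime changes rather than keeping them fixed.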

December 5, 2019 · 1 min · Research Team