
Financial Information Theory

ArXiv ID: 2511.16339 · View on arXiv
Authors: Miquel Noguer i Alonso

Abstract: This paper introduces a comprehensive framework for Financial Information Theory by applying information-theoretic concepts such as entropy, Kullback-Leibler divergence, mutual information, normalized mutual information, and transfer entropy to financial time series. We systematically derive these measures with complete mathematical proofs, establish their theoretical properties, and propose practical algorithms for estimation. Using S&P 500 data from 2000 to 2025, we demonstrate empirical usefulness for regime detection, market efficiency testing, and portfolio construction. We show that normalized mutual information (NMI) behaves as a powerful, bounded, and interpretable measure of temporal dependence, highlighting periods of structural change such as the 2008 financial crisis and the COVID-19 shock. Our entropy-adjusted Value at Risk, information-theoretic diversification criterion, and NMI-based market efficiency test provide actionable tools for risk management and asset allocation. We interpret NMI as a quantitative diagnostic of the Efficient Market Hypothesis and demonstrate that information-theoretic methods offer superior regime detection compared to traditional autocorrelation- or volatility-based approaches. All theoretical results include rigorous proofs, and empirical findings are validated across multiple market regimes spanning 25 years of daily returns. ...
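To make the NMI-as-efficiency-diagnostic idea concrete, here is a minimal sketch of estimating NMI between a return series and its one-day lag via histogram binning. The bin count, the sqrt-of-entropies normalization, and the synthetic stand-in for S&P 500 returns are illustrative assumptions, not the paper's estimator:

```python
import numpy as np

def normalized_mutual_info(x, y, bins=10):
    """Histogram-based NMI estimate between two series, bounded in [0, 1]."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    # Shannon entropies (natural log); the p > 0 masks treat 0*log(0) as 0
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
    mi = max(hx + hy - hxy, 0.0)  # guard against float round-off
    return mi / np.sqrt(hx * hy) if hx > 0 and hy > 0 else 0.0

# Temporal dependence: NMI between returns and their one-day lag.
rng = np.random.default_rng(0)
returns = rng.standard_normal(5000)  # i.i.d. stand-in for daily returns
print(normalized_mutual_info(returns[1:], returns[:-1]))
```

Read through the abstract's lens: under the Efficient Market Hypothesis, NMI between returns and their lag should sit near zero (as it does for the i.i.d. series above), while sustained departures would flag temporal dependence and structural change.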

November 20, 2025 · 2 min · Research Team

Entropy corrected geometric Brownian motion

ArXiv ID: 2403.06253 · View on arXiv
Authors: Unknown

Abstract: The geometric Brownian motion (GBM) is widely employed for modeling stochastic processes, yet its solutions are characterized by the log-normal distribution. This compromises the predictive capabilities of GBM, mainly in forecasting applications. Here, entropy corrections to GBM are proposed to go beyond log-normality restrictions and better account for the intricacies of real systems. It is shown that GBM solutions can be effectively refined by arguing that entropy is reduced when the deterministic content of the considered data increases. Notable improvements over conventional GBM are observed for several cases of non-log-normal distributions, ranging from a dice-roll experiment to real-world data. ...
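For context on the log-normality restriction the abstract targets: the exact solution of GBM, dS_t = mu S_t dt + sigma S_t dW_t, is S_t = S_0 exp((mu - sigma^2/2) t + sigma W_t), so log S_t is Gaussian by construction. A minimal simulation sketch of this baseline (parameters are illustrative, not from the paper, and the entropy correction itself is not reproduced here):

```python
import numpy as np

# Exact GBM solution: S_t = S_0 * exp((mu - sigma^2/2) t + sigma W_t),
# hence log(S_t) is normal, i.e., S_t is log-normally distributed.
rng = np.random.default_rng(0)
mu, sigma, s0, t, n = 0.05, 0.2, 100.0, 1.0, 100_000
w_t = rng.standard_normal(n) * np.sqrt(t)  # W_t ~ N(0, t)
s_t = s0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * w_t)

log_s = np.log(s_t)
print("mean of log S_T:", log_s.mean())  # ~ log(s0) + (mu - sigma^2/2) t
print("std  of log S_T:", log_s.std())   # ~ sigma * sqrt(t)
```

Whatever the data, this baseline can only ever produce the log-normal terminal distribution checked above, which is the rigidity the paper's entropy corrections are meant to relax.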

March 10, 2024 · 2 min · Research Team