Entropy corrected geometric Brownian motion

ArXiv ID: 2403.06253

Authors: Unknown

Abstract

The geometric Brownian motion (GBM) is widely employed for modeling stochastic processes, yet its solutions are characterized by the log-normal distribution. This compromises the predictive capabilities of GBM, mainly in forecasting applications. Here, entropy corrections to GBM are proposed to go beyond log-normality restrictions and better account for the intricacies of real systems. It is shown that GBM solutions can be effectively refined by arguing that entropy is reduced when the deterministic content of the considered data increases. Notable improvements over conventional GBM are observed for several cases of non-log-normal distributions, ranging from a dice-roll experiment to real-world data.
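To make the log-normality restriction concrete, the following is a minimal Monte Carlo sketch of standard (uncorrected) GBM using its exact solution, S_t = S_0 exp((μ − σ²/2)t + σW_t). The parameter values are illustrative and not taken from the paper; the paper's entropy correction itself is not reproduced here.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the paper).
rng = np.random.default_rng(0)
s0, mu, sigma, t = 100.0, 0.05, 0.2, 1.0
n_paths = 100_000

# Exact GBM solution at time t: S_t = S_0 * exp((mu - sigma^2/2) t + sigma W_t),
# where W_t ~ Normal(0, t) is the terminal value of a Brownian motion.
w_t = rng.normal(0.0, np.sqrt(t), size=n_paths)
s_t = s0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * w_t)

# Log-normality restriction: log(S_t) is exactly normal with
# mean log(S_0) + (mu - sigma^2/2) t and standard deviation sigma * sqrt(t),
# so S_t itself is always right-skewed -- the limitation the paper targets.
log_s = np.log(s_t)
print(log_s.mean(), log_s.std())
```

Whatever empirical distribution the data actually follows, plain GBM can only ever produce this log-normal shape, which is the motivation for the entropy-based refinement.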

Keywords: Geometric Brownian Motion (GBM), Entropy, Stochastic Processes, Log-normality, Forecasting, Equities

Complexity vs Empirical Score

  • Math Complexity: 7.5/10
  • Empirical Rigor: 3.5/10
  • Quadrant: Lab Rats
  • Why: The paper introduces a method that requires deriving entropy metrics and simulating Monte Carlo trajectories, indicating significant mathematical density. However, the empirical validation relies on conceptual examples (dice rolls) and qualitative visualizations rather than statistical metrics, backtesting, or robust datasets, limiting its backtest-readiness.
Flowchart

```mermaid
flowchart TD
  A["Research Goal:<br>Model stochastic processes<br>beyond log-normality"] --> B["Methodology:<br>Entropy Correction to GBM"]
  B --> C["Inputs:<br>Dice Rolls &<br>Equities Data"]
  C --> D["Process:<br>Refine GBM by reducing<br>entropy with increased data"]
  D --> E["Outcome:<br>Effective predictions for<br>non-log-normal distributions"]
```