
Marketron games: Self-propelling stocks vs dumb money and metastable dynamics of the Good, Bad and Ugly markets

ArXiv ID: 2501.12676 · View on arXiv · Authors: Unknown

Abstract: We present a model of price formation in an inelastic market whose dynamics are partially driven by both money flows and their impact on asset prices. The money flow to the market is viewed as an investment policy of outside investors. For the price impact effect, we use an impact function that incorporates the phenomena of market inelasticity and saturation from new money (the $dumb\;money$ effect). Due to the dependence of market investors’ flows on market performance, the model implies a feedback mechanism that gives rise to nonlinear dynamics. Consequently, the market price dynamics are seen as a nonlinear diffusion of a particle (the $marketron$) in a two-dimensional space formed by the log-price $x$ and a memory variable $y$. The latter stores information about past money flows, so that the dynamics are non-Markovian in the log-price $x$ alone, but Markovian in the pair $(x,y)$, bearing a strong resemblance to spiking neuron models in neuroscience. In addition to market flows, the model dynamics are partially driven by return predictors, modeled as unobservable Ornstein-Uhlenbeck processes. By using a new interpretation of predictive signals as $self$-$propulsion$ components of the price dynamics, we treat the marketron as an active particle, amenable to methods developed in the physics of active matter. We show that, depending on the choice of parameters, our model can produce a rich variety of interesting dynamic scenarios. In particular, it predicts three distinct regimes of the market, which we call the $Good$, the $Bad$, and the $Ugly$ markets. The latter regime describes a scenario of a total market collapse or, alternatively, a corporate default event, depending on whether our model is applied to the whole market or an individual stock. ...
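
The abstract does not reproduce the model's equations, but a minimal toy simulation conveys the structure it describes: a log-price $x$ coupled to a memory variable $y$ that accumulates past money flows, with an unobservable Ornstein-Uhlenbeck predictor acting as a self-propulsion term and a saturating impact function. All functional forms and parameter values below are illustrative assumptions, not the paper's specification.

```python
# Toy 2D "marketron"-style simulation: log-price x, flow-memory y, OU signal z.
# Dynamics and parameters are illustrative guesses, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
T, dt = 10.0, 1e-3
n = int(T / dt)

kappa, theta, sigma_z = 2.0, 0.0, 0.3   # OU predictor (self-propulsion signal)
lam, beta = 1.0, 0.5                    # memory decay and flow response strength
eta, sigma_x = 0.8, 0.2                 # price impact of flows and price noise

x, y, z = np.zeros(n), np.zeros(n), np.zeros(n)
for t in range(n - 1):
    dWx, dWz = rng.normal(0.0, np.sqrt(dt), 2)
    # flows respond to the signal and, as a crude performance proxy, to y itself
    flow = z[t] + beta * np.tanh(y[t])
    z[t + 1] = z[t] + kappa * (theta - z[t]) * dt + sigma_z * dWz
    y[t + 1] = y[t] + (-lam * y[t] + flow) * dt                   # memory of past flows
    x[t + 1] = x[t] + eta * np.tanh(y[t]) * dt + sigma_x * dWx    # inelastic, saturating impact

print(f"final log-price {x[-1]:.3f}, flow memory {y[-1]:.3f}")
```

In this sketch the price drift depends on the accumulated-flow state $y$ (and the latent signal), so the log-price alone is non-Markovian while the augmented state is, mirroring the $(x,y)$ construction the abstract emphasizes.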

January 22, 2025 · 3 min · Research Team

Optimal Rebate Design: Incentives, Competition and Efficiency in Auction Markets

ArXiv ID: 2501.12591 · View on arXiv · Authors: Unknown

Abstract: This study explores the design of an efficient rebate policy in auction markets, focusing on a continuous-time setting with competition among market participants. In this model, a stock exchange collects transaction fees from auction investors executing block trades to buy or sell a risky asset, then redistributes these fees as rebates to competing market makers submitting limit orders. Market makers influence both the price at which the asset trades and their arrival intensity in the auction. We frame this problem as a principal-multi-agent problem and provide necessary and sufficient conditions to characterize the Nash equilibrium among market makers. The exchange’s optimization problem is formulated as a high-dimensional Hamilton-Jacobi-Bellman equation with Poisson jump processes, which is solved using a verification result. To numerically compute the optimal rebate and transaction fee policies, we apply the Deep BSDE method. Our results show that optimal transaction fees and rebate structures improve market efficiency by narrowing the spread between the auction clearing price and the asset’s fundamental value, while ensuring a minimal gain for both market makers indexed on the price of the asset on a coexisting limit order book. ...
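
As a rough intuition pump for the fee/rebate trade-off described above, the toy Monte Carlo below lets market-maker arrivals follow a Poisson distribution whose mean increases with the rebate, and scores the exchange on fee income minus rebate payments minus an inefficiency cost that shrinks as more makers compete. The linear intensity response, the cost term and every parameter are assumptions made for illustration; the paper's actual principal-multi-agent formulation and its Deep BSDE solution are far richer.

```python
# Toy sketch of the exchange's fee/rebate trade-off. Not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
n_trades, fee = 50, 0.10              # block trades per auction day and fee per trade
base_rate, slope = 5.0, 20.0          # assumed maker arrival response to the rebate
inefficiency = 10.0                   # cost scale for a wide clearing-price spread

def exchange_objective(rebate, n_paths=20_000):
    makers = rng.poisson(base_rate + slope * rebate, n_paths)
    cost = inefficiency / (1.0 + makers)          # fewer makers -> wider spread
    return np.mean(n_trades * fee - rebate * makers - cost)

rebates = np.linspace(0.0, fee, 41)
best = max(rebates, key=exchange_objective)
print(f"toy optimal rebate ≈ {best:.3f} per maker arrival (fee = {fee})")
```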

January 22, 2025 · 2 min · Research Team

Optimal vs. Naive Diversification in the Cryptocurrencies Market: The Role of Time-Varying Moments and Transaction Costs

ArXiv ID: 2501.12841 · View on arXiv · Authors: Unknown

Abstract: This study investigates three central questions in portfolio optimization. First, whether time-varying moment estimators outperform conventional sample estimators in practical portfolio construction. Second, whether incorporating a turnover penalty into the optimization objective can improve out-of-sample performance. Third, what type of optimal portfolio strategies can consistently outperform the naive 1/N benchmark. Using empirical evidence from the cryptocurrencies market, this paper provides comprehensive answers to these questions. In the process, several additional findings are uncovered, offering further insights into the dynamics of portfolio construction in highly volatile asset classes. ...
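
A small sketch of the second question (does a turnover penalty help?) can be written directly from the abstract: add an L1 penalty on the distance to current holdings to a mean-variance objective and compare the result with the 1/N benchmark. The simulated returns, the penalty form and the parameter values are placeholders, and the paper's time-varying moment estimators (e.g. EWMA-type) would replace the plain sample moments used here.

```python
# Mean-variance weights with a turnover penalty vs the naive 1/N portfolio.
# Returns are simulated; penalty form and parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_assets = 5
R = rng.normal(0.001, 0.03, size=(500, n_assets))    # placeholder daily returns

mu, Sigma = R.mean(axis=0), np.cov(R, rowvar=False)   # sample moments (could be EWMA)
w_prev = np.full(n_assets, 1.0 / n_assets)            # current holdings (start at 1/N)
gamma, tau = 5.0, 0.25                                # risk aversion, turnover penalty

def objective(w):
    return -(w @ mu) + 0.5 * gamma * w @ Sigma @ w + tau * np.abs(w - w_prev).sum()

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
res = minimize(objective, w_prev, bounds=[(0.0, 1.0)] * n_assets, constraints=cons)
print("penalised weights:", np.round(res.x, 3))
print("naive 1/N weights:", np.round(w_prev, 3))
```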

January 22, 2025 · 2 min · Research Team

Optimizing Portfolio Performance through Clustering and Sharpe Ratio-Based Optimization: A Comparative Backtesting Approach

ArXiv ID: 2501.12074 · View on arXiv · Authors: Unknown

Abstract: Optimizing portfolio performance is a fundamental challenge in financial modeling, requiring the integration of advanced clustering techniques and data-driven optimization strategies. This paper introduces a comparative backtesting approach that combines clustering-based portfolio segmentation and Sharpe ratio-based optimization to enhance investment decision-making. First, we segment a diverse set of financial assets into clusters based on their historical log-returns using K-Means clustering. This segmentation enables the grouping of assets with similar return characteristics, facilitating targeted portfolio construction. Next, for each cluster, we apply a Sharpe ratio-based optimization model to derive optimal weights that maximize risk-adjusted returns. Unlike traditional mean-variance optimization, this approach directly incorporates the trade-off between returns and volatility, resulting in a more balanced allocation of resources within each cluster. The proposed framework is evaluated through a backtesting study using historical data spanning multiple asset classes. Optimized portfolios for each cluster are constructed and their cumulative returns are compared over time against a traditional equal-weighted benchmark portfolio. ...
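
The two-step pipeline described in the abstract maps naturally to a short script: cluster assets on their historical log-returns with K-Means, then solve a Sharpe-ratio-maximising weight problem inside each cluster. The sketch below uses simulated returns, an arbitrary cluster count and a generic SLSQP solver, so it illustrates the structure rather than reproducing the paper's backtest.

```python
# Step 1: K-Means on log-return series. Step 2: per-cluster Sharpe maximisation.
import numpy as np
from scipy.optimize import minimize
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_days, n_assets = 250, 12
log_ret = rng.normal(0.0005, 0.02, size=(n_days, n_assets))   # placeholder log-returns

# Step 1: cluster assets by their historical log-return series (one point per asset).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(log_ret.T)

# Step 2: maximise the in-sample Sharpe ratio within each cluster.
def max_sharpe_weights(ret):
    k = ret.shape[1]
    if k == 1:
        return np.array([1.0])
    mu, Sigma = ret.mean(axis=0), np.cov(ret, rowvar=False)
    neg_sharpe = lambda w: -(w @ mu) / np.sqrt(w @ Sigma @ w + 1e-12)
    res = minimize(neg_sharpe, np.full(k, 1.0 / k), bounds=[(0.0, 1.0)] * k,
                   constraints=({"type": "eq", "fun": lambda w: w.sum() - 1.0},))
    return res.x

for c in range(3):
    members = np.where(labels == c)[0]
    w = max_sharpe_weights(log_ret[:, members])
    print(f"cluster {c}: assets {members.tolist()}, weights {np.round(w, 3)}")
```

A full comparison would then chain the per-cluster portfolios over time and backtest their cumulative returns against the equal-weighted benchmark, as the abstract describes.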

January 21, 2025 · 2 min · Research Team

Defaultable bond liquidity spread estimation: an option-based approach

ArXiv ID: 2501.11427 · View on arXiv · Authors: Unknown

Abstract: This paper extends an option-theoretic approach to estimate liquidity spreads for corporate bonds. Inspired by Longstaff’s equity market framework and subsequent work by Koziol and Sauerbier on risk-free zero-coupon bonds, the model views liquidity as a look-back option. The model accounts for the interplay of risk-free rate volatility and credit risk. A numerical analysis highlights the impact of these factors on the liquidity spread, particularly for bonds with different maturities and credit ratings. The methodology is applied to estimate the liquidity spread for unquoted bonds, with a specific case study on the Republic of Italy’s debt, leveraging market data to calibrate model parameters and classify liquid versus illiquid issuances. This approach provides a robust tool for pricing illiquid bonds, emphasizing the importance of marketability in debt security valuation. ...
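
To make the "liquidity as a look-back option" idea concrete, the minimal Longstaff-style Monte Carlo below values the right to sell at the period maximum instead of the terminal price, and converts that value into a continuously compounded spread. It deliberately ignores the paper's key extensions (stochastic risk-free rates and credit risk) and uses geometric Brownian motion with made-up parameters.

```python
# Look-back (marketability) option value and implied liquidity spread, toy version.
import numpy as np

rng = np.random.default_rng(4)
S0, sigma, r = 100.0, 0.08, 0.03          # price level, volatility, risk-free rate
T, n_steps, n_paths = 1.0, 252, 20_000    # non-marketability horizon and grid

dt = T / n_steps
z = rng.normal(size=(n_paths, n_steps))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
S = S0 * np.exp(np.hstack([np.zeros((n_paths, 1)), log_paths]))

# value of selling at the running maximum rather than at the terminal price
lookback_value = np.exp(-r * T) * np.mean(S.max(axis=1) - S[:, -1])
liquidity_spread = -np.log(1.0 - lookback_value / S0) / T   # continuously compounded
print(f"look-back value ≈ {lookback_value:.2f}, spread ≈ {1e4 * liquidity_spread:.0f} bp")
```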

January 20, 2025 · 2 min · Research Team

Mean-Field Limits for Nearly Unstable Hawkes Processes

ArXiv ID: 2501.11648 · View on arXiv · Authors: Unknown

Abstract: In this paper, we establish general scaling limits for nearly unstable Hawkes processes in a mean-field regime by extending the method introduced by Jaisson and Rosenbaum. Under a mild asymptotic criticality condition on the self-exciting kernels $\{\varphi^n\}$, specifically $\|\varphi^n\|_{L^1} \to 1$, we first show that the scaling limits of these Hawkes processes are necessarily stochastic Volterra diffusions of affine type. Moreover, we establish a propagation of chaos result for Hawkes systems with mean-field interactions, highlighting three distinct regimes for the limiting processes, which depend on the asymptotics of $n(1-\|\varphi^n\|_{L^1})^2$. These results provide a significant generalization of the findings by Delattre, Fournier and Hoffmann. ...
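
A quick way to see what "nearly unstable" means is to simulate a Hawkes process whose exponential kernel has $L^1$ norm just below one, so the stationary mean intensity $\mu/(1-\|\varphi\|_{L^1})$ becomes large. The thinning simulation below uses an exponential kernel and arbitrary parameters purely for illustration; the paper works with general kernels and studies the scaling limits themselves, which are not simulated here.

```python
# Ogata-style thinning simulation of a nearly unstable Hawkes process
# with exponential kernel phi(t) = alpha * beta * exp(-beta * t), ||phi||_{L1} = alpha.
import numpy as np

rng = np.random.default_rng(5)
mu, alpha, beta = 0.5, 0.95, 1.0     # baseline, branching ratio close to 1, decay
T = 100.0

def intensity(t, events):
    past = np.asarray(events)
    past = past[past < t]
    return mu + alpha * beta * np.exp(-beta * (t - past)).sum()

t, events = 0.0, []
while t < T:
    lam_bar = intensity(t, events) + alpha * beta   # valid upper bound on [t, next candidate]
    t += rng.exponential(1.0 / lam_bar)
    if t < T and rng.uniform() < intensity(t, events) / lam_bar:
        events.append(t)

print(f"{len(events)} events on [0, {T:.0f}]; mean rate ≈ {len(events) / T:.2f} "
      f"(stationary value mu/(1-alpha) = {mu / (1 - alpha):.2f})")
```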

January 20, 2025 · 2 min · Research Team

A statistical technique for cleaning option price data

ArXiv ID: 2501.11164 · View on arXiv · Authors: Unknown

Abstract: Recorded option pricing datasets are not always freely available. Additionally, these datasets often contain numerous prices which are either higher or lower than can reasonably be expected. Various reasons for these unexpected observations are possible, including human error in the recording of the details associated with the option in question. In order for the analyses performed on these datasets to be reliable, it is necessary to identify and remove these options from the dataset. In this paper, we list three distinct problems often found in recorded option price datasets alongside means of addressing these. The methods used are justified using sound statistical reasoning and remove option prices violating the standard assumption of no arbitrage. An attractive aspect of the proposed technique is that no option pricing model-based assumptions are used. Although the discussion is restricted to European options, the procedure is easily modified for use with exotic options as well. As a final contribution, the paper contains a link to six option pricing datasets which have already been cleaned using the proposed methods and can be freely used by researchers. ...
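
The paper's cleaning rules are statistical and model-free; the sketch below illustrates the no-arbitrage part of that idea for European calls by flagging quotes that violate static price bounds, monotonicity and convexity in strike. The specific checks, tolerances and sample quotes are assumptions for illustration and may differ from the paper's three procedures.

```python
# Model-free screening of European call quotes for static arbitrage violations.
import numpy as np

def clean_calls(strikes, prices, spot, r, T):
    """Return a boolean mask of quotes consistent with basic no-arbitrage checks."""
    K, C = np.asarray(strikes, float), np.asarray(prices, float)
    order = np.argsort(K)
    K, C = K[order], C[order]
    disc = np.exp(-r * T)
    keep = (C >= np.maximum(spot - disc * K, 0.0)) & (C <= spot)   # price bounds
    dC = np.diff(C) / np.diff(K)
    keep[1:] &= dC <= 0.0                      # call prices non-increasing in strike
    keep[1:] &= dC >= -disc                    # slope bounded below by -e^{-rT}
    curv = np.diff(dC)
    keep[1:-1] &= curv >= -1e-12               # convexity in strike (no butterfly arbitrage)
    return keep[np.argsort(order)]             # map back to the input ordering

strikes = [80, 90, 100, 110, 120]
prices = [22.0, 14.5, 8.9, 9.5, 2.1]           # the 110-strike quote is suspicious
mask = clean_calls(strikes, prices, spot=100.0, r=0.02, T=0.5)
print(dict(zip(strikes, mask.tolist())))
```

On this toy chain only the 110-strike quote is flagged, since it makes the call curve locally increasing and non-convex in strike.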

January 19, 2025 · 2 min · Research Team

Crossing penalised CAViaR

ArXiv ID: 2501.10564 · View on arXiv · Authors: Unknown

Abstract: Dynamic quantiles, or Conditional Autoregressive Value at Risk (CAViaR) models, have been extensively studied at the individual level. However, efforts to estimate multiple dynamic quantiles jointly have been limited. Existing approaches either sequentially estimate fitted quantiles or impose restrictive assumptions on the data generating process. This paper fills this gap by proposing an objective function for the joint estimation of all quantiles, introducing a crossing penalty to guide the process. Monte Carlo experiments and an empirical application on the FTSE100 validate the effectiveness of the method, offering a flexible and robust approach to modelling multiple dynamic quantiles in time-series data. ...
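
The core idea, a joint objective summing pinball losses across quantile levels plus a penalty on quantile crossings, can be sketched in a few lines. The SAV-type recursion, the penalty weight, the initialisation and the derivative-free optimiser below are all choices made for illustration rather than the paper's estimator.

```python
# Joint estimation of two SAV-CAViaR quantile recursions with a crossing penalty.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
y = rng.standard_t(df=5, size=1000) * 0.01          # placeholder return series
taus = (0.05, 0.25)                                  # levels: q_0.05 should stay below q_0.25
penalty_weight = 10.0

def fitted_quantiles(params):
    q = np.empty((len(y), len(taus)))
    for j in range(len(taus)):
        b0, b1, b2 = params[3 * j: 3 * j + 3]
        q[0, j] = np.quantile(y[:50], taus[j])       # crude initialisation
        for t in range(1, len(y)):
            q[t, j] = b0 + b1 * q[t - 1, j] + b2 * abs(y[t - 1])   # SAV recursion
    return q

def loss(params):
    q = fitted_quantiles(params)
    pinball = sum(np.mean(np.maximum(taus[j] * (y - q[:, j]),
                                     (taus[j] - 1) * (y - q[:, j])))
                  for j in range(len(taus)))
    crossing = np.mean(np.maximum(q[:, 0] - q[:, 1], 0.0))   # penalise q_0.05 above q_0.25
    return pinball + penalty_weight * crossing

x0 = np.array([-0.001, 0.8, -0.5, -0.001, 0.8, -0.2])
res = minimize(loss, x0, method="Nelder-Mead", options={"maxiter": 2000})
print("jointly estimated parameters:", np.round(res.x, 4))
```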

January 17, 2025 · 2 min · Research Team

Institutional Adoption and Correlation Dynamics: Bitcoin's Evolving Role in Financial Markets

ArXiv ID: 2501.09911 · View on arXiv · Authors: Unknown

Abstract: Bitcoin, widely recognized as the first cryptocurrency, has shown increasing integration with traditional financial markets, particularly major U.S. equity indices, amid accelerating institutional adoption. This study examines how Bitcoin exchange-traded funds and corporate Bitcoin holdings affect correlations with the Nasdaq 100 and the S&P 500, using rolling-window correlation, static correlation coefficients, and an event-study framework on daily data from 2018 to 2025. Correlation levels intensified following key institutional milestones, with peaks reaching 0.87 in 2024, and they vary across market regimes. These trends suggest that Bitcoin has transitioned from an alternative asset toward a more integrated financial instrument, carrying implications for portfolio diversification, risk management, and systemic stability. Future research should further investigate regulatory and macroeconomic factors shaping these evolving relationships. ...
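
The rolling-window and event-study computations mentioned above are straightforward with pandas; the sketch below runs them on simulated returns. The 90-day window, the column names and the event date (used here as a stand-in for an institutional milestone such as a spot-ETF approval) are placeholders, not the study's data or results.

```python
# Rolling correlation and a simple pre/post event comparison on simulated returns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
dates = pd.bdate_range("2018-01-01", "2025-01-01")
df = pd.DataFrame({"BTC": rng.normal(0, 0.04, len(dates)),
                   "NDX": rng.normal(0, 0.015, len(dates))}, index=dates)

rolling_corr = df["BTC"].rolling(90).corr(df["NDX"])        # 90-day rolling correlation

event = pd.Timestamp("2024-01-10")                           # placeholder milestone date
pre = df.loc[event - pd.Timedelta(days=180): event]
post = df.loc[event: event + pd.Timedelta(days=180)]
print(f"pre-event corr  {pre['BTC'].corr(pre['NDX']):+.2f}")
print(f"post-event corr {post['BTC'].corr(post['NDX']):+.2f}")
print(f"latest 90-day rolling corr {rolling_corr.iloc[-1]:+.2f}")
```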

January 17, 2025 · 2 min · Research Team

Lead Times in Flux: Analyzing Airbnb Booking Dynamics During Global Upheavals (2018-2022)

ArXiv ID: 2501.10535 · View on arXiv · Authors: Unknown

Abstract: Short-term shifts in booking behaviors can disrupt forecasting in the travel and hospitality industry, especially during global crises. Traditional metrics like average or median lead times often overlook important distribution changes. This study introduces a normalized L1 (Manhattan) distance to assess Airbnb booking lead time divergences from 2018 to 2022, focusing on the COVID-19 pandemic across four major U.S. cities. We identify a two-phase disruption: an abrupt change at the pandemic’s onset followed by partial recovery with persistent deviations from pre-2018 patterns. Our method reveals changes in travelers’ planning horizons that standard statistics miss, highlighting the need to analyze the entire lead-time distribution for more accurate demand forecasting and pricing strategies. The normalized L1 metric provides valuable insights for tourism stakeholders navigating ongoing market volatility. ...
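
A normalized L1 (Manhattan) distance between two lead-time distributions can be computed by binning the lead times, normalising each histogram to a probability vector, and summing the absolute differences; the 1/2 factor below keeps the distance in [0, 1]. The binning scheme, the 1/2 convention and the simulated gamma-distributed lead times are assumptions, since the abstract does not spell out the exact definition.

```python
# Normalised L1 distance between two booking lead-time distributions.
import numpy as np

def normalized_l1(lead_a, lead_b, bins=np.arange(0, 366, 7)):
    pa, _ = np.histogram(lead_a, bins=bins)
    pb, _ = np.histogram(lead_b, bins=bins)
    pa = pa / pa.sum()
    pb = pb / pb.sum()
    return 0.5 * np.abs(pa - pb).sum()          # 0 = identical, 1 = disjoint

rng = np.random.default_rng(8)
pre_pandemic = rng.gamma(shape=2.0, scale=30.0, size=5000)    # longer planning horizons
onset = rng.gamma(shape=1.2, scale=10.0, size=5000)           # last-minute bookings
print(f"divergence at onset ≈ {normalized_l1(pre_pandemic, onset):.2f}")
```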

January 17, 2025 · 2 min · Research Team