
Temporal Relational Reasoning of Large Language Models for Detecting Stock Portfolio Crashes

ArXiv ID: 2410.17266 · View on arXiv · Authors: Unknown
Abstract: Stock portfolios are often exposed to rare consequential events (e.g., the 2007 global financial crisis, the 2020 COVID-19 stock market crash) for which there is not enough historical information to learn from. Large Language Models (LLMs) now present a possible tool to tackle this problem, as they can generalize across their large corpus of training data and perform zero-shot reasoning on new events, allowing them to detect possible portfolio crash events without requiring specific training data. However, detecting portfolio crashes is a complex problem that requires more than reasoning abilities. Investors need to dynamically process the impact of each new piece of information found in news articles, analyze the relational network of impacts across different events and portfolio stocks, and understand the temporal context between impacts across time steps, in order to obtain the aggregated impact on the target portfolio. In this work, we propose an algorithmic framework named Temporal Relational Reasoning (TRR). It seeks to emulate the spectrum of human cognitive capabilities used for complex problem-solving, including brainstorming, memory, attention, and reasoning. Through extensive experiments, we show that TRR outperforms state-of-the-art techniques on detecting stock portfolio crashes, and an ablation study demonstrates how each of the proposed components contributes to its performance. Additionally, we further explore possible applications of TRR by extending it to other related complex problems, such as the detection of possible global crisis events in macroeconomics. ...
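
As a rough illustration of the event-to-portfolio aggregation the abstract describes, here is a minimal sketch of a TRR-style detection loop. It assumes a hypothetical `llm_impact_score` stand-in for the zero-shot LLM call and a simple rolling-window memory; the paper's actual brainstorming, attention, and reasoning components are not reproduced here.

```python
# Minimal sketch of a TRR-style pipeline (not the paper's implementation).
from collections import defaultdict, deque

def llm_impact_score(headline: str, ticker: str) -> float:
    """Hypothetical placeholder for a zero-shot LLM judgement of how `headline`
    affects `ticker` (negative = harmful). Replace with a real LLM call."""
    return -0.8 if "pandemic" in headline.lower() else 0.0

def detect_portfolio_crash(daily_news, portfolio, window=5, threshold=-0.5):
    """Aggregate per-stock news impacts over a rolling time window ("memory")
    and flag days whose averaged portfolio-level impact falls below `threshold`."""
    memory = deque(maxlen=window)          # temporal context across time steps
    alerts = []
    for day, headlines in daily_news:
        impacts = defaultdict(float)       # relational map: stock -> accumulated impact
        for h in headlines:
            for ticker in portfolio:
                impacts[ticker] += llm_impact_score(h, ticker)
        day_score = sum(impacts.values()) / max(len(portfolio), 1)
        memory.append(day_score)
        if sum(memory) / len(memory) < threshold:
            alerts.append(day)
    return alerts

news = [("2020-03-09", ["Pandemic fears trigger global sell-off"])]
print(detect_portfolio_crash(news, ["AAPL", "XOM"]))
```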

October 7, 2024 · 2 min · Research Team

A Comparison between Financial and Gambling Markets

ArXiv ID: 2409.13528 · View on arXiv · Authors: Unknown
Abstract: Financial and gambling markets are ostensibly similar and hence strategies from one could potentially be applied to the other. Financial markets have been extensively studied, resulting in numerous theorems and models, while gambling markets have received comparatively less attention and remain relatively undocumented. This study conducts a comprehensive comparison of both markets, focusing on trading rather than regulation. Five key aspects are examined: platform, product, procedure, participant and strategy. The findings reveal numerous similarities between these two markets. Financial exchanges resemble online betting platforms, such as Betfair, and some financial products, including stocks and options, share speculative traits with sports betting. We examine whether well-established models and strategies from financial markets could be applied to the gambling industry, which lacks comparable frameworks. For example, statistical arbitrage from financial markets has been effectively applied to gambling markets, particularly in peer-to-peer betting exchanges, where bettors exploit odds discrepancies for risk-free profits using quantitative models. Therefore, exploring the strategies and approaches used in both markets could lead to new opportunities for innovation and optimization in trading and betting activities. ...
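
The odds-discrepancy arbitrage mentioned in the abstract can be sketched in a few lines. The check below assumes frictionless decimal odds with no commission; the numbers are illustrative and not taken from the paper.

```python
# Sketch of a cross-platform odds-arbitrage check under simplified assumptions
# (decimal odds, no commission, one unit of total stake).

def find_arbitrage(odds_by_outcome):
    """odds_by_outcome: {outcome: best decimal odds available across platforms}.
    Returns stakes summing to 1 that lock in a profit, or None if no arb exists."""
    inv_sum = sum(1.0 / o for o in odds_by_outcome.values())
    if inv_sum >= 1.0:                       # implied probabilities >= 100%: no arb
        return None
    stakes = {k: (1.0 / o) / inv_sum for k, o in odds_by_outcome.items()}
    guaranteed_return = 1.0 / inv_sum - 1.0  # same payoff whichever outcome wins
    return stakes, guaranteed_return

# Example: back "home" at 2.10 on one exchange and "away" at 2.05 on another.
print(find_arbitrage({"home": 2.10, "away": 2.05}))
```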

September 20, 2024 · 2 min · Research Team

Evaluating Investment Risks in LATAM AI Startups: Ranking of Investment Potential and Framework for Valuation

ArXiv ID: 2410.03552 · View on arXiv · Authors: Unknown
Abstract: The growth of the tech startup ecosystem in Latin America (LATAM) is driven by innovative entrepreneurs addressing market needs across various sectors. However, these startups encounter unique challenges and risks that require specific management approaches. This paper explores a case study with the Total Addressable Market (TAM), Serviceable Available Market (SAM), and Serviceable Obtainable Market (SOM) metrics within the context of the online food delivery industry in LATAM, serving as a model for valuing startups using the Discounted Cash Flow (DCF) method. By analyzing key emerging powers such as Argentina, Colombia, Uruguay, Costa Rica, Panama, and Ecuador, the study highlights the potential and profitability of AI-driven startups in the region through the development of a ranking of emerging powers in Latin America for tech startup investment. The paper also examines the political, economic, and competitive risks faced by startups and offers strategic insights on mitigating these risks to maximize investment returns. Furthermore, the research underscores the value of diversifying investment portfolios with startups in emerging markets, emphasizing the opportunities for substantial growth and returns despite inherent risks. ...
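
A minimal sketch of how a TAM/SAM/SOM funnel can feed a DCF valuation of the kind the abstract refers to; all market sizes, margins, and rates below are hypothetical placeholders, not figures from the study.

```python
# Sketch of a TAM -> SAM -> SOM funnel feeding a Discounted Cash Flow valuation.
# Every numeric input is an illustrative assumption.

def market_funnel(tam, sam_share, som_share):
    """Narrow the Total Addressable Market to the obtainable slice."""
    sam = tam * sam_share                  # Serviceable Available Market
    som = sam * som_share                  # Serviceable Obtainable Market
    return sam, som

def dcf_value(free_cash_flows, discount_rate, terminal_growth):
    """Present value of explicit cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(free_cash_flows, 1))
    terminal = free_cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv_terminal = terminal / (1 + discount_rate) ** len(free_cash_flows)
    return pv + pv_terminal

sam, som = market_funnel(tam=5e9, sam_share=0.30, som_share=0.10)   # USD, hypothetical
cash_flows = [som * m for m in (0.02, 0.03, 0.04, 0.05, 0.06)]      # 5-year margin ramp
print(round(dcf_value(cash_flows, discount_rate=0.18, terminal_growth=0.03)))
```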

September 17, 2024 · 2 min · Research Team

Multi-Industry Simplex 2.0: Temporally-Evolving Probabilistic Industry Classification

ArXiv ID: 2407.16437 · View on arXiv · Authors: Unknown
Abstract: Accurate industry classification is critical for many areas of portfolio management, yet the traditional single-industry framework of the Global Industry Classification Standard (GICS) struggles to comprehensively represent risk for highly diversified multi-sector conglomerates like Amazon. Previously, we introduced the Multi-Industry Simplex (MIS), a probabilistic extension of GICS that utilizes topic modeling, a natural language processing approach. Although our initial version, MIS-1, was able to improve upon GICS by providing multi-industry representations, it relied on an overly simple architecture that required prior knowledge of the number of industries and rested on the unrealistic assumption that industries are uncorrelated and independent over time. We improve upon this model with MIS-2, which addresses three key limitations of MIS-1: we utilize Bayesian non-parametrics to automatically infer the number of industries from data, we employ Markov updating to account for industries that change over time, and we adjust for correlated and hierarchical industries, allowing for both broad and niche industries (similar to GICS). Further, we provide an out-of-sample test directly comparing MIS-2 and GICS on the basis of future correlation prediction, where we find evidence that MIS-2 provides a measurable improvement over GICS. MIS-2 provides portfolio managers with a more robust tool for industry classification, empowering them to more effectively identify and manage risk, particularly around multi-sector conglomerates in a rapidly evolving market in which new industries periodically emerge. ...
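
The "Markov updating" idea can be illustrated with a toy sketch in which a firm's industry mixture (a point on the probability simplex) is blended with newly inferred evidence each period. The topic-model and Bayesian non-parametric machinery of MIS-2 itself is not reproduced, and `persistence` is a hypothetical smoothing parameter.

```python
# Toy sketch of Markov updating of a firm's industry mixture on the simplex.
import numpy as np

def markov_update(prev_weights, new_evidence, persistence=0.8):
    """Blend last period's industry mixture with this period's inferred mixture,
    then renormalize so the result stays on the probability simplex."""
    blended = persistence * np.asarray(prev_weights) + (1 - persistence) * np.asarray(new_evidence)
    return blended / blended.sum()

# A conglomerate drifting from pure retail toward cloud computing over two periods.
w = np.array([0.9, 0.1, 0.0])                        # [retail, cloud, advertising]
for evidence in ([0.5, 0.4, 0.1], [0.3, 0.6, 0.1]):
    w = markov_update(w, evidence)
    print(np.round(w, 3))
```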

July 23, 2024 · 2 min · Research Team

Low Volatility Stock Portfolio Through High Dimensional Bayesian Cointegration

ArXiv ID: 2407.10175 · View on arXiv · Authors: Unknown
Abstract: We employ a Bayesian modelling technique for high dimensional cointegration estimation to construct low volatility portfolios from a large number of stocks. The proposed Bayesian framework effectively identifies sparse and important cointegration relationships amongst large baskets of stocks across various asset spaces, resulting in portfolios with reduced volatility. Such cointegration relationships persist well over the out-of-sample testing time, providing practical benefits in portfolio construction and optimization. Further studies on drawdown and volatility minimization also highlight the benefits of including cointegrated portfolios as risk management instruments. ...
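
A sketch of the basic cointegration building block such portfolios rest on, using an Engle-Granger test on synthetic prices; the paper's Bayesian, sparse, high-dimensional estimation is not reproduced here.

```python
# Test a synthetic pair of prices for cointegration and form the low-volatility spread.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
common = np.cumsum(rng.normal(size=1000))             # shared stochastic trend
x = common + rng.normal(scale=0.5, size=1000)
y = 0.8 * common + rng.normal(scale=0.5, size=1000)

t_stat, p_value, _ = coint(y, x)                      # Engle-Granger cointegration test
beta = sm.OLS(y, sm.add_constant(x)).fit().params[1]  # hedge ratio
spread = y - beta * x                                 # mean-reverting, low-volatility combo
print(f"p-value={p_value:.3f}, spread vol={spread.std():.2f} vs y vol={y.std():.2f}")
```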

July 14, 2024 · 2 min · Research Team

Markowitz Meets Bellman: Knowledge-distilled Reinforcement Learning for Portfolio Management

ArXiv ID: 2405.05449 · View on arXiv · Authors: Unknown
Abstract: Investment portfolios, central to finance, balance potential returns and risks. This paper introduces a hybrid approach combining Markowitz's portfolio theory with reinforcement learning, utilizing knowledge distillation for training agents. In particular, our proposed method, called KDD (Knowledge Distillation DDPG), consists of two training stages: a supervised stage and a reinforcement learning stage. The trained agents optimize portfolio assembly. A comparative analysis against standard financial models and AI frameworks, using metrics such as returns, the Sharpe ratio, and nine evaluation indices, reveals our model's superiority. It notably achieves the highest yield and a Sharpe ratio of 2.03, ensuring top profitability with the lowest risk in comparable return scenarios. ...
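
A sketch of the "teacher" side of the knowledge-distillation stage: classic Markowitz tangency weights that a student policy network could be trained to imitate before the DDPG stage. The network architecture and the RL stage are omitted, and the gross-exposure normalization below is an assumption rather than the paper's choice.

```python
# Markowitz "teacher" targets and a simple distillation loss for the supervised stage.
import numpy as np

def markowitz_teacher(returns):
    """Tangency-style weights from a window of asset returns, normalized by gross exposure."""
    mu = returns.mean(axis=0)
    cov = np.cov(returns, rowvar=False)
    raw = np.linalg.solve(cov, mu)                # proportional to inverse-covariance times mean
    return raw / np.abs(raw).sum()

def distillation_loss(student_weights, teacher_weights):
    """Supervised-stage objective: mean squared error to the Markowitz targets."""
    return float(np.mean((np.asarray(student_weights) - teacher_weights) ** 2))

rng = np.random.default_rng(1)
window = rng.normal(loc=0.0005, scale=0.01, size=(250, 4))   # 4 assets, ~1 year of daily returns
teacher = markowitz_teacher(window)
print(teacher, distillation_loss([0.25, 0.25, 0.25, 0.25], teacher))
```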

May 8, 2024 · 2 min · Research Team

Analyzing Economic Convergence Across the Americas: A Survival Analysis Approach to GDP per Capita Trajectories

ArXiv ID: 2404.04282 · View on arXiv · Authors: Unknown
Abstract: By integrating survival analysis, machine learning algorithms, and economic interpretation, this research examines the temporal dynamics associated with attaining a 5 percent rise in purchasing-power-parity-adjusted GDP per capita over a period of 120 months (2013-2022). A comparative investigation reveals that DeepSurv is proficient at capturing non-linear interactions, although standard models exhibit comparable performance under certain circumstances. The weight matrix evaluates the economic ramifications of vulnerabilities, risks, and capacities. To meet the GDP-per-capita objective, the findings emphasize the need for a balanced approach to risk-taking, strategic vulnerability reduction, and investment in governmental capacities and social cohesiveness. Policy guidelines promote individualized approaches that take into account the complex dynamics at play when making decisions. ...
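
A sketch of the standard survival-analysis baseline in this setting: a Cox proportional-hazards fit where the "event" is attaining the 5 percent GDP-per-capita rise within the 120-month window. The data and column names below are synthetic and illustrative, and the DeepSurv comparison is not reproduced.

```python
# Cox proportional-hazards baseline on synthetic time-to-growth-target data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "months_to_target": rng.integers(6, 121, size=n),   # time until the 5% GDPpc rise (or censoring)
    "attained": rng.integers(0, 2, size=n),             # 1 = target reached (event observed)
    "vulnerability": rng.normal(size=n),                 # example covariates
    "state_capacity": rng.normal(size=n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_target", event_col="attained")
cph.print_summary()
```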

April 3, 2024 · 2 min · Research Team

Anti-correlation network among China A-shares

ArXiv ID: 2404.00028 · View on arXiv · Authors: Unknown
Abstract: Correlation-based financial networks have been studied intensively. However, previous studies ignored the importance of anti-correlation. This paper is the first to consider anti-correlation and positive correlation separately, and accordingly constructs weighted temporal anti-correlation and positive-correlation networks among stocks listed on the Shanghai and Shenzhen stock exchanges. For both types of networks during the first 24 years of this century, fundamental topological measurements are analyzed systematically. This paper unveils some essential differences in these topological measurements between the anti-correlation and positive-correlation networks. It also observes an asymmetry effect between stock market declines and rises. The methodology proposed in this paper has the potential to reveal significant differences in the topological structure and dynamics of a complex financial system, stock behavior, investment portfolios, and risk management, offering insights that are not visible when all correlations are considered together. More importantly, this paper proposes a new direction for studying complex systems: the anti-correlation network. It is well worth re-examining previous relevant studies using this new methodology. ...
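
The construction the abstract describes can be sketched directly: compute the return correlation matrix, then build separate weighted graphs from the negative and positive entries. Synthetic returns stand in for the A-share data used in the paper.

```python
# Build separate positive-correlation and anti-correlation networks from returns.
import numpy as np
import pandas as pd
import networkx as nx

rng = np.random.default_rng(0)
returns = pd.DataFrame(rng.normal(size=(500, 6)),
                       columns=[f"stock_{i}" for i in range(6)])
corr = returns.corr()

G_pos, G_neg = nx.Graph(), nx.Graph()
for i in corr.columns:
    for j in corr.columns:
        if i < j:
            c = corr.loc[i, j]
            if c > 0:
                G_pos.add_edge(i, j, weight=c)           # positive-correlation network
            elif c < 0:
                G_neg.add_edge(i, j, weight=abs(c))      # anti-correlation network

print(G_pos.number_of_edges(), G_neg.number_of_edges())
```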

March 21, 2024 · 2 min · Research Team

Utilizing the LightGBM Algorithm for Operator User Credit Assessment Research

ArXiv ID: 2403.14483 · View on arXiv · Authors: Unknown
Abstract: Mobile Internet user credit assessment is an important way for communication operators to establish decisions and formulate measures, and it is also a guarantee for operators to obtain expected benefits. However, credit evaluation methods have long been monopolized by financial institutions such as banks and credit agencies. As supporters and providers of platform network technology and network resources, communication operators are also builders and maintainers of communication networks, and their Internet data can improve user credit evaluation strategies. This paper uses the massive data provided by communication operators to carry out research on an operator user credit evaluation model based on a fused LightGBM algorithm. First, key features are extracted from the massive operator-provided user evaluation data through preprocessing and feature engineering, and a statistically meaningful multi-dimensional feature set is constructed; then, linear regression, decision trees, LightGBM, and other machine learning algorithms are used to build multiple base models and identify the best one; finally, Averaging, Voting, Blending, Stacking, and other ensemble methods are integrated to refine multiple fusion models and establish the most suitable fusion model for operator user evaluation. ...
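
A sketch of the model-fusion step, combining LightGBM with simpler base learners in a stacking ensemble; the operator dataset is proprietary, so a synthetic classification problem stands in for it, and the hyperparameters are illustrative.

```python
# LightGBM plus simpler base learners fused through a stacking ensemble.
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("lgbm", LGBMClassifier(n_estimators=200, random_state=0)),
        ("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
    ],
    final_estimator=LogisticRegression(),                # meta-learner for the fusion step
)
print(cross_val_score(stack, X, y, cv=3, scoring="roc_auc").mean())
```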

March 21, 2024 · 2 min · Research Team

Optimizing Neural Networks for Bermudan Option Pricing: Convergence Acceleration, Future Exposure Evaluation and Interpolation in Counterparty Credit Risk

ArXiv ID: 2402.15936 · View on arXiv · Authors: Unknown
Abstract: This paper presents a Monte-Carlo-based artificial neural network framework for pricing Bermudan options, offering several notable advantages. These advantages encompass the efficient static hedging of the target Bermudan option and the effective generation of exposure profiles for risk management. We also introduce a novel optimisation algorithm designed to expedite the convergence of the neural network framework proposed by Lokeshwar et al. (2022), supported by a comprehensive error convergence analysis. We conduct an extensive comparative analysis of the Present Value (PV) distribution under Markovian and no-arbitrage assumptions. We compare the proposed neural network model in conjunction with the approach initially introduced by Longstaff and Schwartz (2001) and benchmark it against the COS model, the pricing model pioneered by Fang and Oosterlee (2009), across all Bermudan exercise time points. Additionally, we evaluate exposure profiles, including Expected Exposure and Potential Future Exposure, generated by our proposed model and the Longstaff-Schwartz model, comparing them against the COS model. We also derive exposure profiles at finer non-standard grid points or risk horizons using the proposed approach, juxtaposed with the Longstaff-Schwartz method with linear interpolation, and benchmark against the COS method. In addition, we explore the effectiveness of various interpolation schemes within the context of the Longstaff-Schwartz method for generating exposures at finer grid horizons. ...
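
A sketch of the Longstaff-Schwartz (2001) least-squares Monte Carlo baseline that the paper benchmarks against: price a Bermudan put by regressing continuation values on a polynomial basis at each exercise date. Parameters are illustrative, and the neural-network and COS components are not reproduced.

```python
# Least-squares Monte Carlo (Longstaff-Schwartz) pricing of a Bermudan put.
import numpy as np

def bermudan_put_lsmc(s0=100.0, strike=100.0, r=0.05, sigma=0.2,
                      maturity=1.0, n_exercise=10, n_paths=50_000, seed=0):
    rng = np.random.default_rng(seed)
    dt = maturity / n_exercise
    # Simulate geometric Brownian motion at the exercise dates.
    z = rng.standard_normal((n_paths, n_exercise))
    paths = s0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

    cashflow = np.maximum(strike - paths[:, -1], 0.0)           # payoff at maturity
    for t in range(n_exercise - 2, -1, -1):
        cashflow *= np.exp(-r * dt)                             # discount one step back
        s = paths[:, t]
        itm = strike - s > 0                                    # regress only in-the-money paths
        if itm.sum() > 0:
            coeffs = np.polyfit(s[itm], cashflow[itm], deg=2)   # continuation-value regression
            continuation = np.polyval(coeffs, s[itm])
            exercise = strike - s[itm]
            cashflow[itm] = np.where(exercise > continuation, exercise, cashflow[itm])
    return float(np.mean(cashflow) * np.exp(-r * dt))           # discount first exercise date to today

print(round(bermudan_put_lsmc(), 3))
```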

February 24, 2024 · 2 min · Research Team