
SoK: Stablecoins for Digital Transformation -- Design, Metrics, and Application with Real World Asset Tokenization as a Case Study

ArXiv ID: 2508.02403 "View on arXiv" Authors: Luyao Zhang Abstract Stablecoins have become a foundational component of the digital asset ecosystem, with their market capitalization exceeding 230 billion USD as of May 2025. As fiat-referenced and programmable assets, stablecoins provide low-latency, globally interoperable infrastructure for payments, decentralized finance (DeFi), and tokenized commerce. Their accelerated adoption has prompted extensive regulatory engagement, exemplified by the European Union's Markets in Crypto-assets Regulation (MiCA), the US Guiding and Establishing National Innovation for US Stablecoins (GENIUS) Act, and Hong Kong's Stablecoins Bill. Despite this momentum, academic research remains fragmented across economics, law, and computer science, and lacks a unified framework for design, evaluation, and application. This study addresses that gap through a multi-method research design. First, it synthesizes cross-disciplinary literature to construct a taxonomy of stablecoin systems based on custodial structure, stabilization mechanism, and governance. Second, it develops a performance evaluation framework tailored to diverse stakeholder needs, supported by an open-source benchmarking pipeline to ensure transparency and reproducibility. Third, a case study on Real World Asset (RWA) tokenization illustrates how stablecoins operate as programmable monetary infrastructure in cross-border digital systems. By integrating conceptual theory with empirical tools, the paper contributes: a unified taxonomy for stablecoin design; a stakeholder-oriented performance evaluation framework; an empirical case linking stablecoins to sectoral transformation; and reproducible methods and datasets to inform future research. These contributions support the development of trusted, inclusive, and transparent digital monetary infrastructure. ...
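To make the idea of a stakeholder-oriented benchmarking pipeline concrete, here is a minimal sketch of one peg-stability metric such a pipeline might report. This is an illustration of the general technique, not the paper's released code; the function name, sample prices, and basis-point units are assumptions of this sketch.

```python
# Illustrative sketch (not the paper's pipeline): a peg-stability metric
# a stablecoin benchmarking tool might compute from a price series.

def peg_deviation_stats(prices, peg=1.0):
    """Return mean and max absolute deviation from the peg, in basis points."""
    devs = [abs(p - peg) / peg * 10_000 for p in prices]
    return {
        "mean_bps": sum(devs) / len(devs),
        "max_bps": max(devs),
    }

# Hypothetical daily closes for a fiat-referenced stablecoin.
sample = [1.0002, 0.9991, 1.0005, 0.9987, 1.0001]
stats = peg_deviation_stats(sample)  # mean ~6 bps, worst day ~13 bps
```

Reporting deviations in basis points lets the same metric compare coins with different pegs, which is the kind of cross-design comparability a unified evaluation framework needs.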

August 4, 2025 · 2 min · Research Team

Tokenize Everything, But Can You Sell It? RWA Liquidity Challenges and the Road Ahead

ArXiv ID: 2508.11651 "View on arXiv" Authors: Rischan Mafrur Abstract The tokenization of real-world assets (RWAs) promises to transform financial markets by enabling fractional ownership, global accessibility, and programmable settlement of traditionally illiquid assets such as real estate, private credit, and government bonds. While technical progress has been rapid, with over $25 billion in tokenized RWAs brought on-chain as of 2025, liquidity remains a critical bottleneck. This paper investigates the gap between tokenization and tradability, drawing on recent academic research and market data from platforms such as RWA.xyz. We document that most RWA tokens exhibit low trading volumes, long holding periods, and limited investor participation, despite their potential for 24/7 global markets. Through case studies of tokenized real estate, private credit, and tokenized treasury funds, we present empirical liquidity observations that reveal low transfer activity, limited active address counts, and minimal secondary trading for most tokenized asset classes. Next, we categorize the structural barriers to liquidity, including regulatory gating, custodial concentration, whitelisting, valuation opacity, and the lack of decentralized trading venues. Finally, we propose actionable pathways to improve liquidity, ranging from hybrid market structures and collateral-based liquidity to transparency enhancements and compliance innovation. Our findings contribute to the growing discourse on digital asset market microstructure and highlight that realizing the liquidity potential of RWAs requires coordinated progress across legal, technical, and institutional domains. ...

August 3, 2025 · 2 min · Research Team

Generative AI for End-to-End Limit Order Book Modelling: A Token-Level Autoregressive Generative Model of Message Flow Using a Deep State Space Network

ArXiv ID: 2309.00638 "View on arXiv" Authors: Unknown Abstract Developing a generative model of realistic order flow in financial markets is a challenging open problem, with numerous applications for market participants. Addressing this, we propose the first end-to-end autoregressive generative model that generates tokenized limit order book (LOB) messages. These messages are interpreted by a Jax-LOB simulator, which updates the LOB state. To handle long sequences efficiently, the model employs simplified structured state-space layers to process sequences of order book states and tokenized messages. Using LOBSTER data of NASDAQ equity LOBs, we develop a custom tokenizer for message data, converting groups of successive digits to tokens, similar to tokenization in large language models. Out-of-sample results show promising performance in approximating the data distribution, as evidenced by low model perplexity. Furthermore, the mid-price returns calculated from the generated order flow exhibit a significant correlation with the data, indicating impressive conditional forecast performance. Due to the granularity of the generated data and the accuracy of the model, it offers new application areas for future work beyond forecasting, e.g., acting as a world model in high-frequency financial reinforcement learning applications. Overall, our results invite the use and extension of the model in the direction of autoregressive large financial models for the generation of high-frequency financial data, and we commit to open-sourcing our code to facilitate future research. ...
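The abstract's core preprocessing idea, converting groups of successive digits in numeric message fields to tokens, can be sketched as follows. This is an illustration in the spirit of the description, not the authors' exact tokenizer; the field width, group size, and function names are assumptions of this sketch.

```python
# Illustrative sketch of digit-group tokenization for LOB message fields,
# in the spirit described in the abstract (not the authors' exact scheme).

def tokenize_number(value: int, width: int = 9, group: int = 3):
    """Zero-pad `value` to `width` digits and emit one token per `group` digits.

    Each token is the integer value of a digit group, so numeric fields of
    any size share one small vocabulary (000-999 for 3-digit groups),
    similar in effect to subword tokenization in large language models.
    """
    assert width % group == 0
    digits = f"{value:0{width}d}"
    return [int(digits[i:i + group]) for i in range(0, width, group)]

def detokenize_number(tokens, group: int = 3) -> int:
    """Invert tokenize_number by rejoining zero-padded digit groups."""
    return int("".join(f"{t:0{group}d}" for t in tokens))

# A hypothetical 9-digit price field becomes three tokens.
toks = tokenize_number(123456789)  # [123, 456, 789]
```

Fixed-width grouping keeps the sequence length per message constant, which matters when the downstream state-space model processes long streams of messages.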

August 23, 2023 · 2 min · Research Team

Decentralized Token Economy Theory (DeTEcT)

ArXiv ID: 2309.12330 "View on arXiv" Authors: Unknown Abstract This paper presents a pioneering approach to the simulation of economic activity, policy implementation, and pricing of goods in token economies. The paper proposes a formal analysis framework for wealth distribution analysis and simulation of interactions between economic participants in an economy. Using this framework, we define a mechanism for identifying prices that achieve the desired wealth distribution according to some metric, and for assessing the stability of economic dynamics. The motivation to study tokenomics theory is the increasing use of tokenization, specifically in financial infrastructures, where designing token economies is at the forefront. Tokenomics theory establishes a quantitative framework for wealth distribution amongst economic participants and implements an algorithmic regulatory control mechanism that reacts to changes in economic conditions. In our framework, we introduce the concept of a tokenomic taxonomy, in which agents in the economy are categorized into agent types along with the interactions between them. This novel approach is motivated by having a generalized model of the macroeconomy with controls implemented through interactions and policies. The existence of such controls allows us to measure and readjust the wealth dynamics in the economy to suit the desired objectives. ...
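The abstract's notion of agent types whose interactions redistribute wealth under policy controls can be illustrated with a toy simulation step. This is a sketch under assumptions of my own (the agent-type names, spend rate, and fee rate are invented for illustration), not the paper's formal framework.

```python
# Toy illustration (not DeTEcT's formal model): three agent types exchange
# tokens through a typed interaction, with a fee acting as a simple policy
# control that redirects part of each flow to a treasury.

def step(wealth, spend_rate=0.1, fee_rate=0.02):
    """One interaction round: consumers spend, merchants earn, treasury taxes.

    Total wealth is conserved; the controls (spend_rate, fee_rate) shape
    how it is redistributed across agent types.
    """
    spend = wealth["consumers"] * spend_rate
    fee = spend * fee_rate
    new = dict(wealth)
    new["consumers"] -= spend
    new["merchants"] += spend - fee
    new["treasury"] += fee
    return new

initial = {"consumers": 1000.0, "merchants": 500.0, "treasury": 200.0}
after = step(initial)  # {'consumers': 900.0, 'merchants': 598.0, 'treasury': 202.0}
```

Iterating `step` and adjusting `fee_rate` toward a target treasury share is the flavor of algorithmic regulatory control the abstract describes: measure the resulting distribution, then tune the interaction parameters.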

August 15, 2023 · 2 min · Research Team