Cryptocurrency Volatility Forecasting Using Transformer-Based Deep Learning Models and On-Chain Metrics
DOI: https://doi.org/10.32996/jefas.2024.6.1.12

Keywords: Cryptocurrency; Volatility Forecasting; Transformer; Deep Learning; Bitcoin; Ethereum; On-Chain Metrics; Attention Mechanism; Financial Time Series; Blockchain Analytics

Abstract
Cryptocurrencies have emerged as highly dynamic digital assets, characterized by extreme price volatility driven by both speculative behavior and network-level activity. Traditional volatility forecasting methods, including GARCH and LSTM-based models, often fall short in capturing the complex, nonlinear temporal dependencies inherent in crypto markets. This paper proposes a deep learning framework that leverages Transformer-based architectures, originally designed for natural language processing, to forecast short-term cryptocurrency volatility with enhanced accuracy and temporal sensitivity. By incorporating a comprehensive set of on-chain metrics (such as transaction volume, wallet activity, miner behavior, and token circulation), the model captures both market sentiment and blockchain-level dynamics. The proposed Transformer model is benchmarked against LSTM and GRU networks using Bitcoin and Ethereum datasets spanning multiple market cycles. Experimental results show that the Transformer-based model outperforms recurrent architectures in predicting both realized and implied volatility, particularly during high-turbulence periods. These findings suggest that attention mechanisms, combined with on-chain data, provide a powerful tool for managing risk and making informed decisions in the rapidly evolving digital asset ecosystem.
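The abstract does not include the paper's implementation, but the core of the approach it describes is scaled dot-product self-attention over a window of multivariate time-series features. A minimal NumPy sketch follows; the feature layout (log return plus three on-chain metrics) and the single-head, projection-free setup are illustrative assumptions, not the authors' actual architecture:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Subtract the row max before exponentiating for numerical stability.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy window: 5 time steps x 4 features (hypothetically: log return,
# transaction volume, active wallets, miner outflow).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))

# Single-head self-attention; the learned Q/K/V projections and the
# feed-forward, multi-head, and positional-encoding layers of a full
# Transformer encoder are omitted for brevity.
out, w = scaled_dot_product_attention(X, X, X)
```

Each row of `w` is a probability distribution over the time steps, which is what lets the model weight, say, a miner-outflow spike several steps back more heavily than the most recent observation, rather than processing the window strictly sequentially as an LSTM or GRU does.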
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
