Transfer learning model for cash-instrument prediction adopting a Transformer derivative

Abstract

Investors aiming for high market returns must accurately predict the prices of various cash instruments. However, making accurate predictions is challenging due to the complex cyclic and trending characteristics of markets, which exhibit high volatility and unpredictable fluctuations. Furthermore, many studies overlook how interactions between different markets affect price movements. To address these problems, this research introduces a deep transfer-learning approach derived from the Transformer model, named the rotary-positional encoding autocorrelation Transformer (RAT). Unlike traditional methods, the RAT employs autocorrelation instead of self-attention to more effectively capture periodic features, while rotary-positional encoding preserves both the absolute and relative positioning within sequences to enhance trend understanding. Through transfer learning, the RAT model extracts deep features from a source domain and applies them to a target domain, demonstrating superior performance over LSTM, CNN-LSTM, gated recurrent unit (GRU), and Transformer models in multi-day predictions across 12 cash-instrument datasets. It achieved a substantial increase in accuracy, with a 35.83% reduction in mean squared error (MSE), a 23.95% reduction in mean absolute error (MAE), and a 32.63% increase in the coefficient of determination (R²). This study validates the RAT model's effectiveness in predicting financial instrument prices.
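The two architectural ingredients named in the abstract, rotary positional encoding and autocorrelation in place of self-attention, can be illustrated with a short sketch. The snippet below is not the authors' implementation: the split-half RoPE variant, the FFT-based lag scores (in the style of Autoformer-type autocorrelation blocks), and all tensor shapes and helper names (rotary_encode, autocorrelation_scores) are illustrative assumptions.

```python
# Illustrative sketch only -- not the RAT authors' code. Shapes, helper
# names, and the split-half RoPE variant are assumptions made for clarity.
import torch


def rotary_encode(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary positional encoding (split-half variant) to a
    (batch, seq_len, dim) tensor with an even feature dimension.

    Each position receives a unique rotation (absolute information),
    while dot products between encoded vectors depend on the offset
    between positions (relative information).
    """
    _, seq_len, dim = x.shape
    half = dim // 2
    # One rotation frequency per feature pair.
    freqs = base ** (-torch.arange(half, dtype=torch.float32) / half)
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * freqs[None, :]
    cos, sin = torch.cos(angles)[None], torch.sin(angles)[None]  # (1, L, half)
    x1, x2 = x[..., :half], x[..., half:]
    # 2-D rotation of each (x1, x2) feature pair by its position-dependent angle.
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)


def autocorrelation_scores(q: torch.Tensor, k: torch.Tensor) -> torch.Tensor:
    """Lag-wise similarity between query and key series, computed in the
    frequency domain (Wiener-Khinchin). This is the kind of period-based
    score an autocorrelation block uses in place of dot-product
    self-attention; peaks indicate dominant cycle lengths.
    """
    seq_len = q.shape[1]
    q_fft = torch.fft.rfft(q, dim=1)
    k_fft = torch.fft.rfft(k, dim=1)
    corr = torch.fft.irfft(q_fft * torch.conj(k_fft), n=seq_len, dim=1)
    return corr.mean(dim=-1)  # (batch, seq_len): one score per time lag


# Example: 8 price windows of 96 time steps with a 64-dimensional embedding.
q = rotary_encode(torch.randn(8, 96, 64))
k = rotary_encode(torch.randn(8, 96, 64))
print(autocorrelation_scores(q, k).shape)  # torch.Size([8, 96])
```

In a full model the lag scores would be aggregated further, for example by rolling the value series at the top-scoring lags, rather than used directly; the sketch stops at the scores themselves.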
