Transfer learning model for cash-instrument prediction adopting a Transformer derivative

dc.contributor.author: Ling Feng
dc.contributor.author: Ananta Sinchai
dc.date.accessioned: 2025-07-21T06:10:55Z
dc.date.issued: 2024-03-01
dc.description.abstract: Investors aiming for high market returns must accurately predict the prices of various cash instruments. However, making accurate predictions is challenging due to the complex cyclic and trending characteristics of markets, which are marked by high volatility and unpredictable fluctuations. Furthermore, many studies overlook how interactions between different markets affect price movements. To address these problems, this research introduces a deep transfer-learning approach derived from the Transformer model, named the rotary-positional encoding autocorrelation Transformer (RAT). Unlike traditional methods, the RAT employs autocorrelation instead of self-attention to more effectively capture periodic features, while rotary-positional encoding preserves both the absolute and relative positioning within sequences to enhance trend understanding. Through transfer learning, the RAT model extracts deep features from a source domain and applies them to a target domain, demonstrating superior performance over LSTM, CNN-LSTM, gated recurrent unit (GRU), and Transformer models in multi-day predictions across 12 cash-instrument datasets. It achieved a substantial increase in accuracy, with a 35.83% reduction in mean squared error (MSE), a 23.95% reduction in mean absolute error (MAE), and a 32.63% increase in the coefficient of determination (R2). This study validates the RAT model's effectiveness in predicting financial instrument prices.
dc.identifier.doi: 10.1016/j.jksuci.2024.102000
dc.identifier.uri: https://dspace.kmitl.ac.th/handle/123456789/13359
dc.subject: Mean absolute error
dc.subject.classification: Stock Market Forecasting Methods
dc.title: Transfer learning model for cash-instrument prediction adopting a Transformer derivative
dc.type: Article
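The abstract's rotary-positional encoding component can be illustrated with a generic sketch. Below is a minimal NumPy implementation of rotary positional encoding (RoPE) as commonly formulated, rotating each consecutive pair of features by a position-dependent angle so that dot products between encoded vectors depend on relative position while each vector still carries its absolute position. This is an assumption-based illustration, not the RAT model's actual code; the function name and the base frequency 10000 are generic choices, not taken from the paper.

```python
import numpy as np

def rotary_positional_encoding(x):
    """Apply a generic rotary positional encoding to a sequence.

    x: array of shape (seq_len, d_model), with d_model even.
    Returns an array of the same shape in which the (2i, 2i+1)
    feature pairs at position p are rotated by angle p / 10000^(2i/d_model).
    """
    seq_len, d_model = x.shape
    assert d_model % 2 == 0, "d_model must be even for pairwise rotation"
    # One frequency per feature pair, decreasing geometrically.
    inv_freq = 1.0 / (10000 ** (np.arange(0, d_model, 2) / d_model))
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    angles = pos * inv_freq[None, :]           # (seq_len, d_model // 2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]            # even / odd halves of each pair
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin         # 2D rotation of each pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because the transform is a pure rotation of each feature pair, it preserves vector norms and leaves position 0 unchanged (angle zero), which is why relative offsets between positions survive inside attention-style dot products.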
