Deep context-attentive transformer transfer learning for financial forecasting
| dc.contributor.author | Ling Feng | |
| dc.contributor.author | Ananta Sinchai | |
| dc.date.accessioned | 2026-05-08T19:17:08Z | |
| dc.date.issued | 2025-06-30 | |
| dc.description.abstract | […] of 0.9094. A Wilcoxon signed-rank test confirms statistically significant gains in non-transfer learning scenarios at the 0.05 level. Transfer learning experiments likewise reveal statistically significant improvements, reinforcing the feasibility of cross-market knowledge transfer. An ablation study highlights the impact of architectural refinements and rotary positional encoding, while prediction horizon analysis confirms stable forecasting performance. These results establish 2CAT as a robust financial forecasting framework adaptable to diverse market conditions. | |
| dc.identifier.doi | 10.7717/peerj-cs.2983 | |
| dc.identifier.uri | https://dspace.kmitl.ac.th/handle/123456789/15882 | |
| dc.publisher | PeerJ Computer Science | |
| dc.subject | Stock Market Forecasting Methods | |
| dc.subject | Time Series Analysis and Forecasting | |
| dc.subject | Energy Load and Power Forecasting | |
| dc.title | Deep context-attentive transformer transfer learning for financial forecasting | |
| dc.type | Article | |