Transformer neural network architectures now underpin many state-of-the-art models for time-series forecasting. A key component of modern transformers is the attention mechanism, which gives a model a means to learn and encode the relative dependencies of elements in a sequence. This study explores the potential of probabilistic Signal Diffusion Mapping (SDM) as an attention mechanism specifically for forecasting financial data. With linear asymptotic complexity, this implementation of the SDM algorithm aims to combat the challenges of lag-length distortion in financial data by operationalizing general propositions about scedasticity while calculating time-varying optimal lag-length relationships.
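For context, the standard attention mechanism the abstract refers to can be sketched as scaled dot-product attention (Vaswani et al., 2017). This is a generic illustration only, not the SDM mechanism itself: the repository's SDM attention would replace the quadratic pairwise scoring step shown here, and no function or parameter names below are taken from the project.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Standard scaled dot-product attention, for illustration.

    q, k, v: arrays of shape (batch, seq_len, d_model).
    Note the (seq_len x seq_len) score matrix: this pairwise scoring
    is the quadratic-cost step that a linear-complexity mechanism
    such as the SDM attention described above would replace.
    """
    d_k = q.shape[-1]
    # pairwise dependency scores between sequence positions
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # weighted combination of values
    return weights @ v

# Toy self-attention usage: batch of 1, sequence length 4, model dim 8
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 4, 8))
out = scaled_dot_product_attention(x, x, x)
```

The output has the same shape as the input sequence, with each position rewritten as an attention-weighted mixture of all positions.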
Repository: sheplecjs/sdm_transformer
License: MIT License