Seasonal Decomposition and Trend using Hybrid STL-FNN with Application

Authors

  • Zena A. Sultan, Nihad S. Khalaf Aljboori

Keywords:

Decomposition; STL model; FNN; Loess smoother; STL-FNN structure

Abstract

Most research in time series forecasting follows a single approach, relying on classical time series models. In recent years, models known as neural networks, which imitate human nerve cells, have been applied widely in engineering, economics, and health. This study proposes a hybrid algorithm called STL-FNN, which combines two methods: seasonal-trend decomposition using Loess (STL) and the feed-forward neural network (FNN). The STL method decomposes the original series into three sub-series: seasonal, trend, and residual components. A feed-forward neural network then forecasts each of the three sub-series separately, and the predicted outputs are combined to produce the forecast of the full series. The performance of the hybrid STL-FNN model is assessed using the mean absolute error (MAE) criterion. To test the hybrid model's predictive ability, we analyzed real data: the average monthly spending of foreign visitors in the United Kingdom from January 1986 to February 2020.
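As a rough illustration of the workflow the abstract describes, the Python sketch below decomposes a monthly series with STL, fits a small feed-forward network to each sub-series, and sums the component forecasts. The synthetic data, the 12-lag input window, and the use of statsmodels' STL with scikit-learn's MLPRegressor are assumptions for illustration only, not the paper's implementation.

import numpy as np
from statsmodels.tsa.seasonal import STL
from sklearn.neural_network import MLPRegressor

# Synthetic monthly series standing in for the UK visitor-spending data
# (assumption: trend + annual seasonality + noise).
rng = np.random.default_rng(0)
n = 410                       # roughly the Jan 1986 - Feb 2020 span of monthly values
t = np.arange(n)
y = 10 + 0.05 * t + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, n)

# 1) STL splits the series into trend, seasonal, and residual sub-series.
stl = STL(y, period=12).fit()
components = {"trend": stl.trend, "seasonal": stl.seasonal, "resid": stl.resid}

def lagged_matrix(series, lags=12):
    # Each target value is predicted from its previous `lags` observations.
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    return X, series[lags:]

# 2) Fit one small feed-forward network per sub-series, holding out the last
#    observation as a one-step test case, and forecast it.
forecast = 0.0
for name, comp in components.items():
    X, target = lagged_matrix(comp)
    fnn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    fnn.fit(X[:-1], target[:-1])
    forecast += fnn.predict(X[-1:])[0]

# 3) The summed component forecasts approximate the original series; the absolute
#    error below is the single-point analogue of the MAE criterion.
print(f"forecast: {forecast:.3f}  actual: {y[-1]:.3f}  abs. error: {abs(forecast - y[-1]):.3f}")

Summing the component forecasts works here because STL is an additive decomposition, so the trend, seasonal, and residual sub-series reconstruct the original series exactly.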


Published

2023-12-30

How to Cite

Seasonal Decomposition and Trend using Hybrid STL-FNN with Application. (2023). Advances in the Theory of Nonlinear Analysis and Its Application, 7(4), 35-46. https://doi.org/10.17762/atnaa.v7.i4.280