Comprehensive Learning Polynomial Auto-Regressive Model based on Optimization with Application of Time Series Forecasting
International Journal of Industrial Electronics Control and Optimization
Volume 5, Issue 1, June 2022, Pages 43-50 | Full-text PDF (969.07 K)
Article Type: Research Article
DOI: 10.22111/ieco.2021.39458.1374
Authors
Nastaran Darjani; Hesam Omranpour
Babol Noshirvani University of Technology, Babol, Iran.
Abstract
Time series analysis remains an important challenge in engineering problems. In this paper, we propose the Comprehensive Learning Polynomial Autoregressive model (CLPAR) to predict linear and nonlinear time series. The model is based on the autoregressive (AR) model but is extended with polynomial terms to make it more robust and accurate: future values are predicted as a weighted sum of polynomial combinations of previous observations, and the weights are learned from the data. The hyperparameters and parameters of the model are learned in the training phase by a metaheuristic optimization method. Using this model, we can predict nonlinear as well as linear time series. The proposed method was evaluated on eight standard stationary and non-stationary large-scale real-world datasets. It outperforms state-of-the-art deep-learning-based methods on seven of the time series and outperforms all other compared methods on six datasets. Experimental results show the accuracy advantage of the model over the compared methods on various prediction tasks in terms of root mean square error (RMSE).
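The sketch below is a minimal illustration of the idea described in the abstract, not the authors' implementation: a polynomial AR predictor whose coefficients weight powers of the previous p observations, with a simple random search standing in for the metaheuristic optimizer used in the paper. The lag order, polynomial degree, and all function and variable names are illustrative assumptions.

```python
# Illustrative sketch of a polynomial autoregressive predictor (hypothetical
# names and settings; the optimizer is a placeholder for the paper's method).
import numpy as np

def polynomial_features(window, degree):
    """Bias term plus powers 1..degree of each of the p previous values."""
    return np.concatenate([[1.0]] + [window ** k for k in range(1, degree + 1)])

def predict(series, weights, p, degree):
    """One-step-ahead predictions for t = p .. len(series)-1."""
    preds = []
    for t in range(p, len(series)):
        feats = polynomial_features(series[t - p:t], degree)
        preds.append(feats @ weights)
    return np.array(preds)

def rmse(y_true, y_pred):
    """Root mean square error, the evaluation metric used in the paper."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def fit_random_search(series, p=3, degree=2, iters=2000, seed=0):
    """Crude stochastic search over the weight vector (placeholder optimizer)."""
    rng = np.random.default_rng(seed)
    n_weights = 1 + p * degree
    best_w = rng.normal(scale=0.1, size=n_weights)
    best_err = rmse(series[p:], predict(series, best_w, p, degree))
    for _ in range(iters):
        cand = best_w + rng.normal(scale=0.05, size=n_weights)
        err = rmse(series[p:], predict(series, cand, p, degree))
        if err < best_err:
            best_w, best_err = cand, err
    return best_w, best_err

if __name__ == "__main__":
    # Synthetic example series: a noisy sinusoid standing in for a real dataset.
    t = np.arange(200)
    series = np.sin(0.2 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
    w, err = fit_random_search(series)
    print("training RMSE:", round(err, 4))
```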
Keywords
Autoregressive; Forecasting; Machine learning; Optimization; Time series prediction