[1] A. Gensler, J. Henze, B. Sick, and N. Raabe, “Deep Learning for Solar Power Forecasting – An Approach Using Autoencoder and LSTM Neural Networks,” in 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC 2016), 2016, pp. 2858–2865.
[2] M. T. Leung, A. Chen, and H. Daouk, “Forecasting Exchange Rates Using General Regression Neural Networks,” Comput. Oper. Res., vol. 27, no. 11–12, pp. 1093–1110, 2000.
[3] Adnyani and Subanar, “General Regression Neural Network (GRNN) Forecasting Dollar Exchange Rate and Composite Stock Price Index (CSPI),” Factor Exacta, vol. 8, pp. 137–144, 2015.
[4] R. E. Caraka, H. Yasin, and P. A, “Modeling of General Regression Neural Network (GRNN) on Data Return of Euro 50 Stock Price Index,” Gaussian, vol. 4, pp. 181–192, 2015.
[5] S. Hochreiter and J. Schmidhuber, “Long Short-Term Memory,” Neural Comput., vol. 9, no. 8, pp. 1735–1780, 1997.
[6] K. Greff, R. K. Srivastava, J. Koutník, B. R. Steunebrink, and J. Schmidhuber, “LSTM: A Search Space Odyssey,” IEEE Trans. Neural Networks Learn. Syst., vol. 28, no. 10, pp. 2222–2232, 2017.
[7] D. Kryukov, M. Agafonova, and A. Arestova, “Comparison of Regression and Neural Network Approaches to Forecast Daily Power Consumption,” in IFOST-2016: Power Engineering and Renewable Energy Technologies, 2016, no. 4, pp. 247–250.
[8] G. Oğcu, O. F. Demirel, and S. Zaim, “Forecasting Electricity Consumption with Neural Networks and Support Vector Regression,” in 8th International Strategic Management Conference, 2012, vol. 58, pp. 1576–1585.
[9] A. Marvuglia and A. Messineo, “Using Recurrent Artificial Neural Networks to Forecast Household Electricity Consumption,” Energy Procedia, vol. 14, pp. 45–55, 2012.
[10] C. Deb, L. S. Eang, J. Yang, and M. Santamouris, “Forecasting Energy Consumption of Institutional Buildings in Singapore,” in 9th International Symposium on Heating, Ventilation and Air Conditioning (ISHVAC) and the 3rd International Conference on Building Energy and Environment (COBEE), 2015, vol. 121, pp. 1734–1740.
[11] Y. Fu, Z. Li, H. Zhang, and P. Xu, “Using Support Vector Machine to Predict Next Day Electricity Load of Public Buildings with Sub-metering Devices,” 9th Int. Symp. Heating, Vent. Air Cond. 3rd Int. Conf. Build. Energy Environ., vol. 121, pp. 1016–1022, 2015.
[12] S. Grubwinkler and M. Lienkamp, “Energy Prediction for EVs Using Support Vector Regression Methods,” in 7th IEEE International Conference on Intelligent Systems IS’2014, 2015, vol. 322, pp. 769–780.
[13] S. Ruliah and R. Rolyadely, “Prediction of Electricity Usage with a Backpropagation Approach,” J. Tek. Inform. dan Sist. Inf., vol. 3, no. 1, p. 466, 2014.
[14] M. Syafruddin, L. Hakim, and D. Despa, “Linear Regression Method for Predicting Long-Term Electric Energy Needs (Case Study of Lampung Province),” J. Inform. dan Tek. Elektro, no. 1, 2014.
[15] D. Niu, H. Wang, H. Chen, and Y. Liang, “The General Regression Neural Network Based on the Fruit Fly Optimization Algorithm and the Data Inconsistency Rate for Transmission Line Icing Prediction,” Energies, vol. 10, p. 2066, 2017.
[16] N. Srivastava, E. Mansimov, and R. Salakhutdinov, “Unsupervised Learning of Video Representations using LSTMs,” arXiv preprint, Feb. 2015.
[17] J. Chung, C. Gulcehre, K. Cho, and Y. Bengio, “Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling,” 2014.
[18] Y. Gao and D. Glowacka, “Deep Gate Recurrent Neural Network,” in JMLR: Workshop and Conference Proceedings, vol. 63, 2016, pp. 350–365.
[19] W. Pan, “A New Fruit Fly Optimization Algorithm: Taking the Financial Distress Model as an Example,” Knowledge-Based Syst., vol. 26, pp. 69–74, 2012.
[20] E. Brochu, V. M. Cora, and N. de Freitas, “A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning,” arXiv preprint, 2010.
[21] P. I. Frazier, “A Tutorial on Bayesian Optimization,” arXiv preprint, pp. 1–22, 2018.
[22] M. Krasser, “Bayesian Optimization,” 2018. [Online]. Available: http://krasserm.github.io/2018/03/21/bayesian-optimization/.
[23] J. Gonzalvez, E. Lezmi, T. Roncalli, and J. Xu, “Financial Applications of Gaussian Processes and Bayesian Optimization,” pp. 1–42, 2019.
[24] M. Krasser, “Gaussian Processes.” [Online]. Available: http://krasserm.github.io/2018/03/19/gaussian-processes/.
[25] D. F. Specht, “General Regression Neural Network (GRNN),” pp. 42–60.
[26] C. Olah, “Understanding LSTM Networks,” 2015. [Online]. Available: http://colah.github.io/posts/2015-08-Understanding-LSTMs/.
[27] A. Gulli and S. Pal, Deep Learning with Keras. Birmingham: Packt Publishing, 2017.
[28] R. Jozefowicz, W. Zaremba, and I. Sutskever, “An Empirical Exploration of Recurrent Network Architectures,” JMLR W&CP, vol. 37, 2015.
[29] J. Brownlee, “A Gentle Introduction to LSTM Autoencoders,” 2018. [Online]. Available: https://machinelearningmastery.com/lstm-autoencoders/.
[30] “scikit-optimize (skopt) module.” [Online]. Available: https://scikit-optimize.github.io/.
[31] Keras, “Keras: The Python Deep Learning Library.” [Online]. Available: https://keras.io/.
[32] NeuPy, “NeuPy – Neural Networks in Python.” [Online]. Available: http://neupy.com/apidocs/neupy.algorithms.rbfn.grnn.html#neupy.algorithms.rbfn.grnn.GRNN.
[33] J. Brownlee, “What is the Difference Between a Batch and an Epoch in a Neural Network?,” 2018. [Online]. Available: https://machinelearningmastery.com/difference-between-a-batch-and-an-epoch/.
[34] D. P. Kingma and J. L. Ba, “Adam: A Method for Stochastic Optimization,” in International Conference on Learning Representations (ICLR), 2015, pp. 1–15.
[35] J. Brownlee, “Gentle Introduction to the Adam Optimization Algorithm for Deep Learning,” 2017. [Online]. Available: https://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/.
[36] H. Zulkifli, “Understanding Learning Rates and How It Improves Performance in Deep Learning,” 2018. [Online]. Available: https://towardsdatascience.com/understanding-learning-rates-and-how-it-improves-performance-in-deep-learning-d0d4059c1c10.
[37] L. N. Smith, “Cyclical Learning Rates for Training Neural Networks,” in Proceedings – 2017 IEEE Winter Conference on Applications of Computer Vision (WACV 2017), 2017, pp. 464–472.
[38] R. J. Hyndman and G. Athanasopoulos, Forecasting: Principles and Practice, 2nd edition. Melbourne, Australia: OTexts, 2018.
[39] “pmdarima: Pyramid ARIMA.” [Online]. Available: https://pypi.org/project/pmdarima/.