[1] Department of Statistics, Ministry of Education (Taiwan), "Big Data Analysis of the Employment and Salaries of College Graduates, Academic Years 99–101."
[2] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, no. 6088, pp. 533–536, 1986.
[3] J. Duchi, E. Hazan, and Y. Singer, "Adaptive Subgradient Methods for Online Learning and Stochastic Optimization," Journal of Machine Learning Research, vol. 12, pp. 2121–2159, 2011.
[4] D. P. Kingma and J. Ba, "Adam: A Method for Stochastic Optimization," Proceedings of the 3rd International Conference on Learning Representations, 2015.
[5] E. Frank and M. Hall, "A simple approach to ordinal classification," Proceedings of the European Conference on Machine Learning, pp. 145–156, 2001.
[6] Z. Niu, M. Zhou, L. Wang, X. Gao, and G. Hua, "Ordinal Regression with Multiple Output CNN for Age Estimation," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4920–4928, 2016.
[7] G. E. Hinton and R. R. Salakhutdinov, "Reducing the dimensionality of data with neural networks," Science, vol. 313, no. 5786, pp. 504–507, 2006.
[8] P. Vincent, H. Larochelle, Y. Bengio, and P.-A. Manzagol, "Extracting and Composing Robust Features with Denoising Autoencoders," Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103, 2008.
[9] P. Vincent, H. Larochelle, I. Lajoie, Y. Bengio, and P.-A. Manzagol, "Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion," Journal of Machine Learning Research, vol. 11, pp. 3371–3408, 2010.
[10] A. Y. Ng, "Preventing 'Overfitting' of Cross-Validation Data," Proceedings of the Fourteenth International Conference on Machine Learning, pp. 245–253, 1997.
[11] N. Srivastava, G. E. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Dropout: A Simple Way to Prevent Neural Networks from Overfitting," Journal of Machine Learning Research, vol. 15, pp. 1929–1958, 2014.
[12] L. Breiman, "Bagging Predictors," Machine Learning, vol. 24, no. 2, pp. 123–140, 1996.
[13] J. Y. Kuo, H. T. Chung, P. F. Wang, and B. Lei, "Building Student Course Performance Prediction Model Based on Deep Learning," Journal of Information Science and Engineering, 2019.
[14] J. Y. Kuo, C. W. Pan, and B. Lei, "Using Stacked Denoising Autoencoder for the Student Dropout Prediction," Proceedings of the 2017 IEEE International Symposium on Multimedia, pp. 483–488, 2017.
[15] P. Prabu and Bendangnuksung, "Students' Performance Prediction using Deep Neural Networks," International Journal of Applied Engineering Research, vol. 13, pp. 1171–1176, 2018.
[16] E. A. Amrieh, T. Hamtini, and I. Aljarah, "Preprocessing and analyzing educational data set using X-API for improving student's performance," Proceedings of the 2015 IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies, pp. 1–5, 2015.
[17] Students' Academic Performance Dataset. Available: https://www.kaggle.com/aljarah/xAPI-Edu-Data
[18] Kaggle: Your Home for Data Science. Available: https://www.kaggle.com/
[19] B.-H. Kim, E. Vizitei, and V. Ganapathi, "GritNet: Student Performance Prediction with Deep Learning," Proceedings of the 11th International Conference on Educational Data Mining, pp. 625–629, 2018.
[20] Udacity: Learn the Latest Tech Skills; Advance Your Career. Available: https://www.udacity.com/
[21] A. Graves and J. Schmidhuber, "Framewise phoneme classification with bidirectional LSTM networks," Proceedings of the International Joint Conference on Neural Networks, pp. 23–43, 2005.
[22] J. D. Keeler, D. E. Rumelhart, and W. K. Leow, "Integrated segmentation and recognition of hand-printed numerals," Advances in Neural Information Processing Systems, pp. 557–563, 1991.
[23] J. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann, 2000.
[24] M. Abadi et al., "TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems," arXiv preprint arXiv:1603.04467, Mar. 2016.
[25] F. Pedregosa et al., "Scikit-learn: Machine Learning in Python," Journal of Machine Learning Research, vol. 12, pp. 2825–2830, 2011.
[26] S. van der Walt, S. C. Colbert, and G. Varoquaux, "The NumPy array: A structure for efficient numerical computation," Computing in Science and Engineering, vol. 13, no. 2, pp. 22–30, 2011.
[27] W. McKinney, pandas: a Python data analysis library. Available: http://pandas.sourceforge.net
[28] L. Gaudette and N. Japkowicz, "Evaluation methods for ordinal classification," Proceedings of the 22nd Canadian Conference on Artificial Intelligence, pp. 207–210, 2009.
[29] S. Baccianella, A. Esuli, and F. Sebastiani, "Evaluation measures for ordinal regression," Proceedings of the Ninth International Conference on Intelligent Systems Design and Applications, pp. 283–287, 2009.
[30] J. Shlens, "A Tutorial on Principal Component Analysis," 2009. Available: http://www.snl.salk.edu/~shlens/pca.pdf
[31] A. A. Saa, "Educational Data Mining & Students' Performance Prediction," International Journal of Advanced Computer Science and Applications, vol. 7, pp. 212–220, 2016.