[1] Ministry of Education, Department of Statistics, "Statistics on Student Loans for Senior High Schools and Above" (高級中等以上學校學生就學貸款統計). Available: http://stats.moe.gov.tw/files/important/OVERVIEW_F03.pdf
[2] Directorate-General of Budget, Accounting and Statistics, Executive Yuan, "Number of Students Delaying Graduation at Universities and Colleges, 2016 Academic Year" (105 學年大專校院延修生人數). Available: https://www.dgbas.gov.tw/public/Data/7417174752W75YTOV0.pdf
[3] National Policy Foundation, "Don't Let Delayed Graduation Undermine Your Competitiveness" (別讓延畢當了競爭力). Available: https://www.npf.org.tw/1/15957
[4] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, no. 6088, pp. 533–536, Oct. 1986.
[5] G. E. Hinton and R. R. Salakhutdinov, "Reducing the dimensionality of data with neural networks," Science, vol. 313, no. 5786, pp. 504–507, 2006.
[6] P. Vincent, H. Larochelle, Y. Bengio, and P.-A. Manzagol, "Extracting and composing robust features with denoising autoencoders," in Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103, 2008.
[7] J. Han and C. Moraga, "The influence of the sigmoid function parameters on the speed of backpropagation learning," in From Natural to Artificial Neural Computation, Springer, Berlin, Heidelberg, pp. 195–201, 1995.
[8] V. Nair and G. E. Hinton, "Rectified linear units improve restricted Boltzmann machines," in Proceedings of the 27th International Conference on Machine Learning, pp. 807–814, 2010.
[9] J. Duchi, E. Hazan, and Y. Singer, "Adaptive subgradient methods for online learning and stochastic optimization," Journal of Machine Learning Research, vol. 12, pp. 2121–2159, Jul. 2011.
[10] D. P. Kingma and J. Ba, "Adam: A method for stochastic optimization," arXiv:1412.6980 [cs], Dec. 2014.
[11] T. G. Dietterich, "Ensemble methods in machine learning," in Multiple Classifier Systems, pp. 1–15, 2000.
[12] Y. Freund and R. E. Schapire, "Experiments with a new boosting algorithm," in Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156, 1996.
[13] L. Breiman, "Bagging predictors," Machine Learning, vol. 24, no. 2, pp. 123–140, 1996.
[14] J. A. Hartigan and M. A. Wong, "Algorithm AS 136: A k-means clustering algorithm," Journal of the Royal Statistical Society, Series C (Applied Statistics), vol. 28, no. 1, pp. 100–108, 1979.
[15] R. Kohavi, "A study of cross-validation and bootstrap for accuracy estimation and model selection," in Proceedings of the 14th International Joint Conference on Artificial Intelligence, vol. 2, pp. 1137–1143, 1995.
[16] A. Y. Ng, "Preventing 'overfitting' of cross-validation data," in Proceedings of the Fourteenth International Conference on Machine Learning, pp. 245–253, 1997.
[17] N. Srivastava, G. E. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Dropout: A simple way to prevent neural networks from overfitting," Journal of Machine Learning Research, vol. 15, pp. 1929–1958, 2014.
[18] 潘家偉, "Using Stacked Denoising Autoencoder for the Student Dropout Prediction," 2017.
[19] J. Xu, K. H. Moon, and M. van der Schaar, "A machine learning approach for tracking and predicting student performance in degree programs," IEEE Journal of Selected Topics in Signal Processing, vol. 11, no. 5, pp. 742–753, Aug. 2017.
[20] J. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques. Morgan Kaufmann, 2000.
[21] scikit-learn, `sklearn.preprocessing` module documentation (feature encoding). Available: http://scikit-learn.org/stable/modules/classes.html#module-sklearn.preprocessing
[22] M. Abadi et al., "TensorFlow: Large-scale machine learning on heterogeneous distributed systems," Mar. 2016.
[23] G. Bonaccorso, Machine Learning Algorithms: A Reference Guide to Popular Algorithms for Data Science and Machine Learning, 2017.
[24] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, Oct. 2001.
[25] C. Cortes and V. Vapnik, "Support-vector networks," Machine Learning, vol. 20, pp. 273–297, 1995.
[26] 蕭文龍, Introduction and Application of Statistical Analysis (統計分析入門與應用), 2018.