[1] Selvaraj, S. K., Raj, A., Rishikesh Mahadevan, R., Chadha, U., & Paramasivam, V. (2022). A review on machine learning models in injection molding machines. Advances in Materials Science and Engineering, 2022(1), 1949061.
[2] Thiriez, A., & Gutowski, T. (2006, May). An environmental analysis of injection molding. In Proceedings of the 2006 IEEE International Symposium on Electronics and the Environment (pp. 195-200). IEEE.
[3] Inasaki, I. (1998). Application of acoustic emission sensor for monitoring machining processes. Ultrasonics, 36(1-5), 273-281.
[4] Liu, J., Zhong, L., Wickramasuriya, J., & Vasudevan, V. (2009, September). User evaluation of lightweight user authentication with a single tri-axis accelerometer. In Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services (pp. 1-10).
[5] Carleo, G., Cirac, I., Cranmer, K., Daudet, L., Schuld, M., Tishby, N., ... & Zdeborová, L. (2019). Machine learning and the physical sciences. Reviews of Modern Physics, 91(4), 045002.
[6] Krenker, A., Bešter, J., & Kos, A. (2011). Introduction to the artificial neural networks. In Artificial Neural Networks: Methodological Advances and Biomedical Applications (pp. 1-18). InTech.
[7] Alasadi, S. A., & Bhaya, W. S. (2017). Review of data preprocessing techniques in data mining. Journal of Engineering and Applied Sciences, 12(16), 4102-4107.
[8] Vassiliadis, P. (2009). A survey of extract–transform–load technology. International Journal of Data Warehousing and Mining (IJDWM), 5(3), 1-27.
[9] Chu, X., Ilyas, I. F., Krishnan, S., & Wang, J. (2016, June). Data cleaning: Overview and emerging challenges. In Proceedings of the 2016 International Conference on Management of Data (pp. 2201-2206).
[10] Lenzerini, M. (2002, June). Data integration: A theoretical perspective. In Proceedings of the Twenty-First ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems (pp. 233-246).
[11] Kusiak, A. (2001). Feature transformation methods in data mining. IEEE Transactions on Electronics Packaging Manufacturing, 24(3), 214-221.
[12] Weisstein, E. W. (2015). Fast Fourier transform. MathWorld. https://mathworld.wolfram.com/
[13] Song, Y. Y., & Ying, L. U. (2015). Decision tree methods: Applications for classification and prediction. Shanghai Archives of Psychiatry, 27(2), 130.
[14] Breiman, L. (2001). Random forests. Machine Learning, 45, 5-32.
[15] O'Shea, K., & Nash, R. (2015). An introduction to convolutional neural networks. arXiv preprint arXiv:1511.08458.
[16] Chen, T., He, T., Benesty, M., Khotilovich, V., Tang, Y., Cho, H., ... & Zhou, T. (2015). XGBoost: Extreme gradient boosting. R package version 0.4-2, 1(4), 1-4.
[17] Schölkopf, B. (1997). Support vector learning (Doctoral dissertation). Oldenbourg, München, Germany.
[18] Ito, K., & Kunisch, K. (2008). Lagrange multiplier approach to variational problems and applications. Society for Industrial and Applied Mathematics.
[19] Liu, F. T., Ting, K. M., & Zhou, Z. H. (2008, December). Isolation forest. In 2008 Eighth IEEE International Conference on Data Mining (pp. 413-422). IEEE.
[20] Kriegel, H. P., Kröger, P., Schubert, E., & Zimek, A. (2009, November). LoOP: Local outlier probabilities. In Proceedings of the 18th ACM Conference on Information and Knowledge Management (pp. 1649-1652).
[21] Peña, D., & Prieto, F. J. (2001). Multivariate outlier detection and robust covariance matrix estimation. Technometrics, 43(3), 286-310.
[22] Hubert, M., & Debruyne, M. (2010). Minimum covariance determinant. Wiley Interdisciplinary Reviews: Computational Statistics, 2(1), 36-43.
[23] Yang, L., & Shami, A. (2020). On hyperparameter optimization of machine learning algorithms: Theory and practice. Neurocomputing, 415, 295-316.
[24] Syarif, I., Prugel-Bennett, A., & Wills, G. (2016). SVM parameter optimization using grid search and genetic algorithm to improve classification performance. TELKOMNIKA (Telecommunication Computing Electronics and Control), 14(4), 1502-1509.
[25] Bergstra, J., & Bengio, Y. (2012). Random search for hyper-parameter optimization. Journal of Machine Learning Research, 13(2).
[26] Shahriari, B., Swersky, K., Wang, Z., Adams, R. P., & De Freitas, N. (2015). Taking the human out of the loop: A review of Bayesian optimization. Proceedings of the IEEE, 104(1), 148-175.
[27] Schulz, E., Speekenbrink, M., & Krause, A. (2018). A tutorial on Gaussian process regression: Modelling, exploring, and exploiting functions. Journal of Mathematical Psychology, 85, 1-16.
[28] Luque, A., Carrasco, A., Martín, A., & de las Heras, A. (2019). The impact of class imbalance in classification performance metrics based on the binary confusion matrix. Pattern Recognition, 91, 216-231.
[29] Swets, J. A. (1988). Measuring the accuracy of diagnostic systems. Science, 240(4857), 1285-1293.
[30] Sokolova, M., Japkowicz, N., & Szpakowicz, S. (2006, December). Beyond accuracy, F-score and ROC: A family of discriminant measures for performance evaluation. In Australasian Joint Conference on Artificial Intelligence (pp. 1015-1021). Berlin, Heidelberg: Springer.
[31] Mahnke, W., Leitner, S. H., & Damm, M. (2009). OPC unified architecture. Springer Science & Business Media.