Author: 蘇翊甄 (Yi-Jhen Su)
Title: 基於 LSTM/GRU 於塗佈機之異常偵測 (Anomaly Detection of the Coating Machine Based on LSTM/GRU Approaches)
Advisor: 陳振明 (Jen-Ming Chen)
Degree: Master
Institution: National Central University
Department: Institute of Industrial Management
Discipline: Business and Management
Field: Other Business and Management
Year of Publication: 2020
Graduation Academic Year: 108 (2019–2020)
Language: Chinese
Pages: 56
Keywords (Chinese): 深度學習, 時間序列, 長短期記憶網路, 門控循環單元, 異常偵測, 預測性維護
Keywords (English): Deep Learning, Time Series, Long Short-Term Memory, Gated Recurrent Unit, Anomaly Detection, Predictive Maintenance
In recent years, the rapid development of artificial intelligence has driven the evolution of Industry 4.0. Most factories have gradually introduced smart manufacturing systems, which makes machine stability critical to factory production. Traditionally, equipment was maintained through corrective or preventive maintenance; both approaches carry high maintenance costs and may still let equipment stop abnormally without warning. To avoid such unexpected failures, predictive maintenance has gradually developed in recent years. Motivated by these problems, this study aims to detect anomalies accurately before equipment fails and to issue alerts so that maintenance staff can service the equipment, preventing machine anomalies from halting production.
This study uses coating-machine sensor data provided by Company A. Supervised learning with a Long Short-Term Memory (LSTM) model and a Gated Recurrent Unit (GRU) model is used to build a machine anomaly detection system. A many-to-many sliding-window method is applied for training and prediction to accelerate model training and reduce computational complexity. Hyperparameters are tuned for each model, and evaluation metrics are used to identify the best model configuration.
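The thesis itself includes no code; as a rough illustration of the many-to-many sliding-window preparation described above, a minimal NumPy sketch might look like the following (the window length, stride, and toy sensor array are hypothetical, not values from the thesis):

```python
import numpy as np

def sliding_windows(series, window, step=1):
    """Split a multivariate time series of shape (T, F) into overlapping
    windows of shape (n_windows, window, F) for many-to-many training."""
    T = len(series)
    starts = range(0, T - window + 1, step)
    return np.stack([series[s:s + window] for s in starts])

# Toy example: 10 time steps, 3 sensor channels.
X = np.arange(30, dtype=float).reshape(10, 3)
W = sliding_windows(X, window=4, step=2)
print(W.shape)  # → (4, 4, 3)
```

A stride larger than 1 reduces the number of windows, which is one way such a scheme can shorten training time at the cost of seeing fewer overlapping subsequences.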
In recent years, the rapid development of artificial intelligence has promoted the evolution of Industry 4.0. Most factories have gradually introduced smart manufacturing systems, so the stability of each machine is critical to factory capacity. In the past, equipment maintenance in factories was corrective or preventive; both methods are expensive, and equipment may still shut down abnormally without warning. To prevent unexpected shutdowns, predictive maintenance has gradually developed in recent years.
Given these problems, our motivation is to detect anomalies before a machine shuts down and to alert equipment engineers to perform maintenance. We use Company A's coating-machine sensor data for the anomaly-detection analysis.
We apply supervised learning based on Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models to build the anomaly detection system, and use a sliding window to speed up training and reduce computational complexity. We experiment with various hyperparameters for each model and use a confusion matrix to evaluate performance, finding the most suitable hyperparameters to optimize each model.
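The abstract names the confusion matrix as the evaluation method but gives no formulas; a self-contained sketch of confusion-matrix-based metrics for binary anomaly labels could look like this (the example label vectors are made up for illustration):

```python
import numpy as np

def confusion_counts(y_true, y_pred):
    """Return (TP, FP, FN, TN) for binary labels, where 1 = anomaly."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = int(np.sum(y_true & y_pred))    # anomalies correctly flagged
    fp = int(np.sum(~y_true & y_pred))   # normal points flagged as anomalies
    fn = int(np.sum(y_true & ~y_pred))   # anomalies missed
    tn = int(np.sum(~y_true & ~y_pred))  # normal points correctly passed
    return tp, fp, fn, tn

def precision_recall_f1(y_true, y_pred):
    """Derive precision, recall, and F1 from the confusion counts."""
    tp, fp, fn, _ = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

y_true = [0, 0, 1, 1, 0, 1]
y_pred = [0, 1, 1, 0, 0, 1]
print(precision_recall_f1(y_true, y_pred))  # precision = recall = 2/3 here
```

For anomaly detection, precision and recall are usually more informative than raw accuracy, because anomalies are rare and a model that predicts "normal" everywhere would still score high accuracy.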
中文摘要 (Chinese Abstract)
Abstract
Table of Contents
List of Figures
List of Tables
1. Introduction
1-1 Research Background and Motivation
1-2 Research Objectives
1-3 Research Framework
2. Literature Review
2-1 Deep Learning
2-2 Long Short-Term Memory (LSTM)
2-3 Activation Functions
2-4 Gated Recurrent Unit (GRU)
2-5 Evaluation Metrics
3. Research Methodology
3-1 Problem Definition
3-2 Data Processing
3-3 Model Design
3-3-1 Feature Scaling
3-3-2 Neural Network Architecture
3-3-3 Loss Function
3-3-4 Optimizer
4. Experiments and Analysis
4-1 Experimental Environment and Development Tools
4-2 Experimental Analysis
4-2-1 Dataset Description
4-2-2 Experimental Design
4-2-3 Experimental Results
5. Conclusion and Future Work
References
1. Ang, J. C., Mirzal, A., Haron, H., Hamed, H. N. A. (2016). Supervised, unsupervised, and semi-supervised feature selection: a review on gene selection. IEEE/ACM Transactions on Computational Biology and Bioinformatics, Vol. 13, no. 5, pp. 971–989.
2. Cachada, A., Barbosa, J., Leitão, P., Geraldes, C., Deusdado, L., Costa, J., Teixeira, C., Teixeira, J., Moreira, A., Moreira, P., Romero, L. (2018). Maintenance 4.0: Intelligent and Predictive Maintenance System Architecture. IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Vol. 1, pp. 139–146.
3. Chandola, V., Banerjee, A., Kumar, V. (2009). Anomaly detection: A survey. ACM Computing Surveys, Vol. 41, no. 3, pp. 15:1–15:58.
4. Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., Bengio, Y. (2014). Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2014).
5. Davis, J., Goadrich, M. (2006). The relationship between Precision-Recall and ROC curves. Proceedings of the 23rd international conference on Machine learning, pp. 233–240.
6. Deng, L. Yu, D. (2014). Deep Learning: Methods and Applications. Foundations and Trends in Signal Processing, Vol. 7, no. 3–4, pp. 197–387.
7. Di Persio, L., Honchar, O. (2017). Recurrent neural networks approach to the financial forecast of google assets. International Journal of Mathematics and Computers in Simulation, Vol. 11, pp. 7–13.
8. Ergen, T., Mirza, A. H., Kozat, S. S. (2017). Unsupervised and Semi-supervised Anomaly Detection with LSTM Neural Networks. Vol. 1, arXiv:1710.09207.
9. Fu, R., Zhang, Z., Li, L. (2016). Using LSTM and GRU Neural Network Methods for Traffic Flow Prediction. IEEE Youth Academic Annual Conference of Chinese Association of Automation (YAC), pp. 324–328.
10. Goodfellow, I., Bengio, Y., Courville, A. (2016). Deep Learning. MIT Press.
11. Graves, A. (2014). Generating sequences with recurrent neural networks. arXiv:1308.0850v5.
12. Guo, Y., Liao, W., Wang, Q, Yu, L., Ji, T., Li, P. (2018). Multidimensional Time Series Anomaly Detection: A GRU-based Gaussian Mixture Variational Autoencoder Approach. Asian Conference on Machine Learning, pp. 97–112.
13. Hochreiter, S., Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, Vol. 9, pp. 1735–1780.
14. Jiao, R., Zhang, T., Jiang, Y., He, H. (2018). Short-Term Non-Residential Load Forecasting Based on Multiple Sequences LSTM Recurrent Neural Network. IEEE Access, Vol. 6, pp. 59438–59448.
15. Kingma, D. P., Ba, J. (2014). Adam: A Method for Stochastic Optimization. International Conference on Learning Representations.
16. Lasi, H., Fettke, P., Kemper, H. G., Feld, T., Hoffmann, M. (2014). Industry 4.0. Business & Information Systems Engineering, Vol. 6, pp. 239–242.
17. LeCun, Y., Bottou L., Bengio Y., Haffner P., (1998). Gradient-Based Learning Applied to Document Recognition. Proceedings of the IEEE, Vol. 86, no. 11, pp. 2278–2324.
18. Li, S., Xie, Y., Farajtabar, M., Song, L. (2016). Detecting weak changes in dynamic events over networks. arXiv:1603.08981v2.
19. Malhotra, P., Vig, L., Shroff, G., Agarwal, P. (2015). Long Short Term Memory Networks for Anomaly Detection in Time Series. European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Vol. 1, pp. 89–94.
20. Nielsen, M. (2015). Neural Networks and Deep Learning. Determination Press.
21. Nwankpa, C. E., Ijomah, W., Gachagan, A., Marshall, S. (2018). Activation Functions: Comparison of Trends in Practice and Research for Deep Learning. arXiv:1811.03378v1.
22. Olah, C. (2015). Understanding LSTM Networks.
Available from < http://colah.github.io/posts/2015-08-Understanding-LSTMs/ >
23. Pascanu, R., Mikolov, T., Bengio, Y. (2013). On the difficulty of training recurrent neural networks. Proceedings of the International Conference on Machine Learning (ICML), pp. 1310–1318.
24. Raúl Gómez blog. (2018). Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names.
Available from < https://gombru.github.io/2018/05/23/cross_entropy_loss/ >
25. Stojanovic, L., Dinic, M., Stojanovic, N., Stojadinovic, A. (2016). Big-data- driven anomaly detection in industry (4.0): an approach and a case study. IEEE International Conference on Big Data, pp. 1647–1652.
26. Sutskever, I., Martens, J., Dahl, G., Hinton, G. (2013). On the importance of initialization and momentum in deep learning. Proceedings of the 30th International Conference on Machine Learning (ICML-13), Vol. 28, pp. 1139–1147.
27. Tieleman, T., Hinton, G. (2012). Lecture 6.5 - rmsprop: Divide the gradient by a running average of its recent magnitude. COURSERA: Neural networks for machine learning, Vol. 4, no. 2, pp. 26–31.
28. Zhang, A., Lipton, Z. C., Li, M. and Smola, A. J. (2020). Dive into Deep Learning. Available from < https://d2l.ai/ >
29. Zhao, H., Sun, S., Jin, B. (2018). Sequential Fault Diagnosis based on LSTM Neural Network. IEEE Access, Vol. 6, pp. 12929–12939.