臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Detailed Record

Author: 張景哲 (CHANG, CHING-CHE)
Title: 應用機器學習演算法於寵物飼料之異味強度分類
Title (English): Odor Intensity Classification in Pet Food Using Machine Learning Algorithms
Advisor: 林志哲 (LIN, CHIH-JER)
Committee Members: 陳金聖 (CHEN, CHIN-SHENG); 陳介力 (CHEN, CHIEH-LI); 林志哲 (LIN, CHIH-JER)
Defense Date: 2024-07-17
Degree: Master's
Institution: 國立臺北科技大學 (National Taipei University of Technology)
Department/Program: 人工智慧科技碩士學位學程 (Master's Program in Artificial Intelligence Technology)
Discipline: Computing
Field: Software Development
Document Type: Academic thesis
Year of Publication: 2024
Graduation Academic Year: 112 (2023–2024)
Language: Chinese
Pages: 91
Keywords (Chinese): 電子鼻, 異味預測, 異味管理, 少量數據, 數據平衡
Keywords (English): E-Nose, Odor prediction, Odor management, Small dataset, Data balancing

Abstract:

The cooking and drying processes in pet food manufacturing emit large amounts of nitrogen- and sulfur-containing compounds as exhaust gas; if not properly treated, these emissions cause serious odor and air-pollution problems. Manufacturers typically control odor by setting a water-conductivity threshold in the chemical scrubbing tower that triggers chemical dosing, but this automated approach often cannot cope with odor fluctuations caused by changes in production. To address this problem and improve the quality of life of nearby residents, this study builds an automatic data-collection system that monitors electronic-nose readings, scrubbing-tower water conductivity, and meteorological data in real time. Plant staff assess odor intensity at four fixed locations around the plant in coordination with the automated system, and these assessments serve as the labels for the collected data.

The data are first analyzed with visual charts, and feature engineering and selection are then performed according to the odor-influencing factors at each location. Five machine learning algorithms are applied and compared for odor-intensity classification, and a One-Class Support Vector Machine is used for anomaly detection. Because high-odor samples are scarce, the dataset is imbalanced, so over-sampling and under-sampling techniques are used to balance the original data. The resulting models predict odor intensity effectively while reducing overfitting on the small dataset: the classification accuracy at every boundary location of the plant exceeds 85%, and the main location reaches 87.98%.
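The workflow described above (balancing the rare high-odor class, comparing classifiers, and running One-Class SVM anomaly detection) can be illustrated with a short sketch. This is not the thesis code: it assumes scikit-learn and imbalanced-learn, uses placeholder random data in place of the collected e-nose, conductivity, and weather features, and shows only one of the five classifiers (a Random Forest) for brevity.

```python
import numpy as np
from imblearn.combine import SMOTETomek
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Placeholder features (e-nose channels, scrubber water conductivity, weather)
# and staff-assessed odor-intensity labels; the high-odor class is deliberately rare.
X = rng.normal(size=(600, 8))
y = rng.choice([0, 1, 2], size=600, p=[0.7, 0.2, 0.1])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

scaler = StandardScaler().fit(X_train)
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)

# Balance the training set only: SMOTE synthesizes minority-class samples,
# then Tomek links removes ambiguous boundary samples.
X_bal, y_bal = SMOTETomek(random_state=42).fit_resample(X_train_s, y_train)

# One of several possible classifiers; SVM or KNN could be swapped in the same way.
clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_bal, y_bal)
print("odor-intensity accuracy:", accuracy_score(y_test, clf.predict(X_test_s)))

# A One-Class SVM fitted on low-odor ("normal") samples flags unusual readings.
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_train_s[y_train == 0])
flags = ocsvm.predict(X_test_s)  # +1 = inlier, -1 = anomaly
print("anomalies flagged:", int((flags == -1).sum()))
```

Balancing is applied to the training split only, so the test accuracy is measured on the original class distribution, which mirrors the study's concern about overfitting on a small, imbalanced dataset.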

Table of Contents

Abstract (Chinese)
Abstract (English)
Acknowledgements
List of Tables
List of Figures
Chapter 1  Introduction
  1.1  Research Motivation and Background
  1.2  Implementation Steps
  1.3  Thesis Organization
Chapter 2  Literature Review and Technical Discussion
  2.1  Odor Treatment and Representation
  2.2  E-nose Drift Phenomenon
  2.3  Data Class Balancing
Chapter 3  Experimental System Architecture and Data Collection
  3.1  Plant Environment and Scrubbing Tower Overview
  3.2  E-nose Overview and Selection
  3.3  Automated Data-Collection System Setup
    3.3.1  Sensor Digitization and Automated Collection
    3.3.2  Odor Representation and Assessment
    3.3.3  Raw Data Overview
Chapter 4  Data Preprocessing and Machine Learning
  4.1  E-nose Data Drift Calculation
  4.2  Data Analysis and Feature Engineering
  4.3  Standardization
  4.4  Dimensionality Reduction
    4.4.1  Principal Component Analysis (PCA)
    4.4.2  Uniform Manifold Approximation and Projection (UMAP)
  4.5  Data Balancing
    4.5.1  Synthetic Minority Over-sampling Technique (SMOTE)
    4.5.2  Borderline-SMOTE
    4.5.3  Adaptive Synthetic Sampling (ADASYN)
    4.5.4  Tomek Links
  4.6  Supervised Learning
    4.6.1  Random Forest (RF)
    4.6.2  Support Vector Machine (SVM)
    4.6.3  K-Nearest Neighbors (KNN)
  4.7  Unsupervised Learning
    4.7.1  K-means Clustering
    4.7.2  Gaussian Mixture Model (GMM)
    4.7.3  One-Class Support Vector Machine
Chapter 5  Experimental Results and Analysis
  5.1  Preprocessing and Feature Selection
    5.1.1  E-nose Drift Data Processing Results
    5.1.2  Temperature and Humidity Feature Adjustment Results
    5.1.3  Feature Selection
  5.2  Odor Prediction Evaluation
    5.2.1  Data Balancing Result Analysis
    5.2.2  Classification Result Evaluation
  5.3  Data Anomaly Detection Results
Chapter 6  Conclusions and Future Work
  6.1  Summary
  6.2  Future Development
References
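Chapter 4 in the table of contents above lists standardization and dimensionality reduction (PCA, UMAP) among the preprocessing steps. A minimal sketch of how those two projections might be chained after standardization, assuming scikit-learn and the umap-learn package and using placeholder data rather than the thesis dataset:

```python
import numpy as np
import umap  # provided by the umap-learn package
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))  # placeholder sensor feature matrix

X_std = StandardScaler().fit_transform(X)          # zero mean, unit variance per feature
X_pca = PCA(n_components=2).fit_transform(X_std)   # linear projection to 2-D
X_umap = umap.UMAP(n_components=2, random_state=1).fit_transform(X_std)  # nonlinear embedding

print(X_pca.shape, X_umap.shape)  # (300, 2) (300, 2)
```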




