National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)

Detailed Record
Researcher: 邢世恬
Researcher (English): Shi-Tien Hsing
Thesis Title: 基於摺積神經網路於穩健評估皮膚類型之研究
Thesis Title (English): Robust skin type evaluation using convolutional neural networks
Advisor: 張正春
Advisor (English): Cheng-Chun Chang
Committee Members: 高立人, 房同經
Oral Defense Date: 2017-07-27
Degree: Master's
Institution: National Taipei University of Technology (國立臺北科技大學)
Department: Graduate Institute of Electrical Engineering (電機工程研究所)
Discipline: Engineering
Academic Field: Electrical and Information Engineering
Thesis Type: Academic thesis
Publication Year: 2017
Graduation Academic Year: 105
Language: English
Number of Pages: 62
Chinese Keywords: 費氏量表、機器學習、摺積神經網路、膚色量測
Foreign Keywords: Fitzpatrick scale, Machine learning, Convolutional neural networks, Skin tone measurement
Statistics:
  • Cited by: 0
  • Views: 298
  • Rating:
  • Downloads: 0
  • Bookmarked: 0
Skin color, being easy to observe, indirectly indicates an individual's state of health and even affects how attractive a person appears, and it has therefore become a widely studied topic in clinical medicine, dermatology, and cosmetic science. However, because of the limitations of the naked eye and subjective judgment, subtle color changes in the skin, such as local variations in blood flow or in melanin content, are difficult to detect by eye alone. Measuring the skin's reflectance with a spectrometer allows skin color to be computed more precisely. Yet because spectrometers are extremely sensitive, variations in the measurement setup, the applied pressure, and the measurement site can all introduce severe errors. Furthermore, the distribution of skin color arises from complex physiological structures, and surface texture, the stratum corneum, and hair can all produce drastically different measurement results. How to measure the skin spectrum, and what constitutes a representative skin spectrum, are both questions that deserve in-depth study.
Given that deep learning has in recent years surpassed traditional classifiers in image recognition, speech recognition, and hyperspectral image classification, this thesis proposes a one-dimensional convolutional neural network to build a robust Fitzpatrick scale skin type classifier, and compares the classification results on (a) a skin spectrum database built with an X-Rite i1Pro, (b) a skin spectrum database built with an on-chip micro spectrometer, and (c) the externally built skin spectrum database UWA-FSRD. Compared with the classification results of the ITA method used in traditional dermatology, (a) 81.94%, (b) 83.2%, and (c) 77.89%, the neural network achieves (a) 96.88%, (b) 86.7%, and (c) 81.82%, and the convolutional neural network proposed in this thesis achieves (a) 92.59%, (b) 90.4%, and (c) 84.94%; the neural network and the convolutional neural network improve accuracy by roughly 10%. Overall, on databases with sufficient and evenly distributed training samples, the proposed convolutional neural network yields the best Fitzpatrick scale skin type classification accuracy.
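For reference, the traditional ITA baseline mentioned above is computed directly from the CIE L*a*b* coordinates of the measured skin color: ITA = arctan((L* − 50)/b*), expressed in degrees. Below is a minimal sketch in Python of this computation together with a threshold-based category assignment; the function names and category boundaries are illustrative assumptions (commonly cited ITA categories), not necessarily the exact mapping to the six Fitzpatrick types used in the thesis.

```python
import math

# Commonly cited ITA category boundaries (degrees); the exact mapping to the
# six Fitzpatrick skin types used in the thesis may differ (assumption).
ITA_THRESHOLDS = [
    (55.0, "very light"),
    (41.0, "light"),
    (28.0, "intermediate"),
    (10.0, "tan"),
    (-30.0, "brown"),
]

def individual_typology_angle(L_star: float, b_star: float) -> float:
    """ITA in degrees, computed from the CIE L* and b* of a skin measurement."""
    return math.degrees(math.atan2(L_star - 50.0, b_star))

def classify_by_ita(L_star: float, b_star: float) -> str:
    """Assign a skin-tone category from the ITA value."""
    ita = individual_typology_angle(L_star, b_star)
    for threshold, label in ITA_THRESHOLDS:
        if ita > threshold:
            return label
    return "dark"

# Example with hypothetical L*/b* values of a light skin measurement.
print(classify_by_ita(65.0, 15.0))  # ITA = 45 degrees -> "light"
```

In the thesis, this kind of ITA-based assignment serves as the conventional dermatology baseline against which the ANN and CNN classifiers are compared.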
The skin spectrum is used in a wide range of applications, including medical science, dermatology, cosmetic science, and biometric face recognition. However, the composition of complex tissue layers and the uneven outer surface of the skin make skin spectrum evaluation error-prone. In other words, skin reflection spectra taken from the same measurement area on the same person can show different spectral characteristics. Since deep neural networks have demonstrated robust classification in areas such as visual recognition, image labeling, speech recognition, and hyperspectral imaging, a Convolutional Neural Network (CNN) is introduced here to study robust Fitzpatrick scale classification.
Adapting CNN models from image classification, a modified one-dimensional CNN is applied to skin spectrum classification. The performances of the traditional Individual Typology Angle (ITA) approach, Artificial Neural Networks (ANN), and the CNN are compared. In our study, three data sets are used to examine the proposed Fitzpatrick scale assessment: (a) a self-built skin spectrum database measured with an X-Rite i1Pro, (b) a self-built skin spectrum database measured with an on-chip micro spectral sensor, and (c) a public facial spectral reflectance database. Compared with the ITA approach, which achieves (a) 81.94%, (b) 83.2%, and (c) 77.89%, the ANN achieves (a) 96.88%, (b) 86.7%, and (c) 81.82%, and the CNN achieves (a) 92.59%, (b) 90.4%, and (c) 84.94%. With the machine learning approaches, the Fitzpatrick skin type classification rate improves by around 10%. Overall, given sufficient training samples and evenly distributed classes, the proposed CNN model is a robust skin type classifier.
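To make the spectrum-classification pipeline concrete, the following is a minimal sketch of a one-dimensional CNN written with TensorFlow/Keras. The spectrum length, layer sizes, kernel widths, and the six-class softmax output are illustrative assumptions rather than the exact architecture reported in the thesis.

```python
import numpy as np
import tensorflow as tf

NUM_WAVELENGTHS = 401   # assumed spectrum length (e.g. 380-780 nm at 1 nm steps)
NUM_CLASSES = 6         # six Fitzpatrick skin types

def build_spectrum_cnn() -> tf.keras.Model:
    """A small 1-D CNN that maps a reflectance spectrum to a skin-type class."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(NUM_WAVELENGTHS, 1)),
        tf.keras.layers.Conv1D(16, kernel_size=9, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random stand-in data; real inputs would be measured reflectance spectra
    # with integer Fitzpatrick labels 0-5.
    spectra = np.random.rand(128, NUM_WAVELENGTHS, 1).astype("float32")
    labels = np.random.randint(0, NUM_CLASSES, size=128)
    model = build_spectrum_cnn()
    model.fit(spectra, labels, epochs=2, batch_size=16, verbose=1)
```

A small stack of 1-D convolution and pooling layers followed by fully connected layers is the typical shape of such a classifier; in the thesis, the proposed CNN is compared against an ANN trained on the same three spectral databases.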
Abstract (Chinese) i
ABSTRACT ii
Acknowledgements iii
Contents iv
List of Tables vi
List of Figures vii
Chapter 1 INTRODUCTION 1
1.1 Preface 1
1.2 Motivation 2
1.3 Organization 5
Chapter 2 BACKGROUND KNOWLEDGE 6
2.1 Color space and L*a*b* 6
2.2 Conventional skin color evaluation system and skin classification 9
2.2.1 Fitzpatrick scale 9
2.2.2 Pantone skintone guide 10
2.2.3 The error-prone results of conventional skin classification 11
2.3 Machine learning 14
2.3.1 Artificial neural networks 14
2.3.2 Deep neural networks 24
2.3.3 Convolutional neural networks 24
Chapter 3 THE METHODOLOGY OF SKIN SPECTRUM CLASSIFICATION 31
3.1 Methodology of artificial neural network in spectrum classification 31
3.1.1 The proposed structure of artificial neural network in spectrum classification 31
3.1.2 Training procedures of ANN 33
3.2 Methodology of convolutional neural networks in spectrum classification 34
3.2.1 The proposed structure of convolutional neural networks in spectrum classification 34
3.2.2 Training procedures of CNN 37
3.3 System overview 38
Chapter 4 EXPERIMENTAL RESULTS 39
4.1 Experiment setup 39
4.2 Training data sets 40
4.2.1 Self-built skin database 1 42
4.2.2 Self-built skin database 2 44
4.2.3 Public skin database – UWA-FSRD 46
4.3 Experimental results and discussion 50
Chapter 5 CONCLUSION AND FUTURE WORKS 57
5.1 Conclusions 57
5.2 Future Works 58
REFERENCES 59