
National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)


Detailed Record

Researcher: 蘇榮智 (Jung-Chic Su)
Thesis Title: 自動化光學檢測之研究 (A Study of Automated Optical Inspection)
Advisor: 唐永新 (Yeong-Shin Tarng)
Degree: Doctoral
Institution: National Taiwan University of Science and Technology
Department: Department of Mechanical Engineering
Discipline: Engineering
Academic Field: Mechanical Engineering
Thesis Type: Academic thesis
Year of Publication: 2006
Graduation Academic Year: 94 (2005-2006)
Language: English
Pages: 152
Chinese Keywords: line detection, circle detection, defect classification, structured light, flank wear, grinding wheel, microdrill, loudspeaker cone, varistor, edge detection, automated optical inspection, machine vision
English Keywords: Machine vision, Automated optical inspection
Usage statistics:
  • Cited by: 3
  • Views: 602
  • Downloads: 0
  • Bookmarked: 2
The main purpose of this thesis is to develop an automated inspection system based on machine vision to improve quality control in manufacturing plants. The research accomplishes three objectives: (1) verifying that component positions and dimensions are correct, (2) verifying that component shapes fall within acceptable tolerances, and (3) checking component surfaces for defects. Automated optical inspection comprises the following steps: image acquisition, image processing, feature extraction, and decision-making. Four applications are used to validate the proposed feature-extraction methods and defect recognition: (A) two-dimensional contour inspection, which measures changes in grinding-wheel wear to determine whether the wheel needs to be redressed; (B) measurement of microdrill flank wear using edge detection; (C) three-dimensional surface inspection, which uses structured-light illumination to check whether the height and concentricity of loudspeaker cones meet requirements; and (D) varistor appearance inspection, which extracts component features from images, trains an ANFIS on those features, and then classifies varistor defects. These four applications show that the machine vision system developed in this research offers excellent repeatability and superior defect-classification capability.
This thesis presents a machine vision system for automated inspection of industrial parts for post-manufacturing quality control. The aims of the visual inspection process are to determine whether all components are correctly located and dimensioned, whether all components are shaped within acceptable tolerances, to check for structural damage, and to inspect the surface quality of components for defects. Automated optical inspection involves the following sequence of steps: image acquisition, image processing, feature extraction, and decision-making. All this must be accomplished while ensuring that the overall completion time is comparable to that of a human inspector. We describe four applications: (1) a two-dimensional contour measurement algorithm using back lighting to solve the problem of measuring wear on a grinding wheel, (2) an automated flank wear measurement scheme with edge detection based on machine vision for a microdrill, (3) three-dimensional shape measurement using a structured illumination method to measure the dimensions of a loudspeaker cone, and (4) an automated visual system for inspecting the surface appearance of ring varistors based on an adaptive neuro-fuzzy inference system (ANFIS). The experimental results show that the machine vision system can inspect or classify these components in a highly consistent and accurate manner, and can be a valuable tool for ensuring product quality.
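Two standard techniques that the record only names can be sketched for orientation. The calibration and concentricity steps (Figs. 5.4 and 7.8) fit circles to detected edge points with the least-squares method; below is a minimal algebraic (Kasa-style) fit in Python, assuming NumPy, with the function name and synthetic test data invented for illustration rather than taken from the thesis:

import numpy as np

def fit_circle_least_squares(x, y):
    # Solve x^2 + y^2 + a*x + b*y + c = 0 for (a, b, c) in the
    # least-squares sense, then recover the center and radius.
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    xc, yc = -a / 2.0, -b / 2.0
    r = np.sqrt(xc**2 + yc**2 - c)
    return xc, yc, r

# Illustrative data: 90 noisy edge points (cf. Fig. 5.9) on a circle of
# radius 0.35 mm, matching the Dt = 0.7 mm standard circle of Fig. 5.4.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 90)
x = 1.0 + 0.35 * np.cos(t) + rng.normal(0.0, 1e-3, t.size)
y = 2.0 + 0.35 * np.sin(t) + rng.normal(0.0, 1e-3, t.size)
print(fit_circle_least_squares(x, y))  # approximately (1.0, 2.0, 0.35)

For the structured-light case study, the textbook triangulation relation (a generic form; the thesis calibrates its own geometry, Fig. 7.4) maps a stripe's lateral image displacement δ to surface height h as h = δ / (M tan θ), where M is the optical magnification and θ is the angle between the laser sheet and the viewing direction.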
Contents
Abstract I
中文摘要 II
Acknowledgements III
List of Tables XIV
Chapter 1 Introduction 1
1.1 Background 1
1.2 Related research 2
1.3 Motivation and purpose of the study 3
1.4 Contributions 4
1.5 Organization 5
Chapter 2 Machine Vision 6
2.1 A typical machine vision system 6
2.2 Illumination 7
2.2.1 Lighting technology 8
2.3 Image sensors 9
2.3.1 Point scanning 10
2.3.2 Line scanning 11
2.3.3 Area scanning 12
Chapter 3 Automated Visual Inspection Fundamentals 14
3.1 Four types of inspection in industrial vision 15
3.1.1 Inspection of dimensional quality 17
3.1.2 Inspection of surface quality 18
3.1.3 Inspection of structural quality 19
3.1.4 Inspection of accurate operational quality 20
3.2 Feature extraction 20
3.3 Edge detection 21
3.4 Two-dimensional edge detection 24
3.5 Line detection 25
3.5.1 Mean square distance 25
3.5.2 Least-squares method 26
3.6 Circle detection 26
Chapter 4 Fuzzy and Neural Network 30
4.1 Fuzzy logic fundamentals 30
4.1.1 Membership function 31
4.1.2 Fuzzy inference models 33
4.1.3 Sugeno fuzzy models (TSK fuzzy model) 34
4.1.4 Tsukamoto fuzzy models 36
4.2 Artificial neural networks (ANNs) 38
4.2.1 Backpropagation neural network 40
4.2.2 Error backpropagation 41
4.2.3 Hybrid learning rule: combining steepest descent and LSE 42
4.2.4 Training pattern modeling 44
Chapter 5 Case Study 1 ─ Measuring Wear of the Grinding Wheel Using Machine Vision 46
5.1 Introduction 46
5.2 Principle of measurement 48
5.2.1 Experimental Set-up 49
5.2.2 Illumination-Backlighting 49
5.2.3 Calibration 50
5.3 Experimental procedures 52
5.4 Results and discussion 59
5.5 Conclusions 63
Chapter 6 Case Study 2 ─ An Automated Flank Wear Measurement of Microdrills Using Machine Vision 65
6.1 Introduction 65
6.2 Experimental set-up 67
6.3 Measuring method 69
6.3.1 Measuring procedures 70
6.3.2 Measuring the flank wear area 72
6.3.3 Measuring the average wear height VBave 73
6.3.4 Measuring maximum wear height VBmax 74
6.4 Measurement results and discussion 74
6.5 Conclusions 80
Chapter 7 Case Study 3 ─ Application of the Structured Illumination Method for Automated Optical Inspection of the Loudspeaker Cones 81
7.1 Introduction 81
7.2 Set-up of an automated visual inspection system 84
7.3 Measurement process 87
7.3.1 Image acquisition 88
7.3.2 Triangulation-based measurement and calibration 88
7.3.3 Stripe identification and feature extraction using edge detection 90
7.3.4 Measuring the concentricity of the cone 90
7.3.5 Measuring the height of the cone 92
7.4 Results and discussion 96
7.5 Conclusions 103
Chapter 8 Case Study 4 ─ Automated Visual Inspection for Surface Appearance Defects of Varistors Using an Adaptive Neuro-Fuzzy Inference System 104
8.1 Introduction 104
8.2 Adaptive Neuro-Fuzzy Inference System (ANFIS) 106
8.2.1 ANFIS Architecture 107
8.2.2 ANFIS learning algorithm 109
8.3 Automated visual inspection system 110
8.3.1 Structure 111
8.3.2 Feature extraction method 112
8.3.3 Image Mask 115
8.3.4 Unwrap 116
8.3.5 Feature types 117
8.4 Modeling using ANFIS 120
8.4.1 Establish a Sugeno fuzzy model 120
8.4.2 Training the ANFIS 121
8.4.3 Assignment using the minimum orthogonal distance method 122
8.5 Results and discussion 122
8.6 Conclusion 127
Chapter 9 Conclusions and Future Work 128
References 130
Appendix B 141
B.1 Training mode 141
B.2 Inspection mode 142
Appendix C 143
C.1 Mechanism 143
C.2 Image acquisition module 146
C.3 Controller 146
C.4 Hierarchical control 147
C.5 Communication 149
C.6 Light meter 150
About author 151


List of Figures
Fig. 2.1 A typical machine vision system. 7
Fig. 2.2 Variability in appearance due to differences in illumination. 8
Fig. 2.3 Various illumination technologies (a) directed lighting (b) back lighting (c) vertical lighting (d) structured lighting. 9
Fig. 2.4 Point Scanner 10
Fig. 2.5 Image acquisition using a linear sensor strip 12
Fig. 2.6 Area Scanning 13
Fig. 3.1 One-dimensional and two-dimensional edge detection. 24
Fig. 3.2 Determining the search region of the circular detector using six control parameters. 27
Fig. 4.1 Fuzzy set with three bell-shaped membership functions 33
Fig. 4.2 A two-input first-order Sugeno fuzzy model 35
Fig. 4.3 A first-order Sugeno fuzzy model with two nonlinear inputs and one linear output. 36
Fig. 4.4 The Tsukamoto fuzzy model. 37
Fig. 4.5 Single-input/output Tsukamoto fuzzy model (a) antecedent MFs; (b) consequent MFs; (c) each rule's output curve; (d) overall input-output curve. 38
Fig. 4.6 Feedforward neural network. 40
Fig. 4.7 Activation functions for backpropagation MLPs: (a) logistic function; (b) hyperbolic function; (c) identity function. 41
Fig. 5.1 A specimen is ground to yield a gap with the contour of the grinding wheel. 48
Fig. 5.2 Set-up of a measuring system 49
Fig. 5.3 System set-up based on back lighting 50
Fig. 5.4 Standard circle with Dt = 0.7 mm to be calibrated using the least-squares method. 52
Fig. 5.5 The centers of the two arcs must align on the X-axis. 52
Fig. 5.6 The specimen image is captured using the back lighting method. (a) The angle θ0 between the straight edge across the two shoulders and the horizontal axis is 85.2°. (b) The image is rotated by 4.8° so that the straight edge of the specimen becomes perpendicular to the horizontal axis. 54
Fig. 5.8 Calculate as the starting point for measurement. 56
Fig. 5.9 (a) 90 points obtained by circular edge detection in the upper quadrant, (b) 90 points obtained by circular edge detection in the lower quadrant. 59
Fig. 5.10 Repeatability of the measurement: (a) Radius R1, (b) Radius R2, (c) Radius R3, (d) Radius R4, (e) Radius R5, (f) Radius R6, (g) Radius R7, (h) Radius R8 62
Fig. 6.1 Schematic diagram of a microdrill 66
Fig. 6.2 Experimental set-up for measuring flank wear using the toolmaker's microscope. 69
Fig. 6.3 The images of the cutting plane with flank wear. (a) The image rotated to horizontal. (b) The search region is confined by the red rectangle to avoid noise effects in the measurement results. 71
Fig. 6.4 The contour of the cutting plane: (a) original contour of the cutting plane, (b) worn contour of the cutting plane, (c) original image with edge-detection search lines, (d) image of flank wear with edge-detection search lines. 72
Fig. 6.5 Flank wear measurement results. 77
Fig. 6.6 The height of the cutting plane changes along the X-axis in the hole-drilling test. 78
Fig. 6.7 The curves of the flank wear area. 79
Fig. 6.8 The curves of the average wear height. 79
Fig. 6.9 The curves of the maximum wear height. 80
Fig. 7.1 The appearance of the loudspeaker cone: (a) the cone consists of two layers (inner and outer) of elastic material, (b) standard sample, (c) the inner and outer layers are bonded with an angular deviation. 82
Fig. 7.2 A non-contact 3D measurement system for inspecting the loudspeaker cone. 85
Fig. 7.3 Structure of the measurement system 87
Fig. 7.4 Triangulation geometry and calibration 89
Fig. 7.5 Measuring the concentricity using three images 92
Fig. 7.6 Measuring the height using a laser stripe: (a) the reference line is plotted in red, the search area is restricted to the green frame, and edge points are plotted with red marks; (b) the tracking method uses two edge-detection lines to detect the change in grey intensity. 95
Fig. 7.7 The profile of the loudspeaker cone 96
Fig. 7.8 A single image is used to compute the centers of the internal and external circles, estimating the concentricity by fitting circles with the least-squares method. 100
Fig. 7.9 Measurement results using triangulation with a laser stripe to compute the height of the loudspeaker cone at different positions from different slices of the range image: (a) height measured on the right-hand region, (c) on the middle region, and (e) on the left region; (b), (d), and (f) show the corresponding shape-profile curves plotted along the Y-axis. 101
Fig. 7.10 Repeatability of the height measurements 102
Fig. 8.1 Six types of the ring varistors 106
Fig. 8.2 ANFIS architecture for a two-input, two-rule Sugeno FIS (a square represents an adaptive node and a circle denotes a fixed node). 107
Fig. 8.3 Flow chart of the defect classification. 111
Fig. 8.4 Fuzzy rule architecture of ANFIS model with feature extraction. 112
Fig. 8.5 Feature extraction: (a) image mask, (b) unwrapped image, (c) two-dimensional edge detection, (d) summing the number of edge points (red points) in a rectangular region of width d. (Red points mark edges detected using two-dimensional edge detection.) 113
Fig. 8.7 Histograms of discrete feature values: (c) front qualified pattern, (d) broken pattern. 115
Fig. 8.8 Histograms of discrete feature values: (e) back qualified pattern, (f) cracked pattern. 115
Fig. 8.9 (a) Circular edge detection, (b) image mask, (c) the center of the image-mask circle aligned with the center of the circular detector, (d) extracted masked region. 116
Fig. 8.10 Discrete feature values are defined in a histogram. 118
Fig. 8.11 ANFIS architecture for a four-input Sugeno model with six rules. 120
Fig. 8.12 Adaptation of step sizes from an initial value of 0.11 (rightmost) to a final value of 5.73. 123
Fig. 8.13 RMSE curves for ANFIS 124
Fig. 8.14 (a), (b) MFs before learning; (c), (d) MFs after learning for input 1 and input 2. 125
Fig. 8.15 (a), (b) MFs before learning; (c), (d) MFs after learning for input 3 and input 4. 125
Fig. C.1 Relationship chart of the training mode and inspection mode 143
Fig. B-1 The structure of the varistor inspection system 145
Fig. B-3 Schematic diagram of the hierarchical control in the automated visual inspection 149


List of Tables
Table 3.1 The features of inspected products 16
Table 5.1 Results of the error tests for the various factor cases 63
Table 6.1 The relationship between the height of the cutting plane and the number of hits. 75
Table 7.1 Measurement results using multiple images with local features to test the repeatability of the concentricity. 97
Table 7.2 Measurement results using a single image to test the repeatability of the concentricity 99