
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Author: 方恩霖
Author (English): Edreen Bryan Togonon Valdeavilla
Title: 使用局部二元圖樣和支持向量機進行膠囊內視鏡影像分類
Title (English): Classification of Capsule Endoscope Images Using Local Binary Patterns and Support Vector Machines
Advisor: 繆紹綱
Advisor (English): SHAOU-GANG MIAOU
Degree: Master's
Institution: Chung Yuan Christian University (中原大學)
Department: Department of Electronic Engineering (電子工程研究所)
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year of publication: 2010
Graduation academic year: 98 (ROC calendar)
Language: English
Number of pages: 142
Keywords (Chinese): 算術平均值、交叉驗證、無線膠囊內視鏡、支持向量機、局部二進制圖樣
Keywords (English): arithmetic means, cross validation, local binary pattern, support vector machines, wireless capsule endoscopy
Usage statistics:
  • Cited by: 0
  • Views: 256
  • Downloads: 0
  • Bookmarked: 0
Wireless capsule endoscopy is an advanced technology for examining the large and small intestines. With this technology, a physician can inspect the entire intestinal tract, including the sections that a traditional endoscope cannot reach. However, one problem arising from this new technology is the huge number of images that must be reviewed, which places a burden on the physician. This thesis proposes several methods for an automatic detection system that reduces this burden by identifying capsule endoscope images showing suspected chyme blockage, bleeding, and ulcers. These methods use the standard local binary pattern (LBP) and Sobel-LBP as recognition features, and support vector machines (SVM), nu-SVM (ν-SVM), and one-class SVM (OCSVM) as classifiers. For comparison, the nearest neighbor classifier is also considered.
Experimental results from 10 runs of 5-fold cross validation (80% of the data used for training) show that the combination of a downsampling factor of 1, the standard LBP computed on non-overlapping blocks, and the OCSVM classifier outperforms the other methods. The resulting accuracy is 98.71%, and the computational time in the testing phase is below 0.5 seconds per image, which is acceptable for real-time applications.
In addition, because the data set is imbalanced, this thesis also adopts a new classifier performance measure called the arithmetic mean of accuracies (a-means). It can be applied to both balanced and imbalanced data sets, properly reflects the behavior of random guessing, and can be conveniently extended to cost-sensitive and multiple-class classification.



Wireless capsule endoscopy is a state-of-the-art technology for investigating the large and small intestines. With this technology, a physician can examine the entire intestinal tract, including the blind sections not reachable with a traditional endoscope. However, one problem arising from this new technology is the tremendous number of images that need to be inspected visually, which places a burden on the physician. This thesis presents several methods for an automatic detection system that reduces this burden by identifying capsule endoscope images with suspected abnormalities such as chyme blockage, blood (the suspected blood indicator), and ulcers. These methods use the standard local binary pattern (LBP) and Sobel-LBP as recognition features, and support vector machines (SVM), nu-SVM (ν-SVM), and one-class SVM (OCSVM) as classifiers. For comparison, the nearest neighbor classifier is also considered.
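As an illustration of the feature-extraction step only (not code from the thesis), the following minimal Python/NumPy sketch computes a standard 3 x 3 LBP image and a normalized 256-bin LBP histogram that could serve as the per-image feature vector; the grayscale input, bin count, and neighbor ordering are illustrative assumptions.

    import numpy as np

    def lbp_image(gray):
        # Standard 3 x 3 LBP: compare each pixel's 8 neighbors with the center
        # pixel and sum the binomial weights (1, 2, 4, ..., 128) of the
        # neighbors whose value is greater than or equal to the center.
        g = gray.astype(np.int32)
        center = g[1:-1, 1:-1]
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        lbp = np.zeros_like(center)
        for k, (dy, dx) in enumerate(offsets):
            rows = slice(1 + dy, g.shape[0] - 1 + dy)
            cols = slice(1 + dx, g.shape[1] - 1 + dx)
            lbp += (g[rows, cols] >= center).astype(np.int32) * (1 << k)
        return lbp

    def lbp_histogram(gray, bins=256):
        # Normalized LBP histogram used as the feature vector for one image.
        hist, _ = np.histogram(lbp_image(gray), bins=bins, range=(0, 256))
        return hist / hist.sum()

In the same spirit, Sobel-LBP would apply the LBP operator to the horizontal and vertical Sobel gradient images and concatenate the two histograms (hence the 512-bin Sobel-LBP histograms listed in the figures below), and the resulting feature vectors would then be fed to an SVM-type classifier.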
Experimental results from 10 runs of 5-fold cross validation (80% of the data used for training) show that the combination of a downsampling factor of 1, the standard LBP computed on non-overlapping blocks, and the OCSVM classifier outperforms the other methods. The resulting accuracy is 98.71%, and the computational time in the testing phase is below 0.5 seconds per image, which is acceptable for real-time applications.
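To make the evaluation protocol concrete, here is a minimal sketch of repeated stratified 5-fold cross validation with a one-class SVM trained on normal images only. It assumes scikit-learn (whose SVM routines wrap LIBSVM) as a stand-in for the thesis's actual implementation, and the nu and gamma values are placeholders rather than the tuned parameters reported in the thesis.

    import numpy as np
    from sklearn.model_selection import StratifiedKFold
    from sklearn.svm import OneClassSVM

    def evaluate_ocsvm(features, labels, runs=10, folds=5, nu=0.1, gamma="scale"):
        # features: (n_images, n_bins) LBP histograms; labels: +1 normal, -1 abnormal.
        accuracies = []
        for run in range(runs):
            skf = StratifiedKFold(n_splits=folds, shuffle=True, random_state=run)
            for train_idx, test_idx in skf.split(features, labels):
                # The one-class SVM is fitted on the normal class only; abnormal
                # images appear only at test time and should be rejected as outliers.
                train_normal = features[train_idx][labels[train_idx] == 1]
                clf = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(train_normal)
                pred = clf.predict(features[test_idx])  # +1 = inlier, -1 = outlier
                accuracies.append(np.mean(pred == labels[test_idx]))
        return float(np.mean(accuracies))

With 5 folds, each training split holds 80% of the images, which matches the "80% of the data used for training" setting above.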
In addition, because the data set is imbalanced, this thesis also applies a new classifier performance measure called the arithmetic mean of accuracies (a-means). It can be used for both balanced and imbalanced data sets, properly reflects the behavior of random guessing, and extends easily to cost-sensitive and multiple-class classification.
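For the binary case, a-means reduces to a one-line formula; the sketch below uses standard confusion-matrix counts and is only meant to make the definition explicit, not to reproduce the thesis's code.

    def a_means(tp, fn, tn, fp):
        # Arithmetic mean of the per-class accuracies: sensitivity (accuracy on
        # the positive, e.g. abnormal, class) and specificity (accuracy on the
        # negative, e.g. normal, class).
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return 0.5 * (sensitivity + specificity)

Because a classifier that guesses "positive" with probability p has sensitivity p and specificity 1 - p, a-means stays at 0.5 for any random guesser regardless of class imbalance, and the definition generalizes to multiple classes as the mean of the per-class accuracies.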



Contents

ABSTRACT (IN CHINESE)……………………………………….……………………...........................I
ABSTRACT……………………………………………………….……………………………...II
CONTENTS…………………………………………………….…………..………….III
LIST OF FIGURES………………………………………………….……..………….VI
LIST OF TABLES…………………………………………………………...………...IX
LIST OF SYMBOLS……………………………………………………..........….…XIV
ACKNOWLEDGMENT……………………………………………………….....XVIII
CHAPTER I. INTRODUCTION…………………………………………….…...........1
1.1 Background……………………………………………………………………...........1
1.2 Objective………………………………………………………………………...........4
1.3 Organization of the Thesis……………………………………………………………4
CHAPTER II. LITERATURE REVIEW………………………..……………….……5
2.1 Wireless Capsule Endoscopy…………………………………………………….…...5
2.2 Local Binary Patterns (LBP)…………………………………….……………………5
2.2.1 Standard LBP………………………………………………………..………………6
2.2.2 Sobel-LBP………………………………………………………………….……….7
2.3 k-Nearest Neighbor Classifier………………………………………………….……..8
2.4 Support Vector Machines (SVM)…………………………………….……………….9
2.4.1 Standard Support Vector Machine (SVM)………………………………..…………9
2.4.2 ν-Support Vector Machine (ν-SVM)…………………………………………..…...11
2.4.3 One Class Support Vector Machine (OCSVM)…...………………………….……12
2.5 Radial Basis Function (RBF)…………..…………………………………………….14
2.6 Cross Validation…………………………………….…………………………..……14
CHAPTER III. IMPLEMENTATION………………………………………….…….16
3.1 Downsampling and Region of Interest……………………………...……………….16
3.2 Local Binary Patterns (LBP)……………………………………….……..…………18
3.3 Nearest Neighbor Classifier………………………..……………..………………….29
3.4 Support Vector Machines (SVM)……………………………..….………………….30
CHAPTER IV. MEASUREMENT OF CLASSIFICATION PERFORMANCE..….34
4.1 Background…………………………………………..…….…………………….…..34
4.2 Accuracy……………………………………………..…….……………...........…....35
4.3 Kappa Value…………………………………………...……...……………………...36
4.4 F-measure……………………………………………….…….……….…………….37
4.5 Receiver Operating Characteristic (ROC) Space……….….….…………………….39
4.5.1 Interpretation of Classification Performance in ROC Space…..……...…………...39
4.5.2 Measurement of Classification Performance in ROC Space…..…………..............40
4.5.3 Further Discussion on ROC-based Measurements……………..……………….…45
4.6 Geometric Means (G-Means) of Accuracies……………………….....……………..46
4.7 Arithmetic Means (A-Means) of Accuracies……………………..……………….…46
4.7.1 Concept………………………………………………………….….……………...46
4.7.2 Properties………………………………………………………...….……………..48
4.7.3 Multiple-Class Classification with A-Means……………………….….…………..51
4.8 Numerical Examples…….……………………………………………...……………51
CHAPTER V. EXPERIMENTAL SETUP AND RESULTS……………….……..….53
5.1 Experimental Setup……………………………………..…………………………....53
5.2 Experimental Results and Discussion…………………………………………..……55
5.2.1 Classification Performance……………………………………………………...…55
5.2.2 Computational Time………………………………………………………..……...65
5.2.3 Discussion…………………………………………………………………….…....66
CHAPTER VI. CONCLUSIONS AND FUTURE WORKS………………................68
6.1 Conclusions………………………………………………………………..…….......68
6.2 Future Works…………………………………………………………………...........68
APPENDIX A. SUPPORT VECTOR MACHINES FORMULATION………….….69
A.1 Support Vector Machines (SVM)……………………………………………….…..69
A.1.1 Optimal Hyperplane for Linearly Separable Patterns………………………….….69
A.1.2 Optimal Hyperplane for Nonseparable Patterns……………………………...…...72
A.1.3 Optimal Hyperplane in the High-Dimensional Feature Space……………..……..76
A.2 ν-Support Vector Machine (ν-SVM)…………………………………………...……80
A.3 One Class Support Vector Machine (OCSVM)…………………………………......85
APPENDIX B. MATHEMATICAL PRINCIPLES……………………………..……88
B.1 Kernel Space…………………………………………………………………..…….88
B.2 Optimization Theory…………………………………………………………..…….88
APPENDIX C. NUMERICAL RESULTS………………………………………..…...91
C.1 Nearest Neighbor Classifier……………………………………………………..…..91
C.2 Standard Support Vector Machine……………………………………………..……96
C.3 ν-Support Vector Machine……………………………………………………..…..101
C.4 One Class Support Vector Machine……………….…………………………….....106
C.5 Computational Time……...………………...………...………..…………….…….112
REFERENCES………………………………………………………………….…….118
ABOUT THE AUTHOR……………………………………………………….……..122

List of Figures

Fig. 1. Parts of WCE: (1) Optical dome, (2) Lens holder, (3) Lens, (4) Illuminating LEDs, (5) CMOS imager, (6) Battery, (7) ASIC transmitter, and (8) Antenna [2].........................1
Fig. 2. Example of the capsule endoscopy images: (a) Normal, (b) Chyme blockage, (c) Suspected blood indicator (SBI), and (d) White spots or ulcers.........................................2
Fig. 3. System block diagram: (a) Training process and (b) Testing process......................3
Fig. 4. A typical capsule associated with wireless capsule endoscopy................................5
Fig. 5. An example of LBP value estimation: (a) an original 3 x 3 neighborhood, (b) the values of the pixels in the thresholded neighborhood, (c) binomial weights assigned to the corresponding pixels, and (d) the values of eight neighbor pixels are summed to obtain a single value for the corresponding pattern.............................................................6
Fig. 6. Example of k-nearest neighbor classifier [20]..........................................................9
Fig. 7. Illustration of the concept of SVM...........................................................................9
Fig. 8. Geometry interpretation of OCSVM [25]..............................................................13
Fig. 9. Mapping onto higher dimension feature space......................................................14
Fig. 10. Procedure of 3-fold cross-validation [30]............................................................15
Fig. 11. Image downsampling with a downsampling factor of (a) 1, (b) 2, and (c) 4. Here, only pixels denoted as black dots will be processed in the next step................................16
Fig. 12. Image downsampling results for Fig. 2: (a) downsampling by 1 (256 x 256), (b) downsampling by 2 (128 x 128), and (c) downsampling by 4 (64 x 64)...........................17
Fig. 13. Process of ROI extraction: (a) original image, (b) binarized image with a circular ROI, and (c) image after ROI extraction...........................................................................17
Fig. 14. ROI extraction flowchart......................................................................................18
Fig. 15. LBP of the 4 endoscopy images (normal, chyme blockage, suspected blood indicator, and white spots) with size 256 x 256: (a) original image after ROI extraction and luminous intensity of the image, (b) the corresponding LBP image, and (c) the binarized image.................................................................................................................19
Fig. 16. Steps of computing LBP values using overlapping blocks..................................20
Fig. 17. Steps of making LBP histogram using overlapping blocks..................................20
Fig. 18. Flowchart of generating LBP histogram using overlapping blocks.....................21
Fig. 19. Steps of computing LBP values using non-overlapping blocks...........................21
Fig. 20. Steps of making LBP histogram using non-overlapping blocks..........................22
Fig. 21. Flowchart of generating LBP histogram using non-overlapping blocks..............22
Fig. 22. Examples of LBP histogram using overlapping blocks feature (LBP histogram with 256 bins and downsampling factor of 1)...................................................................23
Fig. 23. Examples of LBP histogram using non-overlapping blocks feature (LBP histogram with 256 bins and downsampling factor of 1)..................................................23
Fig. 24. Steps of computing Sobel-LBP values using overlapping blocks........................24
Fig. 25. Steps of making Sobel-LBP histogram using overlapping blocks.......................25
Fig. 26. Flowchart of generating Sobel-LBP using overlapping blocks............................26
Fig. 27. Steps of computing Sobel-LBP values using non-overlapping blocks................26
Fig. 28. Steps of making Sobel-LBP using non-overlapping blocks.................................27
Fig. 29. Flowchart of generating Sobel-LBP using non-overlapping blocks....................28
Fig. 30. Examples of Sobel-LBP histogram using overlapping blocks feature (Sobel-LBP histogram with 512 bins and downsampling factor of 1)..................................................29
Fig. 31. Examples of Sobel-LBP histogram using non-overlapping blocks feature (Sobel-LBP histogram with 512 bins and downsampling factor of 1)..............................29
Fig. 32. Flowchart of generating nearest neighbor classifier testing phase.......................30
Fig. 33. Flowchart of generating standard SVM training phase........................................31
Fig. 34. Flowchart of generating ν-SVM training phase...................................................32
Fig. 35. Flowchart of generating OCSVM training phase.................................................33
Fig. 36. Flowchart of generating SVM and its variants testing phase...............................33
Fig. 37. ROC space [8]......................................................................................................39
Fig. 38. The ROC space and plots of the five prediction examples [32]...........................40
Fig. 39. Equal distance lines and the random guessing line in ROC space [8].................41
Fig. 40. The ROC Convex Hull [39].................................................................................42
Fig. 41. The ROC Convex Hull with iso-performance lines, α and β [39].......................43
Fig. 42. Three ROC curves representing excellent, good, and worthless tests.................44
Fig. 43. AUC: (a) Points from two classifiers and (b) AUC calculation of the given points [8].......................................................................................................................................45
Fig. 44. Same g-means value curves in ROC space [8]....................................................46
Fig. 45. The equal performance lines in a-means [8]........................................................47
Fig. 46. AUC of one point [8]............................................................................................49
Fig. 47. AUC of the midpoint [8]......................................................................................50
Fig. 48. Recognition performance obtained with different values of ν using ν-SVM and OCSVM classifiers............................................................................................................61
Fig. 49. Recognition performance obtained with different dataset ratios using OCSVM classifier.............................................................................................................................62
Fig. 50. The most misclassified images identified: (a) Normal image and (b) and (c) Abnormal images...............................................................................................................64
Fig. 51. An illustration of optimal hyperplane and support vectors..................................70
Fig. 52. Examples of soft margin classification: (a) Data point xi falls inside the region of separation, but on the right side of the decision surface and (b) Data point xi falls on the wrong side of the decision surface………………….……………...…………………....73
Fig. 53. Graphical representation of slack variables.........................................................74
Fig. 54. An illustration of separating hyperplane in the feature space [8].........................76
Fig. 55. The geometric interpretation of margin and slack variables in SVM..................77
Fig. 56. Toy Problem (task: to separate circles from disks) solved using ν-SVM classification, with parameter values ranging from ν = 0.1 (top left) to 0.8 (bottom right) [13]……………………………………………………………………………...……….82
List of Tables

Table 1. Confusion Matrix……………………………………………….....……………35
Table 2. Interpretation of Different Values of κ………………………………………….36
Table 3. Interpretation of Different Values of AUC……………………………………...44
Table 4. Confusion Matrix for Multiple-Class Classification [8]..……………...……….51
Table 5. Measurement of Classification Performances of Balanced and Imbalanced Testing Data Sets [8]………………………………….………………………………….52
Table 6. Measurement of Classification Performances Correspond to Some Random Guessing Points in ROC Space [8]………………………………………………………52
Table 7. Number of Pixels in the ROI Using LBP and Sobel-LBP Features with its Corresponding Downsampling Factor Used......................................................................54
Table 8. Different Training and Testing Datasets Generated Using Cross-validation of 350 Capsule Endoscope Images……………...…………………….……………………54
Table 9. Average Variance of Normal and Abnormal Images Using LBP Features…......55
Table 10. Average Variance of Normal and Abnormal Images Using Sobel-LBP Features.............................................................................................................................55
Table 11. Recognition Performance Obtained with Different Numbers of Bin of LBP and Sobel-LBP Histograms and 10% of the Training Dataset………...……………………..56
Table 12. Recognition Performance Obtained with Different Numbers of Bin of LBP and Sobel-LBP Histograms and 12.5% of the Training Dataset……...……………………...56
Table 13. Recognition Performance Obtained with Different Numbers of Bin of LBP and Sobel-LBP Histograms and 20% of the Training Dataset……...………………………..57
Table 14. Recognition Performance Obtained with Different Numbers of Bin of LBP and Sobel-LBP Histograms and 50% of the Training Dataset…...…………………………..58
Table 15. Recognition Performance Obtained with Different Numbers of Bin of LBP and Sobel-LBP Histograms and 80% of the Training Dataset………...……………………..58
Table 16. Recognition Performance, Average Number of Support Vectors, and Margin Size Obtained with Different Values of ν Using ν-SVM Classifier.…………………......61
Table 17. Recognition Performance, Average Number of Support Vectors, and Margin Size Obtained with Different Values of ν Using OCSVM Classifier...………………….61
Table 18. Corresponding Number of Normal and Abnormal Images with Different Dataset Ratios………………………………………………………..…………………..62
Table 19. Recognition Performance Obtained with Different Dataset Ratios Using OCSVM Classifier……………………...………………………………………………..62
Table 20. Comparison of Different Evaluation Criteria Between the Previous Approach and the Proposed Approach……………………………………...………………………63
Table 21. Computer Specification………………………………………...……………..65
Table 22. Average Computational Time per Image in Seconds (LBP Histogram)…...….66
Table 23. Average Computational Time per Image in Seconds (Sobel-LBP Histogram)..66
Table 24. Interpretation of the Relationships Between Slack Variables and the Point Location as Example…...………………………………………………………………..77
Table 25. Summary of Inner-Product Kernels [21]………………………...……………79
Table 26. Fraction of Errors and SVs, Along with the Margins of Class Separation, for the Toy Example in Fig. 56 [13]……………………………………………………………..82
Table 27. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Overlapping Blocks Computation and 35 Training Images………………………91
Table 28. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Non-overlapping Blocks Computation and 35 Training Images…...…………….92
Table 29. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Overlapping Blocks Computation and 44 Training Images………………………92
Table 30. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Non-overlapping Blocks Computation and 44 Training Images……...………….93
Table 31. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Overlapping Blocks Computation and 70 Training Images……………………....93
Table 32. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Non-overlapping Blocks Computation and 70 Training Images…………...…….94
Table 33. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Overlapping Blocks Computation and 175 Training Images………………...…..94
Table 34. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Non-overlapping Blocks Computation and 175 Training Images…………......…95
Table 35. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Overlapping Blocks Computation and 280 Training Images……………………..95
Table 36. Recognition Performance Obtained from Nearest Neighbor Classifier Results Using Non-overlapping Blocks Computation and 280 Training Images…………...…...96
Table 37. Recognition Performance Obtained from Standard Support Vector Machines Results Using Overlapping Blocks Computation and 35 Training Images…..….………96
Table 38. Recognition Performance Obtained from Standard Support Vector Machines Results Using Non-overlapping Blocks Computation and 35 Training Images………....97
Table 39. Recognition Performance Obtained from Standard Support Vector Machines Results Using Overlapping Blocks Computation and 44 Training Images……...………97
Table 40. Recognition Performance Obtained from Standard Support Vector Machines Results Using Non-overlapping Blocks Computation and 44 Training Images…………98
Table 41. Recognition Performance Obtained from Standard Support Vector Machines Results Using Overlapping Blocks Computation and 70 Training Images……...………98
Table 42. Recognition Performance Obtained from Standard Support Vector Machines Results Using Non-overlapping Blocks Computation and 70 Training Images…………99
Table 43. Recognition Performance Obtained from Standard Support Vector Machines Results Using Overlapping Blocks Computation and 175 Training Images…...………..99
Table 44. Recognition Performance Obtained from Standard Support Vector Machines Results Using Non-overlapping Blocks Computation and 175 Training Images………100
Table 45. Recognition Performance Obtained from Standard Support Vector Machines Results Using Overlapping Blocks Computation and 280 Training Images………...…100
Table 46. Recognition Performance Obtained from Standard Support Vector Machines Results Using Non-overlapping Blocks Computation and 280 Training Images………101
Table 47. Recognition Performance Obtained from ν-Support Vector Machines Results Using Overlapping Blocks Computation and 35 Training Images ………………….....101
Table 48. Recognition Performance Obtained from ν-Support Vector Machines Results Using Non-overlapping Blocks Computation and 35 Training Images ...……...……...102
Table 49. Recognition Performance Obtained from ν-Support Vector Machines Results Using Overlapping Blocks Computation and 44 Training Images ……………...……..102
Table 50. Recognition Performance Obtained from ν-Support Vector Machines Results Using Non-overlapping Blocks Computation and 44 Training Images ………...……..103
Table 51. Recognition Performance Obtained from ν-Support Vector Machines Results Using Overlapping Blocks Computation and 70 Training Images ………………….....103
Table 52. Recognition Performance Obtained from ν-Support Vector Machines Results Using Non-overlapping Blocks Computation and 70 Training Images …………….....104
Table 53. Recognition Performance Obtained from ν-Support Vector Machines Results Using Overlapping Blocks Computation and 175 Training Images ………………...…104
Table 54. Recognition Performance Obtained from ν-Support Vector Machines Results Using Non-overlapping Blocks Computation and 175 Training Images ………….…..105
Table 55. Recognition Performance Obtained from ν-Support Vector Machines Results Using Overlapping Blocks Computation and 280 Training Images …………...………105
Table 56. Recognition Performance Obtained from ν-Support Vector Machines Results Using Non-overlapping Blocks Computation and 280 Training Images ……..……….106
Table 57. Recognition Performance Obtained from ν-Support Vector Machines Results Using Different Values of ν, Non-overlapping Blocks Computation, and 280 Training Images ………………………………………………………………...………………..106
Table 58. Recognition Performance Obtained from One Class Support Vector Machines Results Using Overlapping Blocks Computation and 35 Training Images……...……..106
Table 59. Recognition Performance Obtained from One Class Support Vector Machines Results Using Non-overlapping Blocks Computation and 35 Training Images………..107
Table 60. Recognition Performance Obtained from One Class Support Vector Machines Results Using Overlapping Blocks Computation and 44 Training Images……...……..107
Table 61. Recognition Performance Obtained from One Class Support Vector Machines Results Using Non-overlapping Blocks Computation and 44 Training Images………..108
Table 62. Recognition Performance Obtained from One Class Support Vector Machines Results Using Overlapping Blocks Computation and 70 Training Images…...………..108
Table 63. Recognition Performance Obtained from One Class Support Vector Machines Results Using Non-overlapping Blocks Computation and 70 Training Images………..109
Table 64. Recognition Performance Obtained from One Class Support Vector Machines Results Using Overlapping Blocks Computation and 175 Training Images……...……109
Table 65. Recognition Performance Obtained from One Class Support Vector Machines Results Using Non-overlapping Blocks Computation and 175 Training Images...…….110
Table 66. Recognition Performance Obtained from One Class Support Vector Machines Results Using Overlapping Blocks Computation and 280 Training Images……...…....110
Table 67. Recognition Performance Obtained from One Class Support Vector Machines Results Using Non-overlapping Blocks Computation and 280 Training Images…...….111
Table 68. Recognition Performance Obtained from One Class Support Vector Machines Results Using Different Values of ν, Non-overlapping Blocks Computation, and 280 Training Images…...……………………………………………………...…………….111
Table 69. Recognition Performance Obtained from One Class Support Vector Machines Results Using Different Dataset Ratios, Non-overlapping Blocks Computation, and 280 Training Images………………...…………………………………………………...….111
Table 70. Average Computational Time per Image in Seconds Using Overlapping Blocks Computation……………………...…………………………………………………….112
Table 71. Average Computational Time per Image in Seconds Using Non-overlapping Blocks Computation………………………..……………………………………….…114

References

1. D. G. Adler and C. J. Gostout, “Wireless capsule endoscopy,” Hospital Physician, pp. 14-22, May 2003.
2. J. P. Cunha, M. Coimbra, P. Campos, and J. M. Soares, “Automated topographic segmentation and transit time estimation in endoscopic capsule exams,” IEEE Trans. on Medical Imaging, vol. 27, no. 1, pp. 19-27, Jan. 2008.
3. M. T. Coimbra and J. P. S. Cunha, “MPEG-7 visual descriptor – contributions for automated feature extractor in capsule endoscopy,” IEEE Trans. on Circuits and Systems for Video Technology, vol. 16, no. 5, pp. 628-637, May 2006.
4. S. G. Miaou, F. L. Chang, I. K. Timotius, H. C. Huang, J. L. Su, R. S. Liao, and T. Y. Lin, “A multi-stage recognition system to detect different types of abnormality in capsule endoscope images,” J. of Medical and Biological Engineering, vol. 29, no. 3, pp. 114-121, 2009.
5. T. Ojala, M. Pietikäinen, and D. Harwood, “A comparative study of texture measures with classification based on feature distributions,” Pattern Recognition, vol. 29, no. 1, pp. 51-59, 1996.
6. S. Zhao, Y. Gao, and B. Zhang, “Sobel-LBP,” in Proc. of the IEEE Int. Conf. on Image Processing, pp. 2144-2147, 2008.
7. L. A. Alexandre, N. Nobre, and J. Casteleiro, “Color and position versus texture features for endoscopic polyp detection,” in Proc. of the 2008 Int. Conf. on Biomedical Engineering and Informatics, pp. 38-42, May 2008.
8. I. K. Timotius, “Abnormality detection for capsule endoscope images based on support vector machines,” Master's thesis, Dept. of Electronic Eng., Chung Yuan Christian Univ., Taiwan, Jan. 2009.
9. X. Tan and B. Triggs, “Enhanced local texture feature sets for face recognition under difficult lighting conditions,” in Proc. of the Int. Workshop on Analysis and Modeling of Faces and Gestures, pp. 168-182, 2007.
10. M. A. Savelonas, D. K. Iakovidis, and D. E. Maroulis, “An LBP-based active contour algorithm for unsupervised texture segmentation,” in Proc. of the 18th Int. Conf. on Pattern Recognition, vol. 2, pp. 279-282, 2006.
11. V. Vapnik, Statistical Learning Theory, Springer, Berlin, Heidelberg, New York, 1998.
12. I. K. Timotius, S. G. Miaou, and Y. H. Liu, “Abnormality detection for capsule endoscope images based on color histogram and support vector machines,” in Proc. of the 21st Conference on Computer Vision, Graphics, and Image Processing, Yilan, Taiwan, 2008.
13. P. H. Chen, C. J. Lin, and B. Schölkopf, “A tutorial on ν-support vector machines,” Applied Stochastic Models in Business and Industry, vol. 21, pp. 111-136, 2005.
14. A. Takeda and M. Sugiyama, “ν-support vector machine as conditional value-at-risk minimization,” in Proc. of the 25th Int. Conf. on Machine Learning, Helsinki, Finland, 2008.
15. L. Zhuang and H. Dai, “Parameter optimization of kernel-based one-class classifier on imbalance learning,” J. of Computers, vol. 1, no. 7, pp. 32-40, Oct./Nov. 2006.
16. L. M. Manevitz and M. Yousef, “One-class SVMs for document classification,” J. of Machine Learning Research, vol. 2, pp. 139-154, 2001.
17. S. K. Strak, “Video capsule endoscopy,” Basrah J. of Surgery, March 2006.
18. http://en.wikipedia.org/wiki/Sobel_operator.
19. R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, Wiley, New York, 2001.
20. http://en.wikipedia.org/wiki/K-nearest_neighbor_algorithm.
21. S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice-Hall, New Jersey, 1999.
22. N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, Cambridge Univ. Press, UK, 2000.
23. R. Fletcher, Practical Methods of Optimization, Wiley, New York, 1987.
24. C. C. Chang and C. J. Lin, LIBSVM: A Library for Support Vector Machines, 2001.
25. C. H. Lin, J. C. Liu, and C. H. Ho, “Anomaly detection using LibSVM training tools,” in Proc. of the Int. Conf. on Information Security and Assurance, pp. 166-171, 2008.
26. X. Song, G. Cherian, and G. Fan, “A ν-insensitive approach for compliance monitoring of the conservation reserve program,” IEEE Geoscience and Remote Sensing Letters, vol. 2, no. 2, pp. 99-103, April 2005.
27. K. A. Heller, K. M. Svore, A. D. Keromytis, and S. J. Stolfo, “One class support vector machines for detecting anomalous Windows registry accesses,” in Proc. of the ICDM Workshop on Data Mining for Computer Security (DMSEC), Melbourne, Florida, Nov. 2003.
28. http://en.wikipedia.org/wiki/Cross-validation_(statistics).
29. R. Kohavi, “A study of cross-validation and bootstrap for accuracy estimation and model selection,” in Proc. of the 14th Int. Joint Conf. on Artificial Intelligence, pp. 1137-1143, 1995.
30. P. Refaeilzadeh, L. Tang, and H. Liu, “Cross validation,” Encyclopedia of Database Systems, pp. 532-538, 2009.
31. T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer, New York, 2001.
32. T. Fawcett, “An introduction to ROC analysis,” Pattern Recognition Letters, vol. 27, no. 8, pp. 861-874, June 2006.
33. S. J. Yen, Y. S. Lee, C. H. Lin, and J. C. Ying, “Investigating the effect of sampling methods for imbalanced data distribution,” in Proc. of the IEEE Int. Conf. on Systems, Man, and Cybernetics, vol. 5, pp. 4163-4168, Oct. 2006.
34. Y. Sun, M. S. Kamel, and Y. Wang, “Boosting for learning multiple classes with imbalanced class distribution,” in Proc. of the Sixth Int. Conf. on Data Mining, pp. 592-602, Dec. 2006.
35. P. Kang and S. Cho, “EUS SVMs: ensemble of under-sampled SVMs for data imbalance problems,” in Proc. of the 13th Int. Conf. on Neural Information Processing, Hong Kong, 2006.
36. J. De Mast, “Agreement and Kappa-type indices,” The American Statistician, vol. 61, no. 2, pp. 1-6, May 2007.
37. http://en.wikipedia.org/wiki/Cohen%27s_kappa.
38. J. M. D. Rennie, “Derivation of F-measure,” http://people.csail.mit.edu/jrennie/writing/fmeasure.pdf.
39. F. Provost and T. Fawcett, “Analysis and visualization of classifier performance: comparison under imprecise class and cost distributions,” in Proc. of the Third Int. Conf. on Knowledge Discovery and Data Mining, pp. 43-48, 1997.
40. M. Kubat and S. Matwin, “Addressing the curse of imbalanced training sets: one-sided selection,” in Proc. of the 14th Int. Conf. on Machine Learning, pp. 179-186, 1997.
41. C. X. Ling, J. Huang, and H. Zhang, “AUC: a statistically consistent and more discriminating measure than accuracy,” in Proc. of the Int. Joint Conf. on Artificial Intelligence, 2003.
42. X. Hong, S. Chen, and C. J. Harris, “A kernel-based two-class classifier for imbalanced data sets,” IEEE Trans. on Neural Networks, vol. 18, no. 1, pp. 28-41, Jan. 2007.
43. S. Hido and H. Kashima, “Roughly balanced bagging for imbalanced data,” in Proc. of the Society for Industrial and Applied Mathematics Int. Conf. on Data Mining, pp. 143-152, 2008.
44. J. Cohen, “A coefficient of agreement for nominal scales,” Educational and Psychological Measurement, vol. 20, no. 1, pp. 37-46, 1960.
45. http://www.mlahanas.de/MOEA/Med/ROC21.htm.
46. D. M. J. Tax and R. P. W. Duin, “Support vector data description,” Machine Learning, vol. 54, no. 1, pp. 45-66, 2004.
47. A. Ben-David, “What’s wrong with hit ratio?,” IEEE Intelligent Systems, vol. 21, no. 6, pp. 68-70, Nov.-Dec. 2006.
48. Y. H. Liu and Y. T. Chen, “Face recognition using total margin-based adaptive fuzzy support vector machines,” IEEE Trans. on Neural Networks, vol. 18, no. 1, pp. 178-192, 2007.
49. J. Mercer, “Functions of positive and negative type, and their connection with the theory of integral equations,” Trans. of the London Philosophical Society (A), vol. 209, pp. 415-416, 1909.
50. R. Courant and D. Hilbert, Methods of Mathematical Physics, vol. I and II, Wiley, New York, 1970.

Electronic full text (access to the full text of this thesis is restricted to the on-campus systems and IP range of the author's university).