Graduate Student: 徐傳源
Graduate Student (English): JOSEPH SII TUONG GUAN
Thesis Title: 應用於眼球機器人之智慧型多軸追蹤控制
Thesis Title (English): Intelligent Multiaxial Tracking Control Applied To the Eye-Robot
Advisor: 陳永平
Advisor (English): Chen, Yon-Ping
Degree: Master's
Institution: National Chiao Tung University
Department: Institute of Electrical and Control Engineering
Discipline: Engineering
Field: Electrical and Information Engineering
Thesis Type: Academic thesis
Publication Year: 2010
Graduation Academic Year: 98 (2009–2010)
Language: English
Pages: 57
Keywords (Chinese): 機器人眼球追蹤、影像處理、目標物追蹤
Keywords (English): Eye-robot tracking, Image processing, Object tracking
Statistics:
  • Cited by: 0
  • Views: 338
  • Downloads: 84
  • Bookmarked: 0
Abstract (translated from Chinese): To imitate the tracking behaviour of human binocular vision, this thesis designs a binocular Eye-robot that incorporates neural networks for object tracking control. The overall tracking control strategy is divided into four parts. First, the Eye-robot captures images of the moving target with its cameras and identifies the target's position. Second, the relationship between the target's centre position and the tracking velocity is designed as two training patterns, one for the neck and one for the eyes. Third, four neural networks are designed through off-line training with the back-propagation learning algorithm to learn the position-to-velocity relationship described by the training patterns. Finally, the Eye-robot adopts the best training result as its controller to track the moving target. Experiments confirm that these intelligent tracking controllers successfully accomplish set-point tracking control and horizontal tracking control of an object.
Abstract (English): To emulate humanoid binocular vision tracking, this thesis presents a neural-network-based tracking controller design that enables the Eye-robot to trace a moving object. The tracking control strategy is partitioned into four parts. First, the Eye-robot retrieves the image of the moving object from two cameras and identifies its position. Second, two training patterns relating the tracking velocity to the object position are designed for the neck and the two eyes to learn. Third, four neural networks are constructed and trained off-line by the back-propagation algorithm to learn these training patterns. Finally, the Eye-robot adopts the well-trained neural networks as its controller to trace the moving object. The success of the Eye-robot's tracking control is demonstrated by the experimental results of the set-point control and the horizontal trajectory tracking control.
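
The four-part strategy in the abstract, training a feed-forward network off-line by back-propagation to map the object's image position to a tracking-velocity command and then using the trained network as the controller, can be illustrated with a minimal sketch. This is not the thesis implementation: the single network, its layer sizes, the tanh-shaped training pattern, the learning rate, and the helper name tracking_velocity are illustrative assumptions only.

```python
# Minimal back-propagation sketch (assumed illustration, not the thesis code).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pattern: normalised offset of the object's centre from
# the image centre, mapped to a saturating tracking-velocity command.
x = np.linspace(-1.0, 1.0, 201).reshape(-1, 1)   # object-centre offset in [-1, 1]
y = np.tanh(3.0 * x)                             # assumed desired velocity command

# One hidden layer of tanh units with a linear output (sizes are assumptions).
W1, b1 = rng.normal(0.0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0.0, 0.5, (8, 1)), np.zeros(1)
lr = 0.05

for _ in range(5000):                            # off-line training loop
    h = np.tanh(x @ W1 + b1)                     # forward pass, hidden activations
    y_hat = h @ W2 + b2                          # network output
    e = y_hat - y                                # output error
    # Back-propagation of the mean-squared error through both layers.
    dW2 = h.T @ e / len(x)
    db2 = e.mean(axis=0)
    dh = (e @ W2.T) * (1.0 - h**2)               # error propagated through tanh
    dW1 = x.T @ dh / len(x)
    db1 = dh.mean(axis=0)
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

def tracking_velocity(offset):
    """Trained network used as the controller: image offset in, velocity out."""
    h = np.tanh(np.atleast_2d(offset) @ W1 + b1)
    return (h @ W2 + b2).item()

print(tracking_velocity(0.4))   # e.g. command for an object 40% right of centre
```

In the thesis itself, four such networks (for the neck and the two eyes) and the training patterns of Chapter 4 would take the place of the single network and the synthetic pattern used here.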
Table of Contents:
Chinese Abstract i
English Abstract ii
Acknowledgement iii
Contents iv
List of Figures vi
List of Tables ix

Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Related Work 2
1.3 Thesis Organization 5
Chapter 2 Intelligent Learning Algorithm 6
2.1 Introduction to ANN 6
2.2 Back-Propagation Network 10
Chapter 3 System Description 16
3.1 Problem Statement 16
3.2 Hardware 18
3.3 Software 20
Chapter 4 Intelligent Object Tracking Controller Design 22
4.1 Object Detection 22
4.2 Tracking Control Design 27
Chapter 5 Experimental Results 36
5.1 Neural Network Off-line Training 36
5.2 Set Point Control 39
5.3 Horizontal Object Tracking Control 45
Chapter 6 Conclusions and Future Work 54
Reference 55


