臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)


Thesis Detail

Author: 陳永秩 (Yung-Chin Chen)
Title: 應用均值位移追蹤雙Kinect序列影像測量人體關節角度
Title (English): Measurement of body joint angles from two Kinect image sequences based on mean shift tracking
Advisor: 李錫堅 (Hsi-Jian Lee)
Committee members: 范國清, 林信鋒
Oral defense date: 2015-06-26
Degree: Master's
Institution: 慈濟大學 (Tzu Chi University)
Department: 醫學資訊學系碩士班 (Master's Program in Medical Informatics)
Discipline: 醫藥衛生學門 (Medicine and Health)
Field: 醫學技術及檢驗學類 (Medical Technology and Laboratory Science)
Thesis type: Academic thesis
Publication year: 2015
Graduation academic year: 103 (AY 2014-2015)
Language: English
Pages: 56
Keywords (Chinese): 關節可活動範圍 (range of joint motion), 人體關節 (human joints), 深度影像 (depth images)
Keywords (English): Kinect, Mean shift, Depth image, Range of motion
Cited by: 1
Views: 342
Downloads: 19
Bookmarked: 0
In clinical physical therapy, physicians and therapists assess a patient's joint function by its range of motion. Because professional motion capture systems are expensive and time-consuming to operate, therapists today mostly measure patients' joints with simple rulers, a method that is subjective and lacks quantitative accuracy and reliability.
In this study, color and depth images from two Kinect sensors were used to reconstruct the three-dimensional positions of body joints and to compute their range-of-motion angles. In the experimental setup, the two Kinects were placed three meters apart, facing the subject. A Gaussian background model segments the human body from the depth images, and the 3D coordinates of each pixel are reconstructed from the color and depth images. To track joint positions more accurately across the whole image sequence, the mean shift algorithm locates the joint center in every frame. Joint range-of-motion angles and motion statistics are then computed from the joint coordinates of each frame in the sequence. To validate the results, joint angles produced by a VICON motion capture system serve as the gold standard.
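The record gives no implementation details for the segmentation step, so the following is only a minimal sketch of one plausible per-pixel Gaussian background model over depth frames. The function names, the stack of background-only frames, and the threshold k are illustrative assumptions, not the thesis's code.

```python
import numpy as np

def fit_background(depth_frames):
    """Fit a per-pixel Gaussian (mean, std) over a stack of
    background-only depth frames of shape (N, H, W).
    (Hypothetical helper; the thesis does not publish its code.)"""
    stack = np.asarray(depth_frames, dtype=np.float64)
    mean = stack.mean(axis=0)
    std = stack.std(axis=0) + 1e-6  # avoid division by zero
    return mean, std

def segment_foreground(depth, mean, std, k=3.0):
    """Mark pixels whose depth deviates more than k standard
    deviations from the background model as foreground.
    k is an assumed threshold, not a value from the thesis."""
    valid = depth > 0  # Kinect reports 0 for unknown depth
    return valid & (np.abs(depth - mean) > k * std)
```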
Results from 300 trials covering six different upper-limb exercises show that the joint range-of-motion angles produced by our method deviate from the VICON measurements by 4 to 8 degrees on average, a deviation that is acceptable in clinical physical therapy.

Range of motion (ROM) is commonly used to assess a patient's joint function in physical therapy. Because motion capture systems are generally very expensive, physical therapists mostly use simple rulers to measure patients' joint angles in clinical diagnosis, a method that suffers from low accuracy, low reliability, and subjectivity.
In this study we used color and depth images from two low-cost Microsoft Kinect sensors, placed three meters apart and facing the subject, to reconstruct 3D joint positions and then compute joint angles for ROM assessment. A Gaussian background model first segments the human body from the depth images, and the 3D coordinates of the joints are reconstructed from the color and depth images together. To track the joint locations throughout the sequence more precisely, we adopt the mean shift algorithm to locate the center of the voxels around each joint. Joint angles and motion data are then computed from the joint positions frame by frame. To verify our results, we take the measurements of a VICON motion capture system as the gold standard.
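The abstract names mean shift as the tracker but gives no parameters; below is a minimal mean-shift iteration over the body's 3D points with a flat (uniform) kernel. The window radius, convergence tolerance, and iteration cap are assumptions for illustration.

```python
import numpy as np

def mean_shift(points, start, radius=0.08, tol=1e-4, max_iter=50):
    """Shift `start` toward the densest nearby region of `points`
    (an N x 3 array of body voxels, in meters) using a flat kernel
    of the given radius. Returns the converged joint-center estimate.
    All parameter values here are assumed, not from the thesis."""
    points = np.asarray(points, dtype=np.float64)
    center = np.asarray(start, dtype=np.float64)
    for _ in range(max_iter):
        dist = np.linalg.norm(points - center, axis=1)
        window = points[dist < radius]
        if len(window) == 0:
            break  # no support in the window: keep previous estimate
        new_center = window.mean(axis=0)  # flat-kernel mean shift step
        if np.linalg.norm(new_center - center) < tol:
            return new_center
        center = new_center
    return center
```

For sequence tracking, the converged center of frame t would naturally seed the search in frame t+1, which matches the frame-by-frame tracking the abstract describes.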
Across 300 test sequences covering six different upper-limb exercises, the joint angles measured by our system deviated from the VICON results by about 4 to 8 degrees, which is acceptable in a clinical environment.
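Given three reconstructed 3D joint positions, the included angle at the middle joint follows from the normalized dot product of the two limb segments; a minimal sketch (the shoulder/elbow/wrist labels are illustrative):

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at joint `b` formed by segments b->a and
    b->c, e.g., the elbow angle from shoulder (a), elbow (b), and
    wrist (c) positions in 3D."""
    u = np.asarray(a, dtype=np.float64) - np.asarray(b, dtype=np.float64)
    v = np.asarray(c, dtype=np.float64) - np.asarray(b, dtype=np.float64)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # clip guards against floating-point values just outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))
```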

Chapter 1. Introduction
  1.1. Background
    1.1.1. Shoulder
    1.1.2. Problem Definition
  1.2. Related Work
    1.2.1. Marker-based systems
    1.2.2. Markerless systems
    1.2.3. Motion Capture in Computer Vision
  1.3. Materials
    1.3.1. Kinect
    1.3.2. Motion capture system
    1.3.3. Experimental environment
  1.4. System Flow
Chapter 2. Color Filtering and Background Subtraction
  2.1. Color Filtering
    2.1.1. HSL color space
  2.2. Background Subtraction
    2.2.1. Gaussian background model
    2.2.2. Connected components
Chapter 3. Joint Position Reconstruction
  3.1. Mapping pixels to voxels
  3.2. Removing ignorable voxels
  3.3. Mean shift tracking
Chapter 4. Joint Angle Computation and Movement Analysis
  4.1. Joint position correction
  4.2. Transforming the Kinect coordinate systems
  4.3. Joint Angle Computation
  4.4. Movement Analysis
Chapter 5. Experimental Results and Discussion
  5.1. Experimental Results
  5.2. Discussion
    5.2.1. Occlusion
    5.2.2. Accuracy of joint positions
Chapter 6. Conclusion and Future Work
  6.1. Conclusion
  6.2. Future work
Chapter 7. References