
National Digital Library of Theses and Dissertations in Taiwan

Student: 陳致穎
Student (English): Chen, Zhi-Ying
Thesis Title: 以遠端控制機器手臂結合應用程式進行貨品條碼辨識與盤點
Thesis Title (English): The remote control robotic arm with mobile application to barcode identification and inventory of goods
Advisor: 陳錦泰
Advisor (English): Chen, Chin-Tai
Committee Members: 陳錦泰、康耀鴻、宋開泰
Committee Members (English): Chen, Chin-Tai; Kang, Yaw-Hong; Song, Kai-Tai
Oral Defense Date: 2017-07-24
Degree: Master
Institution: National Kaohsiung University of Applied Sciences
Department: Department of Mechanical Engineering
Discipline: Engineering
Field: Mechanical Engineering
Thesis Type: Academic thesis
Publication Year: 2017
Graduation Academic Year: 105
Language: Chinese
Number of Pages: 149
Keywords (Chinese): 機器手臂、條碼辨識、夾爪、夾持力
Keywords (English): Robotic arm; Barcode identification; Gripper; Clamping force
Today, retail outlets of every kind abound, and each store carries an ever-growing range of goods. As consumers move in and out of stores, merchandise turns over quickly, and managing goods by manpower alone faces more and more challenges. If automation concepts can be introduced into today's retail stores, the problem of staff being unable to learn in time that goods are out of stock or misplaced can be expected to be solved. In this study we therefore built a 6-axis robotic arm and combined it with the popular embedded computer, the Raspberry Pi; the arm can be operated remotely over Bluetooth from a mobile application (APP). Computer vision is then used to perform barcode identification and inventory of goods. The system designed in this study can identify the one-dimensional barcodes (EAN codes) commonly found in stores. In addition, we designed an application through which a handheld mobile device can check the status of the robotic arm and operate it. The gripper of the robotic arm has a force-sensing function so that its force can be adjusted in real time: when picking up an object, it applies a clamping force chosen according to the object's material and hardness, to avoid deforming or damaging it. The entire system runs on an embedded computer, avoiding the size, weight, and power consumption of the personal computers used in the past. Finally, movement tests of the robotic arm and clamping force tests of the gripper were carried out. The experimental results show that the average positioning errors of the robotic arm along the three axes were 2.51% in X, 0.15% in Y, and 0.28% in Z; after gripping an object, the average error increased by 0.05% in X and 0.01% in Y, while Z remained unchanged. The gripper by itself can pick up a maximum weight of 250 gw, but when mounted on the robotic arm it can grasp objects of at most 70 gw.
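For reference, the EAN codes mentioned above carry a standard check digit that can be used to reject misreads before a decoded number is accepted. The following is a minimal, generic Python sketch of the GS1 EAN-13 checksum rule; it is not code from the thesis and says nothing about how the bars themselves are located and decoded in the images.

# Generic EAN-13 check-digit validation (standard GS1 rule, not from the thesis):
# digits in odd positions count once, digits in even positions count three
# times, and the total including the check digit must be a multiple of 10.
def ean13_is_valid(code: str) -> bool:
    if len(code) != 13 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    total = sum(d if i % 2 == 0 else 3 * d for i, d in enumerate(digits))
    return total % 10 == 0

def ean13_check_digit(first12: str) -> int:
    """Compute the check digit for the first 12 digits of an EAN-13 code."""
    digits = [int(c) for c in first12]
    total = sum(d if i % 2 == 0 else 3 * d for i, d in enumerate(digits))
    return (10 - total % 10) % 10

if __name__ == "__main__":
    print(ean13_is_valid("4006381333931"))    # True: checksum is consistent
    print(ean13_check_digit("400638133393"))  # 1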
Nowadays, retail outlets of all kinds are everywhere, and every store sells more and more goods as consumers shop. With the rapid flow of goods, manual management and control of merchandise is confronted with more and more challenges. If automation can be applied in retail stores, it is expected to solve problems such as out-of-stock and misplaced goods that staff are usually unaware of. In this study, we built a 6-axis robotic arm combined with a small embedded computer, the Raspberry Pi, and operated the arm by remote control from a mobile device over Bluetooth. Computer vision was used for barcode identification and inventory of goods. The system designed in this study can identify the common one-dimensional barcodes (EAN codes) on goods in stores. In addition, we designed a mobile application (APP) through which the robotic arm can be monitored and operated from a handheld device. The gripper of the robotic arm has a force-sensing function for real-time control of its clamping force: for goods of different packaging materials and rigidity, the gripper can apply different clamping forces to avoid damaging or deforming them. Through proper mechanical design, the gripping force is applied uniformly to the sensor, which avoids inaccurate force measurement. The overall system runs on an embedded computer, avoiding the size, weight, and power consumption of a personal computer. Finally, tests of robotic arm movement and gripper clamping force were performed. The experimental results of motion show that the average errors along the X, Y, and Z axes were 2.51%, 0.15%, and 0.28%, respectively. After gripping an object, the average error increased by 0.05% along the X axis and 0.01% along the Y axis, and remained unchanged along the Z axis. The gripper by itself can pick up a maximum weight of 250 gw, while only a maximum of 70 gw can be handled when it is installed on the arm.
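The force-controlled gripping described in the abstract can be pictured as a simple closed loop: read the force sensor, compare the reading with a target force chosen for the object's material, and stop tightening once the target is reached. The sketch below only illustrates that idea in Python. The thesis uses an Arduino Nano control board and a force-sensing gripper, but the wiring, serial port name, command format, and force targets shown here are assumptions made for illustration, not the protocol actually implemented in the thesis.

# Minimal sketch of force-feedback gripping (illustrative only).
# Assumption: the gripper servo is driven by a microcontroller that accepts a
# hypothetical text command "G<angle>\n" and replies with the sensed force in
# gram-weight as "F<value>"; the real firmware in the thesis may differ.
import time
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # assumed serial port of the gripper controller
TARGET_FORCE = {"carton": 40.0, "plastic_bottle": 25.0}  # example targets in gw

def grip(ser: serial.Serial, material: str, step: float = 1.0,
         max_angle: float = 60.0) -> float:
    """Close the gripper in small steps until the sensed force reaches the
    target chosen for the object's material, then stop."""
    target = TARGET_FORCE[material]
    angle = 0.0
    force = 0.0
    while angle < max_angle and force < target:
        angle += step
        ser.write(f"G{angle:.1f}\n".encode())    # command a slightly tighter grip
        time.sleep(0.05)                         # let the servo settle
        reply = ser.readline().decode().strip()  # e.g. "F12.3" (assumed format)
        if reply.startswith("F"):
            force = float(reply[1:])
    return force

if __name__ == "__main__":
    with serial.Serial(PORT, 115200, timeout=1) as ser:
        final_force = grip(ser, "plastic_bottle")
        print(f"gripping stopped at {final_force:.1f} gw")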
Chinese Abstract i
ABSTRACT ii
Acknowledgements iii
Table of Contents iv
List of Tables vii
List of Figures viii
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation and Objectives 2
Chapter 2 Literature Review 3
2.1 Goods Management 3
2.2 Robotic Arms 5
2.3 Robotic Arm Grippers 7
2.4 Visual Recognition 10
2.5 Summary 20
Chapter 3 Control Methods and Robotic Arm Design 23
3.1 Overall Components 23
3.2 Robotic Arm Design 25
3.3 Gripper Design 36
3.3.1 Force Sensor 40
3.4 Kinematic Analysis 41
3.4.1 Forward Kinematics 43
3.4.2 Inverse Kinematics 47
3.5 Barcodes 53
3.5.1 Barcode Types 53
3.5.2 Barcode Structure 54
3.5.2.1 UPC Codes 55
3.5.2.2 EAN Codes 57
3.5.3 Barcode Encoding 58
3.5.3.1 Barcode Encoding 58
3.5.3.2 Barcode Check Digit 61
3.5.4 Barcode Dimensions 61
3.5.5 Barcode Reading Methods 62
3.6 Visual Recognition 62
3.6.1 Binarization 67
3.6.2 Shape Recognition 69
3.6.3 SURF Feature Algorithm 70
3.7 Control Components and System Design 71
3.7.1 Arduino Nano Microcontroller Board 71
3.7.2 Raspberry Pi Embedded Computer 72
3.7.3 10-inch Touchscreen 73
3.7.4 System Design 73
3.8 Experimental Setup 75
3.8.1 Force Sensor Linearity Measurement 75
3.8.2 Gripper Clamping Force Test 76
3.8.3 Tests of Barcodes in Different Colors 77
3.8.4 Robotic Arm Goods Inventory Test 78
Chapter 4 Results and Discussion 79
4.1 Robotic Arm and Gripper Assembly 79
4.2 Robotic Arm Control 85
4.2.1 Joint Angle Ranges of the Robotic Arm 85
4.2.2 Robotic Arm Posture Simulation 90
4.2.3 Forward and Inverse Kinematics Verification 92
4.2.4 Robotic Arm Workspace 102
4.3 Clamping Force Test 103
4.3.1 Force Sensor Pressure-Resistance (Conductance) 103
4.3.2 Gripper Clamping Force Test 104
4.4 Visual Recognition 105
4.4.1 Barcode Identification 106
4.4.2 Product Feature Matching 110
4.5 Graphical User Interface 112
4.6 Mobile Device App 115
4.7 Embedded Computer Integration 117
4.8 Robotic Arm Goods Inventory Test 119
Chapter 5 Conclusions and Future Work 124
5.1 Conclusions 124
5.2 Future Work 127
References 128
