National Digital Library of Theses and Dissertations in Taiwan

Student: 蔡伯威 (Bo-Wei Cai)
Title: 以模糊推論做為社交機器人情感行為產生機制之研究
Title (English): A Study of the Emotional Behavior Generated by Fuzzy Inference for a Social Robot
Advisor: 季永炤
Degree: Master's
Institution: National Formosa University
Department: Graduate Institute of Automation Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Document type: Academic thesis
Year of publication: 2014
Graduation academic year: 102 (ROC calendar, i.e., academic year 2013–2014)
Language: Chinese
Pages: 57
Keywords (Chinese): 情感模型, 模糊推論, 行為控制, 社交機器人, 感測融合
Keywords (English): Social Robot, Emotion Model, Fuzzy Inference, Behavior-Based, Sensing Fusion
Metrics: cited by 2, viewed 793 times, downloaded 0 times, bookmarked 1 time
The purpose of this study is to investigate fuzzy inference as the computational mechanism of the emotion model for a robot capable of emotional expression, and on that basis to build a social robot that interacts with its environment and generates appropriate emotional behavior. The robot uses a differential-drive two-wheel locomotion mechanism; a small liquid-crystal display mounted on its head renders animated facial expressions, and the robot operates on an ordinary desktop while interacting with the user and the surrounding environment. The sensors mounted on the body, including a small camera, infrared distance sensors, and an accelerometer, detect changes in the external environment and specific user actions.
The emotional-behavior inference system of the robot consists of three parts: a perception layer, a mental layer, and a behavior layer. Input from the outside world is first processed in the perception layer to extract features: the robot's sensors acquire ambient environment parameters and specific inputs, and perceptual features are then obtained through image processing, sensor fusion, and related procedures. These features drive the various emotional-behavior outputs and also influence the internal emotional state, which the mental layer maintains with a second-order dynamic emotion equation. The mental layer combines this second-order dynamic emotion equation with an emotional-behavior inference mechanism; fuzzy-logic inference merges the two to synthesize emotional behavior, while a fuzzy algorithm smooths the dynamic emotion output so that the robot can generate diverse and subtle emotional behaviors, bringing its expression closer to that of a human. Behavior generation and output are decided in the behavior layer, which follows a behavior-based architecture: a coordination mechanism arbitrates among the behavior outputs according to the perceptual features and the emotional state, integrating the action responses of the robot's individual behaviors to realize a reasonable overall output.
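The second-order dynamic emotion equation above can be illustrated with a short discretized sketch: the emotion state behaves like a damped second-order system driven by a stimulus input, rising while the stimulus persists and settling smoothly back to neutral afterwards. The damped-oscillator form, coefficients, and variable names below are illustrative assumptions, not the thesis's tuned model:

```python
# Minimal sketch of one axis of a second-order dynamic emotion state,
# treated as a damped spring driven by a stimulus input.
# All coefficients are illustrative assumptions.

def step_emotion(x, v, stimulus, dt=0.05, omega=2.0, zeta=0.7, gain=1.0):
    """Advance one emotion axis (e.g. pleasure or arousal) by one step.

    x: current emotion value, v: its rate of change.
    The state decays back toward neutral (0) once the stimulus stops.
    """
    a = gain * stimulus - 2.0 * zeta * omega * v - omega**2 * x
    v += a * dt          # semi-implicit Euler: update velocity first
    x += v * dt          # then position, for better numerical stability
    return x, v

# A short burst of stimulus followed by decay back toward neutral:
x, v = 0.0, 0.0
trace = []
for k in range(200):
    s = 1.0 if k < 40 else 0.0   # stimulus present for the first 40 steps
    x, v = step_emotion(x, v, s)
    trace.append(x)
```

Running one such axis per emotion dimension gives the mental layer a state that reacts to events yet settles gradually, which the fuzzy smoothing can then shape into behavior.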
Finally, the overall performance of the physical robot is tested to verify its capability for emotional interaction as a social robot. On the software side, LabVIEW® performs the sensor processing and the inference of the emotion model, and also serves for real-time monitoring and status feedback.


This paper presents a fuzzy-inference-based computational algorithm used to develop an artificial emotion model for a social robot, enabling it to deduce appropriate emotional behavior from interactions with its environment.
The robot is a two-wheel differential-drive mobile robot that can interact with the user and the surrounding environment on a desktop and generate emotional behavior. It has a color liquid-crystal display on its head for facial animation and is equipped with a webcam, infrared distance sensors, and an accelerometer for detecting changes in the external environment and the user's actions.

The artificial emotion inference system of the robot is composed of three functional layers: a sense/perception layer, a mental layer, and a behavior layer. In the sense/perception layer, the robot receives inputs from the environment through various types of sensors; the received data are then processed to extract feature information. Ambient features are gathered by fusing the IR distance readings with the results of image processing. Together with the other features, they are used for emotion inference, and some features can also trigger direct responses of the system. In the mental layer, the system generates emotional behavior by fuzzy-logic inference, which combines the result of a second-order state equation with the output of an OCC-based emotional-behavior inference mechanism; the fuzzy algorithm allows the robot to produce smooth, diverse, and subtle emotional behavior. In the behavior layer, a subsumption architecture generates the behaviors and produces the proper output actions of the robot system, integrating the action parameters of the individual behaviors to achieve a reasonable reaction of the social robot.
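The smoothing role of the fuzzy inference described above can be sketched with a minimal weighted-rule mapping from an emotion value to a behavior intensity; the triangular membership functions and rule outputs are illustrative assumptions, not the thesis's actual rule base:

```python
# Minimal sketch of a fuzzy mapping of the kind a mental layer performs:
# an emotion value in [-1, 1] is fuzzified into overlapping sets, and a
# weighted-average (centroid-style) defuzzification yields a smooth
# behavior intensity. Membership shapes and outputs are assumptions.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Fuzzy sets over the emotion axis and the output each rule fires
# (e.g. how strongly a "happy" facial animation is displayed).
RULES = [
    (lambda e: tri(e, -1.5, -1.0, 0.0), 0.0),   # negative -> suppress
    (lambda e: tri(e, -1.0,  0.0, 1.0), 0.5),   # neutral  -> mild
    (lambda e: tri(e,  0.0,  1.0, 1.5), 1.0),   # positive -> full
]

def behavior_intensity(emotion):
    """Weighted-average defuzzification: varies smoothly with emotion."""
    num = sum(mf(emotion) * out for mf, out in RULES)
    den = sum(mf(emotion) for mf, _ in RULES)
    return num / den if den else 0.0
```

Because adjacent membership functions overlap, the output varies continuously as the emotion state drifts, avoiding abrupt jumps between expression levels.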
The established emotional-behavior inference system is tested on a purpose-built mobile robot system: data fusion and the emotion-inference engine are programmed in LabVIEW®, while the behavior actions themselves are carried out by the single-chip processor on the robot. The results show the effectiveness of the proposed inference algorithm.
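Subsumption-style arbitration of the kind cited for the behavior layer can be sketched as a priority scan over candidate behaviors, where a higher-priority safety reflex suppresses lower-priority expressive behaviors; the behavior names, triggers, and thresholds here are illustrative assumptions:

```python
# Minimal sketch of subsumption-style arbitration: behaviors are checked
# in priority order and the highest-priority active behavior wins the
# actuators. Names and trigger conditions are illustrative assumptions.

def arbitrate(percepts, behaviors):
    """Return (name, action) of the first (highest-priority) active behavior."""
    for name, active, action in behaviors:
        if active(percepts):
            return name, action
    return "idle", "do_nothing"

# Priority order: safety reflexes subsume emotional expression.
BEHAVIORS = [
    ("avoid_edge",   lambda p: p["ir_distance"] > 0.8, "back_away"),
    ("react_shake",  lambda p: p["accel_peak"] > 2.0,  "show_surprise"),
    ("express_mood", lambda p: True,                   "play_emotion_face"),
]
```

With this layering, the emotional expression behavior runs whenever nothing more urgent is triggered, which matches the goal of integrating behavior outputs into one reasonable reaction.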


Abstract (Chinese)
Abstract
Acknowledgments
Table of Contents
List of Tables
List of Figures
Chapter 1 Introduction
  1.1 Motivation
  1.2 Objectives and Methods
  1.3 Literature Review
  1.4 Thesis Organization
Chapter 2 Emotion Models
  2.1 Appraisal-Based Categorical Theories
  2.2 Dimensional State Theories
Chapter 3 System Architecture of the Emotional Robot
  3.1 Overall System Architecture
  3.2 Structure and Motion-Generation Mechanisms
    3.2.1 Body Design
    3.2.2 Head Design
Chapter 4 Perception Processing and Emotion Inference
  4.1 Perception Processing
    4.1.1 Accelerometer Signal Processing
    4.1.2 Distance Sensing
    4.1.3 Image Processing
  4.2 Emotion Model Design
    4.2.1 Emotional Behavior Inference Module
    4.2.2 Dynamic Emotion Plane Module
    4.2.3 Fuzzy Inference Module
Chapter 5 Robot Behavior Architecture
  5.1 Behavior Layer Architecture
  5.2 Behavior Output Design
    5.2.1 Facial Behavior Design
    5.2.2 Body Motion Design
  5.3 Behavior Arbitration Mechanism
Chapter 6 System Tests and Results
  6.1 Test Methods
  6.2 Test Results
Chapter 7 Conclusions
  7.1 Conclusions
  7.2 Future Work
References
Extended Abstract
Author Biography



