
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Graduate Student: 傅郁翔 (Yu-Hsiang Fu)
Thesis Title: 應用情緒感知於數位機上盒之研究
Thesis Title (English): Study of Emotional Perception on Digital Set-Top Box
Advisor: 李仁貴
Committee Members: 郭天穎, 林仲志
Oral Defense Date: 2013-01-08
Degree: Master's
Institution: National Taipei University of Technology (國立臺北科技大學)
Department: Institute of Computer and Communication (電腦與通訊研究所)
Discipline: Engineering
Field: Electrical and Information Engineering
Thesis Type: Academic thesis
Year of Publication: 2013
Graduation Academic Year: 101 (2012-2013)
Language: Chinese
Number of Pages: 66
Chinese Keywords: 數位機上盒; 心率變異度分析; 情緒反應
English Keywords: digital set-top box; heart rate variability; emotional reactions
Usage statistics:
  • Cited by: 0
  • Views: 310
  • Ratings:
  • Downloads: 33
  • Saved to personal bibliography lists: 0
Chinese Abstract (translated):
This thesis integrates a digital set-top box with an emotion-perception interface to serve as an audience-measurement platform, so that digital media operators can learn viewers' reactions to media content more directly and quickly, improving the timeliness of audience surveys. A digital set-top box delivers a wide range of media content, and the performance of that content is the media industry's central concern. This thesis therefore captures and analyzes viewers' physiological signals while they watch video and performs emotion-recognition classification, in order to interpret viewers' emotional reactions to the content.
Pre-selected video clips were first used to induce five emotional states (happiness, pleasure, disgust, fear, and neutral) in four viewers, and the viewers' electrocardiogram (ECG) signals were recorded while they watched. After the ECG signals were processed, features extracted, and the features normalized, seven feature parameters were fed to a K-nearest-neighbor (KNN) classifier to identify each viewer's emotional state during viewing.
The results show that, after the four viewers were stimulated with the videos and the seven features were classified with the KNN algorithm, the recognition rate was 76.09% when the five emotional states of all four viewers were recognized together. The thesis also analyzes individual viewers: when a viewer's own emotion data were used for classification, the best recognition rate was 45.06%, while using other viewers' emotion data it was at most 27.59%. Individual differences therefore have a strong effect on emotion recognition, and the number of viewers must be increased to reduce this individual variation.
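The signal-processing steps summarized in the abstract (ECG feature extraction followed by normalization) can be sketched as follows. The thesis's exact seven features are not named in the abstract, so the common time-domain heart-rate-variability measures below (mean RR interval, SDNN, RMSSD, pNN50, mean heart rate) and the min-max normalization are illustrative assumptions, not the thesis's actual feature set.

```python
import math

def hrv_features(rr_ms):
    """Compute common time-domain HRV features from RR intervals (in ms).

    These stand in for the thesis's seven (unnamed) feature parameters.
    """
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_ms) / n)  # std of RR intervals
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))     # RMS of successive differences
    pnn50 = sum(1 for d in diffs if abs(d) > 50) / len(diffs)     # fraction of pairs differing > 50 ms
    mean_hr = 60000.0 / mean_rr                                   # beats per minute
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd,
            "pnn50": pnn50, "mean_hr": mean_hr}

def normalize(rows):
    """Min-max normalize each feature across samples (normalization
    method assumed; the abstract only says features are normalized)."""
    keys = rows[0].keys()
    lo = {k: min(r[k] for r in rows) for k in keys}
    hi = {k: max(r[k] for r in rows) for k in keys}
    return [{k: (r[k] - lo[k]) / (hi[k] - lo[k]) if hi[k] > lo[k] else 0.0
             for k in keys} for r in rows]
```

In practice each emotion segment of the recording would yield one such feature vector, and normalization would be applied across all segments before classification.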

English Abstract:
Nowadays a digital set-top box can deliver a wide variety of media content with ease, and the performance of that content is a focus of concern for the media industry. The aim of this study is therefore to integrate a digital set-top box with an emotion-perception interface as an audience-measurement platform, improving measurement efficiency and quickly revealing how audiences react while viewing content. The study captures and analyzes viewers' physiological signals while they watch video and classifies them for emotion recognition, which can be used to identify audiences' emotional reactions to media content.
Pre-selected videos were used to induce five affective responses in four viewers: laughing, pleasure, disgust, fear, and neutral. The viewers' electrocardiograms (ECG) were recorded throughout the process. After analysis, feature acquisition, and normalization, seven features derived from the ECG were classified by a K-nearest-neighbor (KNN) classifier to recognize each viewer's affective response.
When the five affective responses induced by the videos were recognized across all four viewers together, the accuracy of classifying the seven features with KNN was 76.09%. For a single viewer, the best accuracy was 45.06% when classifying with the viewer's own data, and only 27.59% when classifying with training data from other viewers.
Hence, individual differences have a large effect on emotion recognition. To reduce this deviation and increase accuracy, more experimental data from more viewers will be necessary.
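The classification step can be sketched with a minimal KNN classifier plus leave-one-out cross-validation (the table of contents lists a cross-validation step, but the exact scheme, distance metric, and k are not stated in the abstract, so Euclidean distance, k = 3, and leave-one-out are assumptions):

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training vectors (Euclidean distance; metric and k are assumed)."""
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

def leave_one_out_accuracy(data, labels, k=3):
    """Hold out each sample in turn and classify it with the rest,
    one way to obtain recognition rates like those reported above."""
    hits = 0
    for i in range(len(data)):
        train = data[:i] + data[i + 1:]
        lab = labels[:i] + labels[i + 1:]
        hits += knn_predict(train, lab, data[i], k) == labels[i]
    return hits / len(data)
```

With the thesis's data, `data` would hold the normalized seven-feature vectors and `labels` the five emotion classes; pooling all four viewers versus restricting to one viewer's data reproduces the two evaluation settings the abstract compares.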

Chinese Abstract i
English Abstract ii
Acknowledgments iv
Table of Contents v
List of Tables viii
List of Figures ix
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Motivation and Objectives 3
1.3 Related Work 3
1.4 Thesis Organization 9
Chapter 2 Background 11
2.1 Autonomic Nervous System 11
2.1.1 Sympathetic Nervous System 12
2.1.2 Parasympathetic Nervous System 13
2.2 Electrocardiography 13
2.2.1 Cardiac Conduction 13
2.2.2 ECG Measurement 15
2.3 Heart Rate Variability Analysis 18
2.3.1 Definition of Heart Rate Variability 18
2.3.2 Measurement of Heart Rate Variability 18
2.3.3 Analysis Methods for Heart Rate Variability 19
2.4 Emotion Recognition 23
2.4.1 Emotion Elicitation 23
2.4.2 Emotion Measurement 25
2.4.3 Emotion Classification 28
Chapter 3 System Design and Analysis 30
3.1 System Architecture Overview 30
3.1.1 Emotion Database 31
3.2 ECG Signal Acquisition and Processing 32
3.2.1 Emotion-Perception Platform 32
3.2.2 Elastic Chest-Strap Heart-Rate Sensor 33
3.2.3 Bluetooth Transmitter 35
3.2.4 Emotion Stimuli 36
3.2.5 Emotion-Perception System Design 38
3.3 Emotion Recognition Processing 39
3.3.1 Extracting Emotion Segments 40
3.3.2 Feature Acquisition 41
3.3.3 Feature Normalization 41
3.3.4 Classification Algorithm 42
3.3.5 Emotion Recognition Procedure 43
3.3.5.1 Training Procedure 43
3.3.5.2 Testing Procedure 43
3.3.6 Cross-Validation 44
3.3.7 Feature Analysis 45
Chapter 4 Experimental Results and Discussion 47
4.1 Emotion Recognition Results 47
4.2 Discussion 53
4.2.1 Discussion of Experimental Results 53
4.2.2 Degree of Difference Among Viewers 56
4.2.3 Emotion Recognition for a Single Viewer 58
4.2.4 Parameter Optimization of the Classification Algorithm 60
Chapter 5 Conclusions and Future Work 61
5.1 Conclusions 61
5.2 Future Work 62
References 64

[1] L. C. Yeh, et al., "An innovative application over communications-as-a-service: Network-based multicast IPTV audience measurement," in Proc. of 13th Asia-Pacific Network Operations and Management Symposium (APNOMS), Sept. 2011, pp. 1-7.
[2] F. Alvarez, et al., "Audience measurement modeling for convergent broadcasting and IPTV networks," IEEE Transactions on Broadcasting, vol. 55, issue 2, Jun. 2009, pp. 502-515.
[3] 黃聿清, 莊春發, "Measuring television program quality with audience appreciation: the experience of Taiwan Public Television," Annual Conference of the Chinese Communication Society, 2011. (in Chinese)
[4] 賴祥蔚, "Exploring successful business models for television news programs: examining Taiwan's development against Columbia University's Project for Excellence in Journalism," 廣電人, 2000, pp. 52-55. (in Chinese)
[5] 李美華, 黃詩芸, "Competitive strategies and program planning of Taiwan's digital terrestrial television," Communication and Management Research, vol. 9, no. 1, 2009, pp. 63-92. (in Chinese)
[6] 黃中, "The impact of the digital set-top box on audience measurement," Master's thesis, Graduate Institute of Journalism, National Taiwan University, Taipei, 2007. (in Chinese)
[7] TiVo – TiVo Advertising Media Kit, What is TiVo, [Online]. Available: http://www3.tivo.com/assets/tivoadvertising/content/files/TiVoAdMediakit_whatistivo.pdf (last date visited: Dec. 12, 2012)
[8] Nielsen - Nielsen Television Audience Measurement, A2/M2 Three Screens, [Online]. Available: http://www.agbnielsen.net/products/a2m2.asp (last date visited: Dec. 12, 2012)
[9] 彭玉賢, "The coming of the digital era: new changes facing ratings surveys," Research and Development Department, Public Television Service, 2006. (in Chinese)
[10] 江啟彬, "An automatic audience-ratings survey system based on biometric recognition," Master's thesis, Department of Electrical Engineering, National Chi Nan University, Nantou, 2012. (in Chinese)
[11] B. Gunter and J. M. Wober, The Reactive Viewer: A Review of Research on Audience Reaction Measurement, John Libbey, Apr. 1992.
[12] K. R. Watson, Introduction to Human Anatomy, [Online]. Available: http://www.lavc.edu/instructor/watson_k/docs/Lecture%2015%20-%20%20Autonomic%20Nervous%20System.ppt (last date visited: Dec. 12, 2012)
[13] Ken Grauer, A Practical Guide to ECG Interpretation, Mosby Inc., Jun. 1998.
[14] Frank G. Yanowitz, ECG Learning Center - ECG Outline, [Online]. Available: http://ecg.utah.edu/lesson/1 (last date visited: Dec. 12, 2012)
[15] Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology, "Heart rate variability: standards of measurement, physiological interpretation, and clinical use," Circulation, vol. 93, Mar. 1996, pp. 1043-1065.
[16] R. W. Picard, "Affective Computing," Technical Report, MIT Media Laboratory Perceptual Computing Section Technical Report No. 321, 1995.
[17] A. R. Damasio, "Emotion in the perspective of an integrated nervous system," Brain Research Reviews, vol. 26, issue 2, May 1998, pp. 83-86.
[18] 陶振超, "The role of emotion in media message processing: a cognitive approach to media research," Annual Conference of the Chinese Communication Society, 2011. (in Chinese)
[19] M. M. Bradley and P. J. Lang, "Measuring emotion: The Self-Assessment Manikin and the semantic differential," Journal of Behavior Therapy and Experimental Psychiatry, vol. 25, issue 1, Mar. 1994, pp. 49-59.
[20] K. H. Kim, S. W. Bang and S. R. Kim, "Emotion recognition system using short-term monitoring of physiological signals," Medical and Biological Engineering and Computing, vol. 42, issue 3, May 2004, pp. 419-427.
[21] G. E. Schwartz, et al., "Facial EMG in the assessment of emotion," Psychophysiology, vol. 11, no. 2, 1974, pp. 237.
[22] A. Schaefer, et al., "Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers," Cognition & Emotion, vol. 24, issue 7, Oct. 2010, pp. 1153-1172.
[23] P. J. Lang, M. M. Bradley and B. N. Cuthbert, "International Affective Picture System (IAPS): Instruction Manual and Affective Ratings," Technical Report, University of Florida, Gainesville, FL, 1997.
[24] R. Picard, E. Vyzas, and J. Healey, "Toward machine emotional intelligence: Analysis of affective physiological state," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 10, Oct. 2001, pp. 1175-1191.
[25] W. S. Ark, D. C. Dryer and D. J. Lu, "The emotion mouse," in Proc. of HCI International (the 8th International Conference on Human-Computer Interaction) on Human-Computer Interaction, Ergonomics and User Interfaces, vol. 1, 1999, pp. 818-823.
[26] P. Ekman and W. V. Friesen, "Constants across cultures in the face and emotion," Journal of Personality and Social Psychology, vol. 17, no. 2, Feb. 1971, pp. 124-129.
[27] M. W. Bhatti, Y. Wang, and L. Guan, "A neural network approach for human emotion recognition in speech," in Proc. of the International Symposium on Circuits and Systems, vol. 2, May 2004, pp. 181-184.
[28] C. H. Chen, L. F. Pau and P. S. P. Wang, Handbook of Pattern Recognition and Computer Vision, World Scientific, 2005.
[29] J. A. Russell, "The circumplex model of affect," Journal of Personality and Social Psychology, vol. 39, issue 6, 1980, pp. 1161-1178.
[30] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Applied Signal Processing, vol. 2004, Jan. 2004, pp. 1672-1687.
[31] 慎基德, "Exploring users' affective responses in multimedia environments using physiological signals," Master's thesis, Graduate Institute of Electrical Engineering, National Taiwan University, Taipei, 2006. (in Chinese)
