National Digital Library of Theses and Dissertations in Taiwan

Author: Chiao-Yu Chen (陳巧瑜)
Title: Neural Correlates of Vowel Perception Modulated by Audio-visual Asynchrony
Title (Chinese): 同步視聽訊息處理對語音知覺影響的功能性磁振造影研究
Advisor: Fa-Hsuan Lin (林發暄)
Committee members: 黃騰毅, 王福年, 蔡尚岳, 林益如
Oral defense date: 2011-07-15
Degree: Master's
Institution: National Taiwan University
Department: Institute of Biomedical Engineering (醫學工程學研究所)
Discipline: Engineering
Field: General Engineering
Document type: Academic thesis
Publication year: 2011
Graduation academic year: 99 (2010-2011)
Language: English
Pages: 31
Keywords (Chinese): 視覺, 聽覺, 功能性磁振造影, 多通道感官整合, 時間, 腦島, 顳葉上迴, 母音
Keywords (English): audiovisual, functional magnetic resonance imaging, multisensory integration, temporal, insula, superior temporal gyrus, vowel
Cited by: 1
Views: 513
Downloads: 0
Multisensory integration allows us to form a unified percept of events and improves the accuracy of our perception. For example, everyday face-to-face conversation typically relies on multisensory integration: we receive both visual (lip movements) and auditory (speech sounds) signals from our interlocutor. The temporal proximity of the stimuli arriving through the different sensory channels (vision and audition) is a key factor in whether multisensory information can be integrated. However, visual and auditory stimuli need not arrive at exactly the same moment for multisensory integration to occur. Behavioral studies have found that asynchrony is harder to detect when vision leads audition than when audition leads vision.
This phenomenon can be explained by the window of temporal integration: as long as the temporal offset between the stimuli falls within this window, people do not notice the asynchrony even though the stimuli are not simultaneous. In the case of human speech, studies have shown that observers still perceive the audio-visual signals as synchronous when vision leads audition by up to 258 ms, whereas when vision leads audition by 500 ms or more, observers detect the asynchrony 100% of the time.
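The window of temporal integration described above can be sketched as a simple classifier. This is an illustrative toy, not part of the thesis; the threshold constants come from the visual-leading figures quoted above (258 ms still perceived as synchronous, 500 ms reliably perceived as asynchronous), and the "ambiguous" label for the transition region is an assumption of this sketch.

```python
SYNC_LIMIT_MS = 258    # visual lead still perceived as synchronous
ASYNC_LIMIT_MS = 500   # visual lead reliably perceived as asynchronous

def perceived_synchrony(visual_lead_ms: float) -> str:
    """Classify a visual-leading audio-visual lag (hypothetical labels)."""
    if visual_lead_ms <= SYNC_LIMIT_MS:
        return "synchronous"    # inside the integration window
    if visual_lead_ms >= ASYNC_LIMIT_MS:
        return "asynchronous"   # outside the integration window
    return "ambiguous"          # transition region between the two limits

print(perceived_synchrony(168))  # → synchronous
print(perceived_synchrony(500))  # → asynchronous
```

Note that the 168 ms and 500 ms lags used in the experiment land on opposite sides of this window, which is exactly what the stimulus design exploits.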
This thesis investigates whether the neural activity elicited by audio-visual temporal offsets, both offsets that are behaviorally undetectable and offsets that are detectable, occurs in the brain's multisensory integration areas. Subjects viewed a face articulating vowels while the corresponding vowel sounds were played either in synchrony with the video, lagging it by 168 ms, or lagging it by 500 ms. The 168 ms auditory lag falls within the temporal integration window, so subjects do not perceive the asynchrony, whereas the 500 ms lag falls outside the window and is perceived as asynchronous. We used functional magnetic resonance imaging (fMRI) to measure blood-oxygenation-level-dependent (BOLD) signal changes while the brain processed these stimuli. Comparing the 168 ms lag condition with the synchronous condition, we found significant activation in the left insula. Comparing the 500 ms lag condition with the synchronous condition, we found significant activation not only in the left insula but also in the superior temporal gyrus of both hemispheres. We infer that when the auditory lag falls within the range where asynchrony goes unnoticed, the left insula alone suffices to integrate the asynchronous audio-visual signals, whereas when the lag is large enough to be detected, the bilateral superior temporal gyri also participate in the integration alongside the left insula.


Intersensory temporal synchrony is critical in multisensory integration. The human brain can combine information from different sensory modalities into a unified percept even when the stimuli arrive with a small temporal latency. For example, we experience coherent speech perception even though the auditory (heard speech) and visual (lip-reading) stimuli travel through different physical media at different speeds. Little is known about how the brain detects temporal asynchrony between audio-visual stimuli. To study which brain regions are sensitive to the asynchrony of audio-visual speech and associated with vowel processing, we presented vowel sounds together with short video clips of the corresponding articulatory gestures at three audio-visual latencies: simultaneous, asynchronous within the window of temporal integration, and asynchronous outside the window. Based on BOLD-contrast fMRI from 11 participants, we found that the bilateral superior temporal gyri showed stronger activation as the audio-visual latency increased. The left insula was more significantly activated by asynchronous audio-visual stimuli with a latency inside the integration window than by simultaneous stimuli. When the latency fell outside the integration window, not only the left insula but also the bilateral superior temporal gyri were significantly activated. These results suggest that the bilateral superior temporal gyri and the left insula integrate lagged auditory stimuli with the visual stimuli, and that the left insula is more sensitive to the temporal asynchrony of audio-visual speech than the bilateral superior temporal gyri.
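The group analysis above rests on comparing per-participant BOLD responses in a lag condition against the synchronous baseline. A minimal sketch of such a paired condition-vs-baseline contrast is below; it is not the thesis's actual analysis pipeline, and the per-subject beta values are invented for illustration.

```python
import math

def paired_t(cond: list[float], baseline: list[float]) -> float:
    """Paired t-statistic for condition-vs-baseline response estimates,
    one value per participant. Toy example, not the thesis pipeline."""
    diffs = [c - b for c, b in zip(cond, baseline)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical per-subject response estimates for one region of interest
# (11 participants, matching the sample size reported above):
lag_500 = [1.8, 2.1, 1.6, 2.4, 1.9, 2.2, 1.7, 2.0, 1.5, 2.3, 1.8]
sync    = [1.0, 1.2, 0.9, 1.4, 1.1, 1.3, 0.8, 1.1, 1.0, 1.2, 0.9]
print(round(paired_t(lag_500, sync), 2))
```

A large positive t here would indicate that the region responds more strongly in the lagged condition than at synchrony, which is the pattern the abstract reports for the left insula and the bilateral superior temporal gyri.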

List of Figures II
Abstract (Chinese) V
ABSTRACT VI
Chapter 1 Introduction 1
1.1 Multisensory integration 1
1.2 The window of temporal integration 3
Chapter 2 Materials and Methods 8
2.1 Participants 8
2.2 Stimulation Procedure 8
2.3 Task 10
2.4 Scanning Procedure 16
Chapter 3 Results 17
3.1 Behavioral responses 17
3.2 fMRI results 17
Chapter 4 Discussion 25
References 29



