Student: Andreas Felix Goetze (柯佐安)
Title: Sorted local transfer entropy and its application to reconstructing neural networks and the inverse Ising problem
Title (Chinese): 分類局部傳遞熵及其在重建神經網絡中的應用和伊辛逆問題
Advisor: Pik-Yin Lai (黎璧賢)
Degree: Ph.D.
Institution: National Central University
Department: Department of Physics
Discipline: Natural Sciences
Field: Physics
Document type: Academic thesis
Year of publication: 2020
Graduating academic year: 108 (2019–2020)
Language: English
Pages: 92
Keywords (Chinese): 轉移熵、神經網絡、有效的連接、易辛模型
Keywords (English): transfer entropy, neural networks, effective connectivity, Ising model
Recent studies have used transfer entropy to measure the effective connectivity among large populations of neurons. Analyzing these networks has given novel insight into the information transfer in neural networks [1]. Information transfer is quantified by estimating transfer entropy [2], a model-free measure of the directed linear and nonlinear interactions between neurons. High information transfer between two spike trains is evidence of an underlying excitatory synapse between the neurons. However, inhibitory synapses also show significant information transfer. We extend the effective-connectivity analysis by revealing whether the information transfer comes from an excitatory or an inhibitory synapse. To distinguish these types of interactions we analyze the local transfer entropies [3], which are oppositely signed for each interaction type, allowing us to define the sorted local transfer entropy as the discriminating quantity. We further explore dynamic state conditioning for estimating transfer entropy [4] in order to remove network effects during highly synchronized bursting events in the neural population, which are not indicative of a direct synaptic interaction. Applying these techniques to the spike trains of simulated networks of Izhikevich neurons with random synaptic delays and connection weights evolved by spike-timing-dependent plasticity, as in a previous study [5], we show that inhibitory and excitatory synapses can be inferred and that the network reconstruction is improved.
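The transfer entropy of [2] can be illustrated with a minimal plug-in estimator for binary spike trains with history length 1, i.e. TE = I(y_{t+1}; x_t | y_t). This is a generic sketch, not the estimator developed in the thesis; the function name and implementation choices are ours:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} in bits, history length 1:
    TE = sum p(y', y, x) log2[ p(y'|y, x) / p(y'|y) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))      # (y_{t+1}, y_t, x_t)
    n = len(triples)
    p_xyz = Counter(triples)                        # counts of (y', y, x)
    p_yz = Counter((yp, yc) for yp, yc, _ in triples)   # counts of (y', y)
    p_z = Counter(yc for _, yc, _ in triples)       # counts of y
    p_zx = Counter((yc, xc) for _, yc, xc in triples)   # counts of (y, x)
    te = 0.0
    for (yp, yc, xc), c in p_xyz.items():
        cond_joint = c / p_zx[(yc, xc)]             # p(y' | y, x)
        cond_marg = p_yz[(yp, yc)] / p_z[yc]        # p(y' | y)
        te += (c / n) * np.log2(cond_joint / cond_marg)
    return te
```

For a random binary source x and a target that simply copies it one step later, the estimate approaches 1 bit in the driving direction and 0 in the reverse direction, reflecting the directionality of the measure.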
Furthermore, we show that sorted local transfer entropy is also useful for solving the inverse Ising problem [6]. In our simulations we show that, for a system of randomly connected Ising nodes, we can infer the interaction strength of both positive and negative interactions by estimating the pairwise sorted local transfer entropies.
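The sign-based discrimination idea can be sketched for a single pair of coupled ±1 spins: the local transfer entropy log2[p(y_{t+1}|y_t, x_t)/p(y_{t+1}|y_t)] evaluated at transitions where the target aligns with the previous source state changes sign with the sign of the coupling. This is an illustrative toy, not the thesis's sorted-local-transfer-entropy estimator; all names and the toy dynamics are ours:

```python
import numpy as np
from collections import Counter

def aligned_local_te(x, y):
    """Average local transfer entropy (bits) over time steps with an
    aligned transition y_{t+1} == x_t. Positive values suggest an
    aligning (J > 0) coupling, negative values an anti-aligning one."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))      # (y_{t+1}, y_t, x_t)
    p_xyz = Counter(triples)
    p_yz = Counter((yp, yc) for yp, yc, _ in triples)
    p_z = Counter(yc for _, yc, _ in triples)
    p_zx = Counter((yc, xc) for _, yc, xc in triples)
    vals = []
    for yp, yc, xc in triples:
        if yp == xc:                                # aligned transition
            cond_joint = p_xyz[(yp, yc, xc)] / p_zx[(yc, xc)]
            cond_marg = p_yz[(yp, yc)] / p_z[yc]
            vals.append(np.log2(cond_joint / cond_marg))
    return float(np.mean(vals))

# Toy spin chains: y copies x_t with probability 0.9 (J > 0 analogue)
# or anti-copies it (J < 0 analogue).
rng = np.random.default_rng(1)
x = rng.choice([-1, 1], size=20000)
follow = rng.random(20000) < 0.9
y_pos = np.where(follow, np.roll(x, 1), -np.roll(x, 1))
y_neg = -y_pos
```

For y_pos the aligned-transition local values cluster near log2(0.9/0.5) > 0, for y_neg near log2(0.1/0.5) < 0, so the sign of the average recovers the sign of the coupling, which is the intuition behind sorting local transfer entropy contributions.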
Table of Contents
Chinese Abstract
Abstract
Acknowledgements
1 Introduction
1.1 Background
1.2 Summary of chapters
2 Background on Information Theory
2.1 Information theory basics
2.2 Information theory basics for a discrete dynamical process
2.2.1 Dynamic complex system of interacting components
2.3 Pointwise information theory
2.3.1 Canonic example: two binary variables
2.3.2 Sorted pointwise mutual information
3 Network Reconstruction in Neural Networks
3.1 Methodology
3.1.1 Transfer entropy
3.1.2 Local transfer entropy
3.1.3 Sorted local transfer entropy
3.1.4 Surrogate tests
3.1.5 Interaction delay
3.1.6 Multivariate transfer entropy
3.1.7 State-conditioned transfer entropy
3.1.8 GPU computing
3.2 Fitzhugh-Nagumo motif simulations
3.3 Izhikevich network simulations
3.4 Reconstruction
3.5 Results
3.5.1 Fitzhugh-Nagumo neurons
3.5.2 Firing dynamics in large neural networks
3.5.3 Izhikevich simulations with regularly firing neurons
3.5.4 Izhikevich simulations with bursting dynamics
3.5.5 Performance summary
3.5.6 Clustering in the interaction classification
4 Reconstructing Positive and Negative Couplings in Ising Spin Networks by Sorted Local Transfer Entropy
4.1 Introduction
4.2 Transfer Entropy and Sorted Local Transfer Entropy
4.3 Ising Spin Network reconstruction using SLTE
4.3.1 Glauber Dynamics: asynchronous update
4.3.2 Glauber Dynamics: synchronous update
4.3.3 Kawasaki Spin Exchange Dynamics
4.4 Summary and Outlook
5 Conclusion
References
[1] S. Nigam, M. Shimono, S. Ito, F.-C. Yeh, N. Timme, M. Myroshnychenko, C. C. Lapish, Z. Tosi, P. Hottowy, W. C. Smith, and others, The Journal of Neuroscience 36, 670 (2016).
[2] T. Schreiber, Physical Review Letters 85, 461 (2000).
[3] J. T. Lizier, M. Prokopenko, and A. Y. Zomaya, Phys. Rev. E 77, 026110 (2008).
[4] O. Stetter, D. Battaglia, J. Soriano, and T. Geisel, PLoS Computational Biology 8, (2012).
[5] S. Ito, M. E. Hansen, R. Heiland, A. Lumsdaine, A. M. Litke, and J. M. Beggs, PLoS ONE 6, (2011).
[6] F. Goetze and P.-Y. Lai, Physical Review E 100, (2019).
[7] L. Barnett, Physical Review Letters 103, (2009).
[8] M. Wibral, R. Vicente, and J. T. Lizier, Directed Information Measures in Neuroscience (2014).
[9] O. Kwon and J.-S. Yang, EPL 82, 68003 (2008).
[10] E. Crosato, L. Jiang, V. Lecheval, J. T. Lizier, X. R. Wang, P. Tichit, G. Theraulaz, and M. Prokopenko, Swarm Intell. 12, 283 (2018).
[11] J. Runge, J. Heitzig, N. Marwan, and J. Kurths, Phys. Rev. E 86, 061121 (2012).
[12] M. Pellicoro and S. Stramaglia, Physica A: Statistical Mechanics and Its Applications 389, 4747 (2010).
[13] L. Barnett, J. T. Lizier, M. Harré, A. K. Seth, and T. Bossomaier, Phys. Rev. Lett. 111, 177203 (2013).

[14] L. Novelli, P. Wollstadt, P. Mediano, M. Wibral, and J. T. Lizier, (2019).
[15] C. E. Shannon, Bell System Technical Journal 27, 623 (1948).
[16] D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge University Press, 2003).
[17] J. T. Lizier, The Local Information Dynamics of Distributed Computation in Complex Systems (Springer Science & Business Media, 2012).
[18] C. D. Manning, C. D. Manning, and H. Schütze, Foundations of Statistical Natural Language Processing (MIT Press, 1999).
[19] B. Gourevitch and J. J. Eggermont, Journal of Neurophysiology 97, 2533 (2007).
[20] E. M. Izhikevich, Neural Computation 18, 245 (2006).
[21] M. Wibral, N. Pampu, V. Priesemann, F. Siebenhühner, H. Seiwert, M. Lindner, J. T. Lizier, and R. Vicente, PloS One 8, (2013).
[22] J. Runge, Chaos: An Interdisciplinary Journal of Nonlinear Science 28, 075310 (2018).
[23] A. Palmigiano, T. Geisel, F. Wolf, and D. Battaglia, Nat Neurosci advance online publication, (2017).
[24] A. Borisyuk, A. Friedman, B. Ermentrout, and D. Terman, Tutorials in Mathematical Biosciences I (Springer Berlin Heidelberg, Berlin, Heidelberg, 2005).
[25] R. FitzHugh, Biophysical Journal 1, 445 (1961).
[26] A. L. Hodgkin and A. F. Huxley, The Journal of Physiology 117, 500 (1952).
[27] E. Izhikevich, IEEE Transactions on Neural Networks 14, 1569 (2003).
[28] S. Song, K. D. Miller, and L. F. Abbott, Nat Neurosci 3, 919 (2000).
[29] T. Fawcett, Pattern Recognition Letters 27, 861 (2006).
[30] M. Timme, Europhysics Letters (EPL) 76, 367 (2006).
[31] W. Wang, Y.-C. Lai, and C. Grebogi, Physics Reports 644, 1 (2016).
[32] M. Nitzan, J. Casadiego, and M. Timme, Science Advances 3, (2017).
[33] D. Yu, M. Righero, and L. Kocarev, Physical Review Letters 97, (2006).
[34] M. Timme, Physical Review Letters 98, (2007).
[35] S. G. Shandilya and M. Timme, New Journal of Physics 13, 013004 (2011).
[36] Z. Levnajić and A. Pikovsky, Physical Review Letters 107, (2011).
[37] Z. Levnajić and A. Pikovsky, Scientific Reports 4, (2015).
[38] E. S. C. Ching, P.-Y. Lai, and C. Y. Leung, Physical Review E 88, (2013).
[39] E. S. C. Ching, P.-Y. Lai, and C. Y. Leung, Physical Review E 91, (2015).
[40] E. S. C. Ching and H. C. Tam, Physical Review E 95, (2017).
[41] P.-Y. Lai, Physical Review E 95, (2017).
[42] H. Tam, E. S. Ching, and P.-Y. Lai, Physica A: Statistical Mechanics and Its Applications (2018).
[43] H. J. Kappen and F. B. Rodríguez, Neural Computation 10, 1137 (1998).
[44] Y. Roudi, J. Tyrcha, and J. Hertz, Physical Review E 79, (2009).
[45] Y. Roudi and J. A. Hertz, Physical Review Letters 106, (2011).
[46] H.-L. Zeng, E. Aurell, M. Alava, and H. Mahmoudi, Physical Review E 83, (2011).
[47] H.-L. Zeng, M. Alava, E. Aurell, J. Hertz, and Y. Roudi, Physical Review Letters 110, 210601 (2013).
[48] P. Zhang, Journal of Statistical Physics 148, 502 (2012).
[49] E. Aurell and M. Ekeberg, Physical Review Letters 108, (2012).
[50] S. L. Dettmer, H. C. Nguyen, and J. Berg, Physical Review E 94, 052116 (2016).
[51] J. Albert and R. H. Swendsen, Physics Procedia 57, 99 (2014).
[52] J. Albert and R. H. Swendsen, Physica A: Statistical Mechanics and Its Applications 483, 293 (2017).
[53] T. Bossomaier, L. Barnett, M. Harré, and J. T. Lizier, An Introduction to Transfer Entropy (Springer International Publishing, Cham, 2016).
[54] F. Doria, R. Erichsen Jr., D. Dominguez, M. González, and S. Magalhaes, Physica A: Statistical Mechanics and Its Applications 422, 58 (2015).
[55] M. Li, Y. Fan, J. Wu, and Z. Di, International Journal of Modern Physics B 27, 1350146 (2013).
[56] H. Lau and P. Grassberger, 87, (2013).
[57] Z. Deng, J. Wu, and W. Guo, Physical Review E 90, (2014).
[58] J. G. Orlandi, O. Stetter, J. Soriano, T. Geisel, D. Battaglia, and J. Garcia-Ojalvo, PLoS ONE 9, (2014).
[59] H. P. Robinson, M. Kawahara, Y. Jimbo, K. Torimitsu, Y. Kuroda, and A. Kawana, Journal of Neurophysiology 70, 1606 (1993).
[60] L. C. Jia, M. Sano, P.-Y. Lai, and C. K. Chan, Physical Review Letters 93, (2004).
[61] P.-Y. Lai, L. C. Jia, and C. K. Chan, Physical Review E 73, (2006).
[62] H. Song, C.-C. Chen, J.-J. Sun, P.-Y. Lai, and C. K. Chan, Physical Review E 90, 012703 (2014).
[63] M. Prokopenko, J. Lizier, and D. Price, Entropy 15, 524 (2013).
[64] M. Prokopenko and I. Einav, Physical Review E 91, (2015).
[65] G. V. Steeg and A. Galstyan, (2011).
[66] G. Ver Steeg and A. Galstyan, CoRR (2012).
[67] T. Tomokiyo and M. Hurst, in Proceedings of the ACL 2003 Workshop on Multiword Expressions: Analysis, Acquisition and Treatment-Volume 18 (Association for Computational Linguistics, 2003), pp. 33–40.
[68] F. Goetze, P.-Y. Lai, and C. Chan, BMC Neuroscience 16, (2015).
[69] G. Bouma, in Proceedings of the Biennial GSCL Conference (2009).
[70] C. Finn and J. T. Lizier, arXiv Preprint arXiv:1801.09223 (2018).
[71] B. Gourévitch and J. J. Eggermont, Journal of Neurophysiology 97, 2533 (2007).
[72] P. Erdős and A. Rényi, Publ. Math. Inst. Hung. Acad. Sci. 5, 17 (1960).
[73] B. Bollobás, Random Graphs (Academic, London, 1985).
[74] R. J. Glauber, Journal of Mathematical Physics 4, 294 (1963).
[75] P. Wollstadt, M. Martínez-Zarzuela, R. Vicente, F. J. Díaz-Pernas, and M. Wibral, arXiv Preprint arXiv:1401.4068 (2014).