Author: 何浩源
Author (English): Hao-Yuan He
Title: 液體狀態機中的非線性現象:隨機共振與臨界現象之研究
Title (English): Nonlinear phenomenon in liquid state machine: stochastic resonance and critical phenomenon
Advisors: 陳義裕、陳志強
Advisors (English): Yih-Yuh Chen, Chi-Keung Chen
Committee member: 陳俊仲
Committee member (English): Chun-Chung Chen
Defense date: 2016-06-13
Degree: Master's
Institution: 國立臺灣大學 (National Taiwan University)
Department: 物理學研究所 (Graduate Institute of Physics)
Discipline: Natural Sciences
Field: Physics
Document type: Academic thesis
Year of publication: 2016
Academic year of graduation: 104 (ROC calendar)
Language: English
Pages: 65
Keywords (Chinese): 機器學習、類神經網路、隨機共振、臨界現象、自回歸模型
Keywords (English): Machine learning, Artificial neural network, Stochastic resonance, Critical phenomenon, Auto-regressive model
The liquid state machine (LSM) performs classification tasks in machine learning by simulating a spiking neural network. To improve the performance of the LSM, we investigate the nonlinear properties underlying it. First, we study a nonlinear effect known as stochastic resonance. Although noise is generally destructive to a machine's performance, in some nonlinear systems adding noise can increase the signal-to-noise ratio. We first observe stochastic resonance in the LSM, and we further observe that, for the same data set, the LSM learns more stably in a noisy environment; this shows that noise can help not only signal transmission but also the learning of a neural network. Second, we study self-organized criticality in the LSM. Many researchers believe that a neural network works more efficiently in a critical state, so studying criticality is important for improving the performance of the LSM. Unlike previous work, which mostly uses power-law distributions to identify the critical state, we determine criticality by using an auto-regressive model to study the correlations between neurons. We find that the LSM performs best when the eigenvalues of the auto-regressive model are closest to 1, which suggests that this method can be used to assess the criticality of a neural network.
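The stochastic resonance effect described above can be illustrated with a much simpler system than the thesis's LSM. The sketch below is only a minimal illustration, assuming a single threshold unit driven by a sub-threshold sine wave plus Gaussian noise; the threshold value, frequency, noise sweep, and the helper function output_signal_correlation are all made up for the example and are not taken from the thesis.

```python
# A minimal sketch of stochastic resonance in a single threshold unit
# (illustrative only; not the thesis's LSM setup).
import numpy as np

rng = np.random.default_rng(0)

T = 20000                                   # number of time steps
t = np.arange(T)
f = 0.002                                   # signal frequency (cycles per step)
signal = 0.8 * np.sin(2 * np.pi * f * t)    # sub-threshold periodic drive
theta = 1.0                                 # threshold; the signal alone never crosses it

def output_signal_correlation(noise_std):
    """Drive the threshold unit with signal + Gaussian noise and return the
    correlation between its binary output and the underlying signal."""
    noise = rng.normal(0.0, noise_std, size=T)
    spikes = (signal + noise > theta).astype(float)
    if spikes.std() == 0:                   # no threshold crossings -> no information
        return 0.0
    return np.corrcoef(spikes, signal)[0, 1]

# Sweep the noise level: the correlation is ~0 for tiny noise (no crossings),
# peaks at an intermediate level, and degrades again when noise dominates.
for sigma in [0.05, 0.2, 0.5, 1.0, 2.0, 5.0]:
    print(f"sigma = {sigma:4.2f}  corr = {output_signal_correlation(sigma):.3f}")
```

The non-monotonic dependence of the output-signal correlation on the noise level is the signature of stochastic resonance that the thesis looks for inside the LSM.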

The liquid state machine (LSM) is an artificial neural network that performs classification tasks by simulating spiking neurons. In order to improve the performance of the LSM, we analyze the nonlinear phenomena behind it. Two topics are studied. First, we analyze the nonlinear effect called stochastic resonance, which describes the influence of noise. While noise is an unwelcome feature in most systems, in a nonlinear system noise can enhance performance. We observe that stochastic resonance occurs in the LSM, and we show that the presence of noise can also help a neural network to "learn". Second, we study the critical phenomenon in the LSM. Since a neural network is widely believed to perform best in a critical state, it is important to determine whether our LSM is at criticality. Criticality is usually identified with statistical methods such as fitting power-law distributions. Instead, we use an auto-regressive model, which estimates the dynamical correlations between neurons, to find the critical state. We show that the LSM performs best when the eigenvalues of the auto-regressive model are closest to 1.
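The auto-regressive criterion mentioned in the abstract amounts to fitting a first-order multivariate AR model to binned activity and asking how close the largest eigenvalue of the fitted coefficient matrix is to 1. The sketch below is a minimal version of that idea under stated assumptions: the activity is a synthetic T×N array generated from a known AR(1) process (not the thesis's LSM recordings), and the function ar1_spectral_radius is a placeholder name, not the thesis's actual implementation.

```python
# A minimal sketch of the AR(1) criticality check: fit x_t ≈ A x_{t-1}
# to binned activity and inspect the eigenvalues of A.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "activity": T time bins, N units, generated from a known AR(1)
# process so the eigenvalue recovery can be checked against ground truth.
T, N = 5000, 20
A_true = rng.normal(0, 1, (N, N))
A_true *= 0.95 / np.max(np.abs(np.linalg.eigvals(A_true)))  # spectral radius 0.95
x = np.zeros((T, N))
for k in range(1, T):
    x[k] = A_true @ x[k - 1] + rng.normal(0, 1, N)

def ar1_spectral_radius(activity):
    """Least-squares fit of activity[t] = A @ activity[t-1] + noise,
    returning the largest eigenvalue magnitude of the fitted A."""
    x_prev, x_next = activity[:-1], activity[1:]
    # Solve x_prev @ A.T ≈ x_next for A.T in the least-squares sense.
    A_T, *_ = np.linalg.lstsq(x_prev, x_next, rcond=None)
    return np.max(np.abs(np.linalg.eigvals(A_T.T)))

# A value close to 1 is read as the network operating near criticality;
# well below 1 means activity decays quickly, above 1 means runaway growth.
print("estimated spectral radius:", ar1_spectral_radius(x))
```

In the thesis's setting the same fit would be applied to the liquid layer's binned spike counts, and classification performance is compared against how close this spectral radius is to 1.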

Page
Chapter 1 Introduction 3
1.1 Neural network 3
1.2 Artificial Neural Network 3
1.3 Liquid State Machine 5
1.4 Stochastic Resonance 7
1.5 Critical phenomenon 8
Chapter 2 Method 11
2.1 Liquid State Machine 11
2.1.1 Input layer 11
2.1.2 Liquid layer 12
2.1.3 Readout layer 13
2.2 Models of neuron and synapse 14
2.2.1 Neuron 14
2.2.2 Leaky Integrate-and-Fire Model 15
2.2.3 Tsodyks-Markram model (TM model) 16
2.3 Connecting topology 17
Chapter 3 Stochastic Resonance 21
3.1 Introduction to Stochastic Resonance 21
3.2 Simulation Setup 22
3.3 Results 24
3.3.1 Noise on Membrane Potential 24
3.3.2 Add noise under learning 26
3.3.3 Noise on Synapses 28
3.4 Summary 30
Chapter 4 Auto-Regressive Model 33
4.1 Introduction 33
4.2 How the AR Model Works 33
4.3 Simulation Setup 35
4.3.1 Ising Model 35
4.3.2 Simulation Setup for LSM 39
4.4 Results 41
4.4.1 AR Model on Liquid State Machine 42
Chapter 5 Conclusion and Discussion 51
Appendix A 55
A.1 Parameters for LSM 55
A.2 Setup of LSM 58
A.2.1 Input Pattern Generation 58
A.2.2 Small World Network Generation 58
A.2.3 Neuron dynamics 58
Reference 61

