臺灣博碩士論文加值系統 (Taiwan National Digital Library of Theses and Dissertations)


Detailed Record

Researcher: 盧立偉
Researcher (English): Lu, LiWei
Thesis Title: 深度學習網路之研究及其應用 (A Study of Deep Learning Networks and Their Applications)
Thesis Title (English): Preparing Deep Belief Networks for Practical Tasks
Advisor: 許宏銘
Advisor (English): Dr. N. Michael Mayer
Committee Members: 林惠勇, 李祖聖, 許宏銘
Committee Members (English): Dr. Lin, Huei-Yung; Dr. Li, Tzuu-Hseng; Dr. N. Michael Mayer
Oral Defense Date: 2012-07-04
Degree: Master's
Institution: 國立中正大學 (National Chung Cheng University)
Department: 電機工程研究所 (Institute of Electrical Engineering)
Discipline: Engineering
Academic Field: Electrical and Computer Engineering
Thesis Type: Academic thesis
Publication Year: 2012
Graduation Academic Year: 100 (ROC calendar)
Language: English
Pages: 38
Keywords (Chinese): 深度學習網路 (deep learning networks), 回聲網路 (echo state networks)
Keywords (English): Deep Belief Networks; Echo State Networks; Artificial Dreaming
Record statistics:
  • Cited by: 2
  • Views: 1569
  • Rating:
  • Downloads: 232
  • Bookmarked: 1
Chinese Abstract (translated):
Deep Belief Networks (DBNs) are probabilistic generative models with a multilayer architecture. Through their stacked layers, such a network can learn from many kinds of input, such as grayscale images, color images, audio data, or video. DBNs are composed of Restricted Boltzmann Machines, whose units may be binary, Gaussian, or other exponential-family units. The first experiment in this thesis further examines the ability of DBNs to learn real numbers represented in binary form; we validate this ability by learning the normal distribution, the Poisson distribution, and the probability distribution produced by a random number generator. Building on this experiment, we further attempt to encode objects from several categories in binary and train the network to recognize, when several objects appear, which category their labels belong to. In the final experiment we address a more practical situation: the current decision or prediction sometimes depends on decisions made earlier in the history. This experiment combines Deep Belief Networks and Echo State Networks (ESNs) into a new model: we use the reservoir of the ESN, a kind of recurrent neural network, to act as the brain's hippocampus and handle short-term memory, while the DBN plays the role of the cerebral cortex, processing and predicting the information passed on by the ESN. We call this model and process Artificial Dreaming.

English Abstract:
Deep Belief Networks (DBNs) are probabilistic generative models composed of multiple layers of stochastic latent variables. The network can learn many layers of features on various types of data, such as binary images, grayscale images, color images, and acoustic data. This thesis further examines the ability of DBNs to interpret binary representations of data. Performance is validated by learning given distributions such as the normal distribution, the Poisson distribution, and the distribution produced by a random number generator. We show that Deep Belief Networks can successfully learn a probability distribution from a binary-encoded dataset. With this property, DBNs can be extended to state- or property-prediction applications; we provide an example showing that a DBN can take multiple binary-encoded parameters as an input vector and predict the category to which these inputs belong. Generally, the sensory input of a DBN contains information belonging to a single timestep; that is, the prediction depends only on the current input. In some practical tasks, however, predictions depend not only on the current state but also on the history of states. We propose a method that combines DBNs with Echo State Networks (ESNs), using the ESN's reservoir, a type of recurrent neural network, to encode the history of previous states, which leads to the idea of artificial dreaming.
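
To make the binary-encoding experiment concrete, the following is a minimal sketch in Python (not taken from the thesis) of training a single Bernoulli-Bernoulli Restricted Boltzmann Machine, the building block of a DBN, with one-step contrastive divergence on samples of a normal distribution quantized to 8-bit binary vectors. All names, shapes, and hyperparameters here are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def to_binary(values, n_bits=8):
        # Quantize real values in [0, 1) into n_bits-wide binary row vectors (MSB first).
        ints = np.clip((values * (1 << n_bits)).astype(int), 0, (1 << n_bits) - 1)
        return ((ints[:, None] >> np.arange(n_bits)[::-1]) & 1).astype(float)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class RBM:
        # Bernoulli-Bernoulli RBM trained with one-step contrastive divergence (CD-1).
        def __init__(self, n_visible, n_hidden, lr=0.05):
            self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
            self.b = np.zeros(n_visible)  # visible biases
            self.c = np.zeros(n_hidden)   # hidden biases
            self.lr = lr

        def sample_h(self, v):
            p = sigmoid(v @ self.W + self.c)
            return p, (rng.random(p.shape) < p).astype(float)

        def sample_v(self, h):
            p = sigmoid(h @ self.W.T + self.b)
            return p, (rng.random(p.shape) < p).astype(float)

        def cd1_step(self, v0):
            # Positive phase on the data, negative phase after one Gibbs step.
            ph0, h0 = self.sample_h(v0)
            pv1, v1 = self.sample_v(h0)
            ph1, _ = self.sample_h(v1)
            n = v0.shape[0]
            self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
            self.b += self.lr * (v0 - v1).mean(axis=0)
            self.c += self.lr * (ph0 - ph1).mean(axis=0)

    # Train on 8-bit binary encodings of samples from a clipped normal distribution.
    samples = np.clip(rng.normal(0.5, 0.15, 5000), 0.0, 1.0 - 1e-9)
    data = to_binary(samples)
    rbm = RBM(n_visible=8, n_hidden=32)
    for epoch in range(50):
        for batch in np.array_split(rng.permutation(data), 50):
            rbm.cd1_step(batch)

After training, repeated Gibbs sampling from the RBM should reproduce the empirical distribution of the encoded samples, which is the learning ability the abstract describes; stacking further RBMs on the hidden activities, layer by layer, is what turns this building block into a DBN.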
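
The DBN+ESN combination can be sketched in the same hedged spirit. The fragment below shows only the handoff the abstract describes: an ESN reservoir accumulates a fading memory of the input history, its state is binarized, and that binary vector would serve as the visible input of the DBN (the "cortex" of the Artificial Dreaming model). The Reservoir class and the commented-out dbn.predict call are hypothetical, not APIs from the thesis.

    import numpy as np

    rng = np.random.default_rng(1)

    class Reservoir:
        # Minimal echo state network reservoir; its fading-memory state acts as
        # a short-term trace of the input history (the "hippocampus" role).
        def __init__(self, n_inputs, n_units=200, spectral_radius=0.9):
            self.W_in = rng.uniform(-0.5, 0.5, (n_units, n_inputs))
            W = rng.uniform(-0.5, 0.5, (n_units, n_units))
            # Rescale the recurrent weights so the echo state property plausibly holds.
            W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
            self.W = W
            self.x = np.zeros(n_units)

        def step(self, u):
            self.x = np.tanh(self.W_in @ u + self.W @ self.x)
            return self.x

    def binarize(state):
        # Map the real-valued reservoir state onto a binary vector a DBN can consume.
        return (state > 0.0).astype(float)

    # Drive the reservoir with a toy input sequence; at every step the binarized
    # state (a compressed trace of the history) would become the DBN's visible input.
    res = Reservoir(n_inputs=1)
    sequence = np.sin(np.linspace(0.0, 8.0 * np.pi, 200))[:, None]
    for u in sequence:
        v = binarize(res.step(u))
        # dbn.predict(v)  # hypothetical call: the DBN ("cortex") predicts from the trace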

ACKNOWLEDGMENTS......................................1
中文摘要..............................................2
ABSTRACT.............................................3
TABLE OF CONTENTS....................................4
LIST OF FIGURES......................................5
LIST OF TABLES.......................................6

I. INTRODUCTION.......................................7
1.1 Motivation.......................................7
1.2 Objective........................................7
1.3 Chapters Arrangement.............................8

II. BACKGROUND KNOWLEDGE REVIEW.......................10
2.1 Deep Belief Networks.............................10
2.2 Echo State Networks..............................14

III. IMPLEMENTATION AND RESULTS.......................17
3.1 Three Practical Examples.........................17
3.2 DBNs on Learning Binary Encoded Data.............17
3.3 Testing Multiple Inputs with Deep Belief Networks..26
3.4 Combining Deep Belief Networks with Echo State Networks..29

IV. SUMMARY AND CONCLUSIONS...........................33

REFERENCES............................................34
VITA..................................................37

