臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Author (Chinese): 柯宜室
Author (English): Is-Shih, Ko
Title (Chinese): 使用壓縮演算法檢測兩時間序列之間資訊流方向的新方法
Title (English): A new method to detect the direction of Information Flow between two-time series using a compression algorithm
Advisor (Chinese): 陳志強
Advisor (English): Chi-Keung Chan
Degree: Master's
Institution: 國立中央大學 (National Central University)
Department: 物理學系 (Department of Physics)
Discipline: Natural Sciences (自然科學學門)
Field: Physics (物理學類)
Document type: Academic thesis
Year of publication: 2023
Graduating academic year: 111 (2022-2023)
Language: English
Number of pages: 104
Keywords (Chinese): 轉移熵、資訊理論、因果關係
Keywords (English): transfer entropy, information theory, causality
Abstract (Chinese):
了解複雜系統中因果關係的重要一環是如何檢測資訊流的方向。轉移熵(transfer entropy)被視為一種檢測資訊流的方法。然而,傳統方法中使用轉移熵需要事先確定歷史長度,而不同的歷史長度可能導致截然相反的結果。尤其對於預測系統而言,為了獲得正確的資訊流方向,轉移熵需要更長的歷史長度。然而,過長的歷史長度可能導致轉移熵出現巨大偏差,使結果難以解讀。在本研究中,我們提出了一種基於壓縮演算法估計轉移熵的新方法,稱之為壓縮轉移熵。這種方法可以在不需要事先指定歷史長度的情況下檢測資訊流方向。我們建立了兩個具有單一資訊流方向的模型,並使用真實的斑馬魚數據進行測試,以評估壓縮轉移熵的檢測能力,同時與傳統轉移熵方法進行比較。研究結果表明,壓縮轉移熵相較於傳統方法,在不需要指定歷史長度的情況下能夠準確地獲得結果。
Abstract (English):
An important aspect of understanding causal relationships in complex systems is determining the direction of information flow. Transfer entropy is regarded as a method for detecting information flow. However, traditional approaches using transfer entropy require determining the history length beforehand, and different history lengths can lead to contradictory results. This is particularly challenging for anticipatory systems, as transfer entropy requires longer history lengths to obtain the correct direction of information flow. However, excessively long history lengths can introduce significant bias in transfer entropy, making the results difficult to interpret. In this study, we propose a new method called "compressed transfer entropy" that estimates transfer entropy based on compression algorithms. This method enables the detection of information flow direction without the need for specifying a history length in advance. Two models with unidirectional information flow, along with real-world zebrafish data, were utilized to evaluate the detection capability of compressed transfer entropy, comparing it with traditional transfer entropy methods. The results demonstrate that compressed transfer entropy provides accurate results without the requirement of specifying a history length.
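The abstract contrasts two ingredients: a conventional transfer-entropy estimate, which needs a history length k fixed in advance, and an entropy estimate obtained from a compression algorithm, which does not. The Python sketch below only illustrates those two building blocks under simplifying assumptions; it is not the thesis's actual cTE construction, and the function names (symbolize, transfer_entropy, compression_entropy_rate), the bin count, and the toy lag-coupled data are illustrative choices rather than anything taken from the thesis.

import lzma
from collections import Counter

import numpy as np


def symbolize(series, n_bins=4):
    """Bin a real-valued series into n_bins integer symbols (0 .. n_bins-1)."""
    series = np.asarray(series, dtype=float)
    edges = np.linspace(series.min(), series.max(), n_bins + 1)
    return np.digitize(series, edges[1:-1]).astype(np.uint8)


def transfer_entropy(x_sym, y_sym, k=1):
    """Plug-in estimate of TE(X -> Y) in bits, with explicit history length k."""
    joint = Counter()   # counts of (y_next, y_history, x_history)
    hist_x = Counter()  # counts of (y_history, x_history)
    next_y = Counter()  # counts of (y_next, y_history)
    hist_y = Counter()  # counts of y_history
    n = len(y_sym)
    for t in range(k, n):
        yh = tuple(y_sym[t - k:t])
        xh = tuple(x_sym[t - k:t])
        yn = int(y_sym[t])
        joint[(yn, yh, xh)] += 1
        hist_x[(yh, xh)] += 1
        next_y[(yn, yh)] += 1
        hist_y[yh] += 1
    total = n - k
    te = 0.0
    for (yn, yh, xh), c in joint.items():
        p = c / total                             # p(y_next, y_hist, x_hist)
        p_full = c / hist_x[(yh, xh)]             # p(y_next | y_hist, x_hist)
        p_self = next_y[(yn, yh)] / hist_y[yh]    # p(y_next | y_hist)
        te += p * np.log2(p_full / p_self)
    return te


def compression_entropy_rate(symbols):
    """Entropy-rate estimate (bits per symbol) from the LZMA-compressed size."""
    raw = np.asarray(symbols, dtype=np.uint8).tobytes()
    return 8.0 * len(lzma.compress(raw, preset=9)) / len(raw)


if __name__ == "__main__":
    # Toy system with one-way information flow X -> Y at lag 1.
    rng = np.random.default_rng(0)
    x = rng.normal(size=50_000)
    y = np.roll(x, 1) + 0.2 * rng.normal(size=50_000)
    sx, sy = symbolize(x), symbolize(y)

    for k in (1, 2, 4):   # the estimate depends on the chosen history length
        print(f"k={k}: TE(X->Y)={transfer_entropy(sx, sy, k):.3f} bits, "
              f"TE(Y->X)={transfer_entropy(sy, sx, k):.3f} bits")

    # The compressor exploits long context without an explicit history length.
    print(f"compression entropy rate of Y: {compression_entropy_rate(sy):.3f} bits/symbol")

Even in this toy setting the plug-in estimate visibly depends on k and its bias grows as k increases, which is the dependence the compression-based approach in the thesis is designed to avoid.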
Table of Contents

摘要 (Abstract in Chinese)
Abstract
Acknowledgement
Contents
List of Figures
Glossary

1 Introduction

2 Information Tools
2.1 Shannon Entropy
2.2 Mutual Information
2.3 Time-Delayed Mutual Information
2.4 Binning Method
2.5 Transfer entropy
2.5.1 Entropy rate
2.5.2 Active Information Storage
2.5.3 Components of entropy
2.5.4 Sampling disaster
2.6 Data Compression

3 Material and Method
3.1 Experimental setup
3.1.1 Animals
3.1.2 Hardware
3.1.3 Software
3.1.4 Data recording
3.2 Simulation models
3.2.1 Clone series
3.2.2 Logistic map
3.3 Information analysis
3.3.1 Binning method
3.3.2 Time-Delayed mutual information (TDMI)
3.3.3 Entropy rate
3.3.4 Transfer entropy

4 Result
4.1 Compression Transfer entropy
4.1.1 Negative value
4.1.2 Entropy rate
4.2 Simulation
4.2.1 Clone
4.2.2 Logistic Map
4.3 Experiment
4.3.1 Pair A
4.3.2 Pair B
4.3.3 Pair C

5 Conclusion and Discussion
5.1 Direction of information flow
5.2 Simulation
5.3 Experiment
5.4 Discussion: Negative value of cTE
5.5 Discussion: Comparison between TE
5.5.1 Future Work

Bibliography

A Logistic Map without noise
A.1 Compression Entropy rate
A.2 Transfer entropy

B Logistic Map with noise
B.1 Compression Entropy rate
B.2 Transfer entropy

C Python code
Electronic full text (public release date: 2024-07-01)