Author: 劉威成
Author (English): Wei-Cheng Liu
Title: 將資源使用分析應用於軟體測試
Title (English): Applying Resource Usage Analysis to Software Testing
Advisors: 鄭炳強、陳嘉玫
Advisors (English): Bing-Chiang Jeng, Chia-Mei Chen
Degree: Master
Institution: 國立中山大學 (National Sun Yat-sen University)
Department: 資訊管理學系研究所 (Department of Information Management)
Discipline: Computer Science
Field: General Computer Science
Document Type: Academic thesis
Year of Publication: 2007
Graduation Academic Year: 95
Language: English
Pages: 62
Keywords (Chinese): 資源導向測試、記憶體重複釋放、記憶體洩漏、軟體測試
Keywords (English): resource-oriented testing, double free, memory leak, software testing
Cited by: 0
Views: 112
Downloads: 0
Bookmarked: 1
As software and network environments have evolved, programs have grown increasingly complex, and attacks that exploit software weaknesses pose a serious challenge to software testing. According to the CSI/FBI survey, Denial-of-Service attacks have ranked among the top five most damaging network attacks over the past three years. Besides attacks that simply consume bandwidth, exploiting weaknesses in how software uses its resources is one of the techniques attackers rely on most. This study observes that traditional software testing methods cannot uncover such weaknesses effectively, because they are designed to verify a program's logical correctness; as a result, defects that are not logic errors, such as memory leaks, go undetected.
On the other hand, a number of new testing methods aimed at resource-exhaustion vulnerabilities have been proposed, but the results they produce are still quite primitive. This study therefore gives software testing a new definition from the perspective of resource usage and proposes three test criteria. Testers can combine these criteria with existing tools to examine a program's resource usage; guided by the criteria, they can identify unhealthy resource-usage behavior more effectively.
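To make the point above concrete, the following minimal C sketch (my own illustration, not code from the thesis) returns a functionally correct answer on every path, so an output-oriented test suite passes it, yet the oversized-request path never frees the buffer it allocated:

#include <stdlib.h>
#include <string.h>

/* Hypothetical example: validate a request string. Every return value is
 * functionally correct, so output-oriented tests pass, but the
 * oversized-request path leaks the temporary copy. */
int is_valid_request(const char *msg)
{
    char *copy = malloc(strlen(msg) + 1);
    if (copy == NULL)
        return 0;
    strcpy(copy, msg);

    if (strlen(copy) > 1024)   /* oversized request: reject it */
        return 0;              /* BUG: 'copy' is never freed on this path */

    int ok = (strncmp(copy, "GET ", 4) == 0);
    free(copy);
    return ok;
}

int main(void)
{
    char big[2048];
    memset(big, 'A', sizeof big - 1);
    big[sizeof big - 1] = '\0';
    /* Rejecting the oversized request is the expected result, so a logic
     * test passes, yet the 2048-byte copy has leaked. */
    return is_valid_request(big);
}

Statement or branch coverage can execute the leaking path without noticing anything wrong, because the defect only shows up as memory that is never returned; this is the kind of behavior that resource-oriented criteria are meant to expose.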
With the development of software and network environments, software has become more and more complex. Network attacks that exploit software vulnerabilities confront traditional software testing with a severe challenge. According to the CSI/FBI report, the losses caused by Denial-of-Service attacks have remained among the top five network attack categories over the past three years. Besides attacks that consume network bandwidth, the most common technique is to exploit software vulnerabilities. In my research, I found that traditional testing techniques cannot find these vulnerabilities efficiently, because they only verify the logical correctness of the software. This way of thinking bypasses many vulnerabilities that are not logical errors, such as memory leaks.
On the other hand, some testing techniques addressing resource usage vulnerabilities have been proposed in recent years, but their results are still very primitive. Thus, I try to give software testing a new definition based on resource usage analysis, and I propose three test criteria in this paper. Testers can combine these criteria with existing tools as a guide for testing the resource usage of a program. With these criteria, testers can find the unhealthy usage of software resources.
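The keyword list also names double free. As a second illustration (again my own hypothetical code, with an invented process_request function, not taken from the thesis), the sketch below shows the common case where an error path and the shared cleanup both release the same block:

#include <stdlib.h>

/* Hypothetical example: an error path frees a buffer and then falls
 * through to the shared cleanup, which frees it a second time. The
 * return value is still the expected one, so output checking alone
 * does not reveal the defect. */
int process_request(int fail)
{
    char *buf = malloc(256);
    if (buf == NULL)
        return -1;

    if (fail) {
        free(buf);   /* error path releases the buffer ... */
        /* BUG: missing early return (or buf = NULL), so control falls
         * through to the shared cleanup below */
    }

    free(buf);       /* ... and the shared cleanup releases it again */
    return fail ? -1 : 0;
}

int main(void)
{
    return process_request(1);   /* exercises the double-free path */
}

Run under a dynamic checker such as Valgrind (reference 8 below), the second free is reported as an invalid free even though the function returns the expected error code; this is the kind of create/dispose pairing that the criteria of Chapter 3 are intended to exercise systematically.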
1. Introduction 1
1.1. The problem of existing methods 4
1.2. The importance of this work 5
1.3. Paper organization 6
2. Related Works 7
2.1. Software testing 7
2.1.1. Structural Testing 8
2.1.1.1. Statement Coverage 8
2.1.1.2. Branch Coverage 9
2.1.1.3. Basic Path Coverage 10
2.1.2. Functional testing 10
2.1.3. Performance testing 11
2.1.4. Load testing 12
2.1.5. Stress testing 12
2.2. Software aging 13
2.3. Memory management dilemma 13
2.3.1. The memory management solutions 14
2.3.1.1. Garbage Collection (reference counting, tracing collector) 15
2.3.1.2. Dynamic testing 16
2.3.1.3. Static analysis 18
3. Resource-Oriented testing 19
3.1. The memory management problems in C language 19
3.1.1. Buffer overflow 19
3.1.2. Memory leak 20
3.1.3. Double free 21
3.2. Resource usage cycle 22
3.3. The functions related to memory usage life cycle 23
3.3.1. Create Function and Dispose Function 23
3.3.2. Modify function 24
3.4. The difference between logical testing and resource-oriented testing 25
3.4.1. Unobvious error feature 25
3.4.2. Reduced flow graph and path amount 26
3.4.3. Memory effect testing: Single test run & multiple test run 26
3.5. Resource-oriented test criteria 28
3.5.1. Basic test criterion 29
3.5.2. Create-dispose-free pair test criterion 31
3.6. Handling memory blocks among the functions 32
3.7. Combine the test technique with the criteria 33
4. Experiment Result 36
4.1. Experiment process 37
4.2. Memory leak patterns and case analysis 38
4.3. Analysis of double free case 44
4.4. Special case 46
4.5. Path complexity analysis 47
5. Conclusion 49
6. Research Constraint and Future work 50
7. Reference 51
1.Gelperin, D. and B. Hetzel, The growth of software testing. 1988, ACM Press. p. 687-695.
2.Myers, G.J., et al., The Art of Software Testing. 2004: John Wiley and Sons.
3.Hetzel, B., The complete guide to software testing. 1988: QED Information Sciences, Inc. Wellesley, MA, USA.
4.United States Department of Justice. Available from: http://www.usdoj.gov/criminal/cybercrime/index.html.
5.Bush, W.R., J.D. Pincus, and D.J. Sielaff, A static analyzer for finding dynamic programming errors. 2000. p. 775-802.
6.Dawson, E., et al., Bugs as deviant behavior: a general approach to inferring errors in systems code, in Proceedings of the eighteenth ACM symposium on Operating systems principles. 2001, ACM Press: Banff, Alberta, Canada.
7.Hastings, R. and B. Joyce, Purify: Fast detection of memory leaks and access errors. 1992. p. 125-136.
8.Valgrind. Available from: http://www.valgrind.org/.
9.Evans, D., Static detection of dynamic memory errors. 1996, ACM Press. p. 44-53.
10.Chen, P.-K., An Automated Method for Resource Testing. 2006.
11.Beizer, B., Software testing techniques. 1990: Van Nostrand Reinhold Co. New York, NY, USA.
12.Ntafos, S.C., A comparison of some structural testing strategies. Software Engineering, IEEE Transactions on, 1988. 14(6): p. 868-874.
13.Apache Foundation. Available from: http://www.apache.org/.
14.joedog.org. Available from: http://www.joedog.org/.
15.httperf home page. Available from: http://www.hpl.hp.com/research/linux/httperf/.
16.Standard Performance Evaluation Corporation. Available from: http://www.spec.org/.
17.Transaction Processing Performance Council. Available from: http://www.tpc.org/.
18.McGee, P. and C. Kaner, Experiments with High Volume Test Automation.
19.Kaner, C., W. Bond, and P. McGee, High Volume Test Automation. 2004.
20.Berndt, D.J. and A. Watkins, High Volume Software Testing using Genetic Algorithms. 2005, IEEE Computer Society Washington, DC, USA.
21.Briand, L., Y. Labiche, and M. Shousha, Using genetic algorithms for early schedulability analysis and stress testing in real-time systems. Genetic Programming and Evolvable Machines, 2006. 7(2): p. 145-170.
22.Jian Zhang, S.C.C., Automated test case generation for the stress testing of multimedia systems. 2002. p. 1411-1435.
23.Garg, S., et al. A methodology for detection and estimation of software aging. 1998.
24.Huang, Y., et al. Software Rejuvenation: Analysis, Module and Applications. 1995.
25.Vaidyanathan, K. and K.S. Trivedi. A measurement-based model for estimation of resource exhaustion in operational software systems. 1999.
26.Chillarege, R., S. Biyani, and J. Rosenthal, Measurement of Failure Rate in Widely Distributed Software. 1995. p. 424–433.
27.Iyer, R.K. and D.J. Rossetti, Effect of System Workload on Operating System Reliability: A Study on IBM 3081. Software Engineering, IEEE Transactions on, 1985. SE-11(12): p. 1438-1448.
28.Lee, I., R.K. Iyer, and A. Mehta. Identifying software problems using symptoms. 1994.
29.Iyer, R.K., L.T. Young, and P.V.K. Iyer, Automatic recognition of intermittent failures: an experimental study of field data. Computers, IEEE Transactions on, 1990. 39(4): p. 525-537.
30.Tang, D. and R.K. Iyer, Dependability measurement and modeling of a multicomputer system. Computers, IEEE Transactions on, 1993. 42(1): p. 62-75.
31.Thakur, A. and R.K. Iyer, Analyze-NOW-an environment for collection and analysis of failures in a network of workstations. Reliability, IEEE Transactions on, 1996. 45(4): p. 561-570.
32.Common Vulnerabilities and Exposures. Available from: http://cve.mitre.org/.
33.Cowan, C., et al., StackGuard: Automatic Adaptive Detection and Prevention of Buffer-Overflow Attacks.
34.TWCERT/CC. Available from: http://www.cert.org.tw/.
35.Jones, R. and R. Lins, Garbage collection: algorithms for automatic dynamic memory management. 1996: John Wiley & Sons, Inc. New York, NY, USA.
36.Blackburn, S.M. and K.S. McKinley, Ulterior reference counting: fast garbage collection without a long wait, in Proceedings of the 18th annual ACM SIGPLAN conference on Object-oriented programming, systems, languages, and applications. 2003, ACM Press: Anaheim, California, USA.
37.Erickson, C., Memory leak detection in embedded systems. 2002, Specialized Systems Consultants, Inc. Seattle, WA, USA.
38.Evans, D., et al., LCLint: a tool for using specifications to check code. 1994, ACM Press New York, NY, USA. p. 87-96.
39.Cowan, C., et al., PointGuard™: Protecting Pointers From Buffer Overflow Vulnerabilities.