Author: 丁美彤
Author (English): TING, MEI-TONG
Title: 應用機器學習與空間之鄰域特徵於房價預測
Title (English): The Application of Machine Learning with Neighborhood Feature of Space for House Price Prediction
Advisor: 李文毅
Advisor (English): LEE, WEN-YI
Committee members: 鄭宇翔, 許育峯
Oral defense date: 2020-07-04
Degree: Master's
Institution: Shih Hsin University (世新大學)
Department: Graduate Institute of Finance (including the in-service master's program)
Discipline: Business and Management
Field of study: Finance
Document type: Academic thesis
Publication year: 2020
Graduation academic year: 108 (2019–2020)
Language: Chinese
Pages: 57
Keywords (Chinese): 機器學習 (machine learning); 空間之鄰域特徵 (spatial neighborhood features); 房價預測 (house price prediction)
Keywords (English): Machine Learning; Spatial Factors; House Price Prediction
Cited by: 0
Views: 67
Downloads: 12
Bookmarked: 0
Under the traditional belief that land ownership is the foundation of wealth, real estate is characterized by immovability, irreplaceability (heterogeneity), durability, value appreciation, and a dual investment and consumption nature (diversity of uses). Accurate house price prediction facilitates transactions in the housing market and gives participants in the real estate market an objective standard for appraising house prices. Most previous studies on house price prediction rely on regression models and consider macroeconomic factors, time factors, individual housing attributes, and characteristics of the buyers themselves. However, the importance of spatial neighborhood features has long been neglected. This study therefore incorporates spatial neighborhood features and applies widely used machine learning algorithms (Regression, Ridge Regression, Random Forest, Support Vector Machine, Adaptive Boosting, and Gradient Boosting) to house price prediction. Focusing on the Taipei housing market, we use statistical indicators of the city's 12 administrative districts as neighborhood features to verify their importance. The results confirm that spatial factors can effectively improve the accuracy of house price prediction.
Real estate is characterized by immovability, irreplaceability (heterogeneity), durability, and appreciation in value, and serves both investment and consumption purposes (diversity of uses). Accurate house price prediction has become crucial, since an objective valuation standard supports transactions in the housing market. Most house price prediction models are regression-based and consider macroeconomic factors, time factors, and individual factors such as the buyer and seller. However, spatial characteristics have been neglected. In this study, we combine spatial factors with machine learning algorithms (Regression, Ridge Regression, Random Forest, Support Vector Machine, Adaptive Boosting, and Gradient Boosting) for house price prediction. We focus on house prices in Taipei, Taiwan, and generate the spatial factors from the twelve administrative districts of Taipei City. The experimental results show that the spatial factors improve the accuracy of house price prediction.
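The experimental design summarized in the abstract, running the same regressors with and without a neighborhood feature and comparing prediction error, can be sketched roughly as follows. This is a minimal illustration on synthetic data: the feature names, the per-district price premium, and the MAPE metric are assumptions for the sketch, not the thesis's actual dataset, variables, or code.

```python
# Sketch: compare the abstract's six regressors with and without a spatial
# neighborhood feature. All data below is synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import (AdaBoostRegressor, GradientBoostingRegressor,
                              RandomForestRegressor)
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 1000
area = rng.uniform(20, 200, n)        # floor area (assumed feature)
age = rng.uniform(0, 40, n)           # building age in years (assumed feature)
district = rng.integers(0, 12, n)     # one of Taipei's 12 administrative districts
premium = rng.uniform(0.8, 2.0, 12)   # hypothetical per-district price level
price = area * 30 * premium[district] - age * 5 + rng.normal(0, 20, n)

X_base = np.column_stack([area, age])                        # Experiment 1: no spatial factor
X_spatial = np.column_stack([area, age, premium[district]])  # Experiment 2: with spatial factor

models = {
    "Regression": LinearRegression(),
    "Ridge Regression": Ridge(),
    "Random Forest": RandomForestRegressor(random_state=0),
    "Support Vector Machine": SVR(),
    "Adaptive Boosting": AdaBoostRegressor(random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
}

results = {}
for name, model in models.items():
    errors = []
    for X in (X_base, X_spatial):
        X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
        model.fit(X_tr, y_tr)
        errors.append(mean_absolute_percentage_error(y_te, model.predict(X_te)))
    results[name] = errors
    print(f"{name}: MAPE without spatial factor {errors[0]:.3f}, with {errors[1]:.3f}")
```

Because the synthetic prices depend on the district premium, the models given the spatial column should show a lower error, which mirrors the direction of the thesis's reported finding.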
Acknowledgments I
Abstract (Chinese) II
Abstract III
Table of Contents IV
List of Figures V
List of Tables VI
Chapter 1 Introduction 1
1-1 Research Background and Motivation 1
1-2 Research Objectives 3
1-3 Research Contributions 3
1-4 Research Framework and Procedure 3
Chapter 2 Literature Review 5
2-1 Review of Real Estate Prediction Studies 5
2-2 Review of Variables Affecting Real Estate Prices 14
2-3 Review of Machine Learning and Deep Learning Techniques 20
Chapter 3 Methodology 27
3-1 Data Sources and Analysis 27
3-2 Research Framework 34
Chapter 4 Empirical Analysis 40
4-1 Experiment 1: Prediction Results Without Spatial Factors 40
4-2 Experiment 2: Prediction Results With Spatial Factors 41
4-3 Comparison of Experimental Results 42
Chapter 5 Conclusions and Suggestions 44
5-1 Conclusions 44
5-2 Suggestions for Future Work 44
References 46


List of Figures

Figure 1 Research framework flowchart 4

Figure 2 Scatter plot showing the decision boundary of a linear SVM kernel 22

Figure 3 SVM margin maximization for classification 23

Figure 4 Soft-margin loss setting of a linear SVM 24

Figure 5 Weight adjustment during AdaBoost model training 24

Figure 6 Error-correction process of Gradient Boosting 26

Figure 7 Land area of Taipei City at the end of 2018 (ROC year 107) 30

Figure 8 Experimental design 37

List of Tables

Table 2-1 Review of real estate prediction studies 6

Table 2-2 Review of variables affecting real estate prices 16

Table 3-1 House price features from the actual price registration data 29

Table 3-2 Additional house price features derived from the actual price registration data 29

Table 3-3 House price features compiled from 2016 (ROC year 105) DGBAS data 31

Table 3-4 Variable codes 33

Table 3-5 Independent variable codes for Experiments 1 and 2 35

Table 4-1 Experiment 1: prediction results without neighboring spatial factors 40

Table 4-2 Experiment 2: prediction results with neighboring spatial factors 41

Table 4-3 Analysis of average experimental results 43




References

1. 臺北市地價問題研究 [A Study of Land Price Problems in Taipei City], 人人文庫, Taipei, pp. 49–105 (1980).

2. 顏聰玲: 不動產市場分析與預測 [Real Estate Market Analysis and Forecasting] (1st ed.), 新文京開發出版 (2004).

3. Frew J., and G. D. Jud, 2003, “Estimating The Value of Apartment Buildings”, The Journal of Real Estate Research, 25(1): 77 - 86.

4. Calhoun C. A., 2003, “Property Valuation Models and House Price Indexes for The Provinces of Thailand: 1992 – 2000”, Housing Finance International, 17(3): 31 – 41

5. Limsombunchai, V., House Price Prediction: Hedonic Price Model vs. Artificial Neural Network. 2004.

6. Pow, N., Janulewicz, E. & Liu, L. (2014). Applied Machine Learning Project 4 Prediction of real estate property prices in Montreal, Available:
http://rl.cs.mcgill.ca/comp598/fall2014/comp598_submission_99.pdf.

7. Bradford Case et al.: Modeling Spatial and Temporal House Price Patterns: A Comparison of Four Models, 2004.

8. Abdul G. Sarip and Muhammad Burhan Hafez. Fuzzy Logic Application for House Price Prediction. 2015.

9. Jiaoyang Wu. Housing Price Prediction Using Support Vector Regression. 2017.

10. Sendhil Mullainathan and Jann Spiess: Machine Learning: An Applied Econometric Approach, 2017.

11. Kahn, J.: What drives housing prices? Federal Reserve Bank of New York Staff Reports, New York, USA, (2008).

12. Lowrance, E.R.: Predicting the market value of single-family residential real estate. 1st edn. PhD diss., New York University, (2015).

13. Bork, L., Møller, S.V.: House price forecastability: a factor analysis. Real Estate Economics (2016).

14. David Stadelmann. Which factors capitalize into house prices? A Bayesian averaging approach. 2010.

15. Pardoe, I.: Modeling home prices using realtor data. 16(2), 1-9 (2008).

16. Limsombunchai, V., House price prediction: hedonic price model vs. artificial neural network. Lincoln University, NZ, (2004).

17. Sendhil Mullainathan and Jann Spiess, Machine Learning: An Applied Econometric Approach. 2017.

18. Ayush Varma, Abhijit Sarma, Sagar Doshi and Rohini Nair. House Price Prediction Using Machine Learning And Neural Networks. 2018

19. Adamantios Ntakaris, Giorgio Mirone, Juho Kanniainen, Moncef Gabbouj, and Alexandros Iosifidis, Feature Engineering for Mid-Price Prediction With Deep Learning. 2019.

20. Cheng Fan, Yongjun Sun, Yang Zhao, Mengjie Song, and Jiayuan Wang, Deep learning-based feature engineering methods for improved building energy prediction. 2019.

21. Susan Athey, Mohsen Bayati, Guido Imbens, and Zhaonan Qu., Ensemble Methods for Causal Effects in Panel Data Settings. 2019.

22. Guido Imbens and Khashayar Khosravi. Matrix Completion Methods for Causal Panel Data Models. 2018.

23. Heng Shi, Minghao Xu, and Ran Li, Member, Deep Learning for Household Load Forecasting—A Novel Pooling Deep RNN. 2018.

24. Maryam M. Najafabadi, Flavio Villanustre, Taghi M. Khoshgoftaar, Naeem Seliya, Randall Wald, and Edin Muharemagic, Deep learning applications and challenges in big data analytics. 2015.

25. Wei-Yin Loh, Classification and regression trees. 2011.

26. Tin Kam Ho, Random Decision Forests. 1995.

27. Robin Genuer, Jean-Michel Poggi, Christine Tuleau-Malot, Variable selection using Random Forests. 2012.

28. Leo Breiman, Random Forests. 2001.

29. J. A. K. Suykens and J. Vandewalle, Least Squares Support Vector Machine Classifiers. 1999.

30. Aurélien Géron, Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems (1st ed.). O'Reilly Media, Inc., 2017.

31. Kim, S. J., Koh, K., Lustig, M., Boyd, S., & Gorinevsky, D., An interior-point method for large-scale ℓ1-regularized least squares. IEEE Journal of Selected Topics in Signal Processing, 1(4), 606-617. 2007.

32. Tin Kam Ho. Random decision forests. Proceedings of 3rd International Conference on Document Analysis and Recognition (Montreal, Que., Canada: IEEE Comput. Soc. Press). 1995. 1: 278–282.

33. Tin Kam Ho, The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence. Aug. 1998, 20(8): 832–844.

34. Ho, Tin Kam, A Data Complexity Analysis of Comparative Advantages of Decision Forest Constructors. Pattern Analysis and Applications, 2002: 102–112.

35. Leo Breiman, Random Forests. Statistics Department, University of California, Berkeley, CA 94720, 2001.

36. Hastie, T., et al. The Elements of Statistical Learning. 2008. ISBN 0-387-95284-5.

37. Freund, Y. & Schapire, R. (1996). Experiments with a new boosting algorithm, Machine Learning: Proceedings of the Thirteenth International Conference, 148–156

38. Breiman, L. (1996). Bagging predictors. Machine Learning 24(2), 123–140.

39. Dietterich, T. (1998). An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting and randomization. Machine Learning, 1–22.

40. Breiman, L. 1999. Using adaptive bagging to debias regressions. Technical Report 547, Statistics Dept. UCB.

41. Ho, T. K. (1998). The random subspace method for constructing decision forests. IEEE Trans. On Pattern Analysis and Machine Intelligence, 20(8), 832–844.

42. Amit, Y. & Geman, D. (1997). Shape quantization and recognition with randomized trees. Neural Computation, 9, 1545–1588.

43. Cortes, C. and Vapnik, V., Support-vector networks. Machine Learning. 1995, 20(3): 273–297.

44. Ben-Hur, A., et al., Support vector clustering. Journal of Machine Learning Research. 2001, 2: 125–137.

45. Smola, A. J., & Schölkopf, B. (2004). A tutorial on support vector regression. Statistics and computing, 14(3), 199-222.

46. Schölkopf, B. and Smola, A.J. 2002. Learning with Kernels. MIT Press.

47. Freund, Y., & Schapire, R. E. (1995, March). A decision-theoretic generalization of on-line learning and an application to boosting. In European Conference on Computational Learning Theory (pp. 23-37). Springer, Berlin, Heidelberg.

48. Nick Littlestone and Manfred K. Warmuth. The weighted majority algorithm. Information and Computation, 108: 212–261, 1994.

49. Drucker, H. (1997, July). Improving regressors using boosting techniques. In ICML (Vol. 97, pp. 107-115).

50. Friedman, J. H. (1999). Stochastic gradient boosting. Computational statistics & data analysis, 38(4), 367-378.

51. Friedman, J. H. (2001). Greedy function approximation: a gradient boosting machine. Annals of statistics, 1189-1232.

