1. A Study of Land Price Problems in Taipei City (臺北市地價問題研究), 人人文庫, Taipei, pp. 49–105 (1980).
2. 顏聰玲: Real Estate Market Analysis and Forecasting (不動產市場分析與預測), 1st edn., 新文京開發出版 (New Wun Ching Developmental Publishing), (2004).
3. Frew, J., and G. D. Jud, 2003, “Estimating The Value of Apartment Buildings”, The Journal of Real Estate Research, 25(1): 77–86.
4. Calhoun, C. A., 2003, “Property Valuation Models and House Price Indexes for The Provinces of Thailand: 1992–2000”, Housing Finance International, 17(3): 31–41.
5. Limsombunchai, V. House Price Prediction: Hedonic Price Model vs. Artificial Neural Network. 2004.
6. Pow, N., Janulewicz, E. & Liu, L. (2014). Applied Machine Learning Project 4: Prediction of real estate property prices in Montreal. Available: http://rl.cs.mcgill.ca/comp598/fall2014/comp598_submission_99.pdf.
7. Bradford Case et al.: Modeling Spatial and Temporal House Price Patterns: A Comparison of Four Models, 2004.
8. Abdul G. Sarip and Muhammad Burhan Hafez. Fuzzy Logic Application for House Price Prediction. 2015.
9. Jiaoyang Wu. Housing Price Prediction Using Support Vector Regression. 2017.
10. Sendhil Mullainathan and Jann Spiess: Machine Learning: An Applied Econometric Approach, 2017.
11. Kahn, J.: What drives housing prices? Federal Reserve Bank of New York Staff Reports, New York, USA, (2008).
12. Lowrance, E.R.: Predicting the market value of single-family residential real estate. PhD dissertation, New York University, (2015).
13. Bork, L., Møller, S.V.: House price forecastability: a factor analysis. Real Estate Economics (2016).
14. David Stadelmann. Which factors capitalize into house prices? A Bayesian averaging approach. 2010.
15. Pardoe, I.: Modeling home prices using realtor data. Journal of Statistics Education, 16(2), 1-9 (2008).
16. Limsombunchai, V., House price prediction: hedonic price model vs. artificial neural network. Lincoln University, NZ, (2004).
17. Sendhil Mullainathan and Jann Spiess, Machine Learning: An Applied Econometric Approach. 2017.
18. Ayush Varma, Abhijit Sarma, Sagar Doshi and Rohini Nair. House Price Prediction Using Machine Learning and Neural Networks. 2018.
19. Adamantios Ntakaris, Giorgio Mirone, Juho Kanniainen, Moncef Gabbouj, and Alexandros Iosifidis. Feature Engineering for Mid-Price Prediction with Deep Learning. 2019.
20. Cheng Fan, Yongjun Sun, Yang Zhao, Mengjie Song, Jiayuan Wang. Deep learning-based feature engineering methods for improved building energy prediction. 2019.
21. Susan Athey, Mohsen Bayati, Guido Imbens, and Zhaonan Qu. Ensemble Methods for Causal Effects in Panel Data Settings. 2019.
22. Guido Imbens and Khashayar Khosravi. Matrix Completion Methods for Causal Panel Data Models. 2018.
23. Heng Shi, Minghao Xu, and Ran Li. Deep Learning for Household Load Forecasting—A Novel Pooling Deep RNN. 2018.
24. Maryam M. Najafabadi, Flavio Villanustre, Taghi M. Khoshgoftaar, Naeem Seliya, Randall Wald and Edin Muharemagic. Deep learning applications and challenges in big data analytics. 2015.
25. Wei-Yin Loh. Classification and regression trees. 2011.
26. Wei-Yin Loh, Random Decision Forests. 1995.
27. Robin Genuer, Jean-Michel Poggi, Christine Tuleau-Malot, Variable selection using Random Forests. 2012.
28. Leo Breiman, Random Forests. 2001.
29. J.A.K. Suykens and J. Vandewalle, Least Squares Support Vector Machine Classifiers. 1999.
30. Aurélien Géron, Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems (1st ed.). O’Reilly Media, Inc., 2017.
31. Kim, S. J., Koh, K., Lustig, M., Boyd, S., & Gorinevsky, D., An interior-point method for large-scale $\ell_1$-regularized least squares. IEEE Journal of Selected Topics in Signal Processing, 1(4), 606-617. 2007.
32. Tin Kam Ho. Random decision forests. Proceedings of 3rd International Conference on Document Analysis and Recognition (Montreal, Que., Canada: IEEE Comput. Soc. Press). 1995. 1: 278–282.
33. Tin Kam Ho, The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence. Aug. 1998, 20(8): 832–844.
34. Ho, Tin Kam, A Data Complexity Analysis of Comparative Advantages of Decision Forest Constructors. Pattern Analysis and Applications, 2002, pp. 102–112.
35. Leo Breiman, Random Forests. Statistics Department, University of California, Berkeley, CA 94720, 2001.
36. Hastie, T., et al. The Elements of Statistical Learning. 2008. ISBN 0-387-95284-5.
37. Freund, Y. & Schapire, R. (1996). Experiments with a new boosting algorithm, Machine Learning: Proceedings of the Thirteenth International Conference, 148–156.
38. Breiman, L. (1996a). Bagging predictors. Machine Learning 24(2), 123–140.
39. Dietterich, T. (1998). An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting and randomization, Machine Learning, 1–22.
40. Breiman, L. 1999. Using adaptive bagging to debias regressions. Technical Report 547, Statistics Dept. UCB.
41. Ho, T. K. (1998). The random subspace method for constructing decision forests. IEEE Trans. on Pattern Analysis and Machine Intelligence, 20(8), 832–844.
42. Amit, Y. & Geman, D. (1997). Shape quantization and recognition with randomized trees. Neural Computation, 9, 1545–1588.
43. Cortes, C. and Vapnik, V., Support-vector networks. Machine Learning. 1995, 20(3): 273–297.
44. Ben-Hur, A., et al., Support vector clustering. Journal of Machine Learning Research. 2001, 2: 125–137.
45. Smola, A. J., & Schölkopf, B. (2004). A tutorial on support vector regression. Statistics and computing, 14(3), 199-222.
46. Schölkopf, B. and Smola, A.J. 2002. Learning with Kernels. MIT Press.
47. Freund, Y., & Schapire, R. E. (1995, March). A desicion-theoretic generalization of on-line learning and an application to boosting. In European conference on computational learning theory (pp. 23-37). Springer, Berlin, Heidelberg.
48. Nick Littlestone and Manfred K. Warmuth. The weighted majority algorithm. Information and Computation, 108: 212–261, 1994.
49. Drucker, H. (1997, July). Improving regressors using boosting techniques. In ICML (Vol. 97, pp. 107-115).
50. Friedman, J. H. (2002). Stochastic gradient boosting. Computational Statistics & Data Analysis, 38(4), 367-378.
51. Friedman, J. H. (2001). Greedy function approximation: a gradient boosting machine. Annals of Statistics, 29(5), 1189-1232.