[1] T. Joachims, "Text categorization with support vector machines: Learning with many relevant features", ECML, Berlin: Springer, pp. 137–142, 1998.
[2] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting", J. Comput. Syst. Sci., vol. 55, no. 1, pp. 119–139, 1997.
[3] R. E. Schapire and Y. Singer, "BoosTexter: A boosting-based system for text categorization", Machine Learning, vol. 39, no. 2/3, pp. 135–168, 2000.
[4] R. E. Schapire and Y. Singer, "Improved boosting algorithms using confidence-rated predictions", Machine Learning, vol. 37, pp. 297–336, December 1999.
[5] D. W. Hosmer and S. Lemeshow, Applied Logistic Regression, 2nd ed., Wiley, 2000.
[6] A. McCallum and K. Nigam, "A comparison of event models for naive Bayes text classification", AAAI-98 Workshop on Learning for Text Categorization, AAAI Press, pp. 41–48, 1998.
[7] T. Joachims, "Transductive inference for text classification using support vector machines", ICML, 1999.
[8] A. B. Goldberg and X. Zhu, "Seeing stars when there aren't many stars: Graph-based semi-supervised learning for sentiment categorization", Proceedings of the First Workshop on Graph Based Methods for Natural Language Processing (TextGraphs-1), Association for Computational Linguistics, Stroudsburg, PA, USA, pp. 45–52, 2006.
[9] A. Blum and S. Chawla, "Learning from labeled and unlabeled data using graph mincuts", ICML, Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, pp. 19–26, 2001.
[10] C. L. Liu and S. Y. Fang, "Text classification with labeled and unlabeled data using semi-supervised AdaBoost.MH", submitted to Information Processing & Management.
[11] J. Weston, R. Collobert, F. H. Sinz, L. Bottou, and V. Vapnik, "Inference with the universum", ICML, pp. 1009–1016, 2006.
[12] F. H. Sinz, O. Chapelle, A. Agarwal, and B. Schölkopf, "An analysis of inference with the universum", NIPS, 2007.
[13] C. Cortes and V. Vapnik, "Support-vector networks", Machine Learning, vol. 20, 1995.
[14] K. Huang, Z. Xu, I. King, and M. R. Lyu, "Semi-supervised learning from general unlabeled data", ICML, pp. 273–282, 2008.
[15] J. M. Bernardo and A. F. M. Smith, Bayesian Theory, John Wiley and Sons, 1994.
[16] F. Sinz and M. Roffilli, UniverSVM, software available at http://mloss.org/software/view/19/.
[17] D. Zhang, J. Wang, F. Wang, and C. Zhang, "Semi-supervised classification with universum", SDM, pp. 323–333, 2008.
[18] T. Joachims, "Transductive learning via spectral graph partitioning", ICML, pp. 290–297, 2003.
[19] B. Schölkopf, J. Platt, and T. Hoffman, "On transductive regression", Advances in Neural Information Processing Systems 19, 2006.
[20] D. Zhang, J. Wang, and L. Si, "Document clustering with universum", ACM SIGIR, 2011.
[21] L. Xu, J. Neufeld, B. Larson, and D. Schuurmans, "Maximum margin clustering", NIPS, 2004.
[22] V. Vapnik, The Nature of Statistical Learning Theory, New York, NY, USA: Springer-Verlag New York, Inc., 1995.