[1] Picard R W. Affective computing[M]. Cambridge: MIT Press, 1997.
[2] Kiritchenko S, Mohammad S M. Capturing reliable fine-grained sentiment associations by crowdsourcing and best–worst scaling[C]//Proceedings of the 15th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL). San Diego, California, 2016.
[3] Pang B, Lee L. Opinion mining and sentiment analysis[J]. Foundations and Trends in Information Retrieval, 2008, 2(1-2): 1-135.
[4] Bradley M M, Lang P J. Affective norms for English words (ANEW): Instruction manual and affective ratings[R]. Technical Report C-1, The Center for Research in Psychophysiology, University of Florida, 1999.
[5] Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality[C]//Advances in Neural Information Processing Systems. 2013: 3111-3119.
[6] Mikolov T, Chen K, Corrado G, et al. Efficient estimation of word representations in vector space[J]. arXiv preprint arXiv:1301.3781, 2013.
[7] Astudillo R F, Amir S, Ling W, et al. INESC-ID: A regression model for large scale Twitter sentiment lexicon induction[C]//Proceedings of SemEval-2015. 2015: 613.
[8] Warriner A B, Kuperman V, Brysbaert M. Norms of valence, arousal, and dominance for 13,915 English lemmas[J]. Behavior Research Methods, 2013, 45(4): 1191-1207.
[9] Yogatama D, Faruqui M, Dyer C, et al. Learning word representations with hierarchical sparse coding[C]//Proceedings of ICML. 2015.
[10] Faruqui M, Dodge J, Jauhar S K, et al. Retrofitting word vectors to semantic lexicons[J]. arXiv preprint arXiv:1411.4166, 2014.
[11] Taboada M, Brooke J, Tofiloski M, et al. Lexicon-based methods for sentiment analysis[J]. Computational Linguistics, 2011, 37(2): 267-307.
[12] Goldberg Y, Levy O. word2vec explained: Deriving Mikolov et al.'s negative-sampling word-embedding method[J]. arXiv preprint arXiv:1402.3722, 2014.
[13] Tang D, Wei F, Qin B, et al. Building large-scale Twitter-specific sentiment lexicon: A representation learning approach[C]//COLING. 2014: 172-182.
[14] Kiritchenko S, Mohammad S M. Capturing reliable fine-grained sentiment associations by crowdsourcing and best–worst scaling[C]//Proceedings of the 15th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL). San Diego, California, 2016.
[15] Kiritchenko S, Mohammad S M. The effect of negators, modals, and degree adverbs on sentiment composition[C]//Proceedings of the Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (WASSA). 2016.
[16] Lin H, Guo S H. Characteristics, scope, and classification of degree adverbs[J]. Journal of Shanxi University (Philosophy and Social Science Edition), 2003(02). (In Chinese.)
[17] Mikolov T. Statistical language models based on neural networks[R]. Presentation at Google, Mountain View, April 2, 2012.
[18] Turian J, Ratinov L, Bengio Y. Word representations: A simple and general method for semi-supervised learning[C]//Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics, 2010: 384-394.
[19] Lin J C, Wu C H, Wei W L. Error weighted semi-coupled hidden Markov model for audio-visual emotion recognition[J]. IEEE Transactions on Multimedia, 2012, 14(1): 142-156.
[20] Bengio Y, Ducharme R, Vincent P, et al. A neural probabilistic language model[J]. Journal of Machine Learning Research, 2003, 3(Feb): 1137-1155.
[21] Huang E H, Socher R, Manning C D, et al. Improving word representations via global context and multiple word prototypes[C]//Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics: Long Papers, Volume 1. Association for Computational Linguistics, 2012: 873-882.
[22] Bian J, Gao B, Liu T Y. Knowledge-powered deep learning for word embedding[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer Berlin Heidelberg, 2014: 132-148.
[23] Faruqui M, Dyer C. Improving vector space word representations using multilingual correlation[C]//Proceedings of EACL. Association for Computational Linguistics, 2014.
[24] Xu C, Bai Y, Bian J, et al. RC-NET: A general framework for incorporating knowledge into word representations[C]//Proceedings of the 23rd ACM International Conference on Information and Knowledge Management. ACM, 2014: 1219-1228.
[25] Yu M, Dredze M. Improving lexical embeddings with semantic knowledge[C]//ACL (2). 2014: 545-550.
[26] Botha J A, Blunsom P. Compositional morphology for word representations and language modelling[C]//ICML. 2014: 1899-1907.
[27] Levy O, Goldberg Y. Dependency-based word embeddings[C]//ACL (2). 2014: 302-308.
[28] Luong T, Socher R, Manning C D. Better word representations with recursive neural networks for morphology[C]//CoNLL. 2013: 104-113.
[29] Pennington J, Socher R, Manning C D. GloVe: Global vectors for word representation[C]//EMNLP. 2014: 1532-1543.