[1] James Allan. 2002. Introduction to topic detection and tracking. In Topic Detection and Tracking, pages 1–16.
[2] Paul M. Aoki, Margaret H. Szymanski, Luke Plurkowski, James D. Thornton, Allison Woodruff, and Weilie Yi. 2006. Where's the party in multi-party?: Analyzing the structure of small-group sociable talk. In CSCW'06. ACM, pages 393–402.
[3] Yoshua Bengio, Réjean Ducharme, Pascal Vincent, and Christian Janvin. 2003. A neural probabilistic language model. JMLR 3:1137–1155.
[4] Yann N. Dauphin, Angela Fan, Michael Auli, and David Grangier. 2016. Language modeling with gated convolutional networks. arXiv preprint arXiv:1612.08083.
[5] Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of deep bidirectional Transformers for language understanding. arXiv preprint arXiv:1810.04805.
[6] Micha Elsner and Eugene Charniak. 2008. You talking to me? A corpus and algorithm for conversation disentanglement. In ACL'08. ACL, pages 834–842.
[7] Micha Elsner and Eugene Charniak. 2010. Disentangling chat. Computational Linguistics 36(3):389–409.
[8] Micha Elsner and Eugene Charniak. 2011. Disentangling chat with local coherence models. In ACL-HLT'11. ACL, pages 1179–1189.
[9] Jyun-Yu Jiang, Francine Chen, Yan-Ying Chen, and Wei Wang. 2018. Learning to disentangle interleaved conversational threads with a Siamese hierarchical network and similarity ranking. In Proceedings of NAACL.
[10] Ryan Kiros, Yukun Zhu, Ruslan Salakhutdinov, Richard S. Zemel, Antonio Torralba, Raquel Urtasun, and Sanja Fidler. 2015. Skip-thought vectors. In Proceedings of NIPS.
[11] Lajanugen Logeswaran and Honglak Lee. 2018. An efficient framework for learning sentence representations. In ICLR.
[12] Ryan Lowe, Nissan Pow, Iulian Serban, and Joelle Pineau. 2015. The Ubuntu Dialogue Corpus: A large dataset for research in unstructured multi-turn dialogue systems. arXiv preprint arXiv:1506.08909.
[13] Elijah Mayfield, David Adamson, and Carolyn Penstein Rosé. 2012. Hierarchical conversation structure prediction in multi-party chat. In SIGDIAL'12. ACL, pages 60–69.
[14] Shikib Mehri and Giuseppe Carenini. 2017. Chat disentanglement: Identifying semantic reply relationships with random forests and recurrent neural networks. In IJCNLP'17, volume 1, pages 615–623.
[15] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. 2013. Efficient estimation of word representations in vector space. In ICLR Workshop Papers.
[16] Jeffrey Pennington, Richard Socher, and Christopher D. Manning. 2014. GloVe: Global vectors for word representation. In EMNLP.
[17] Matthew Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, and Luke Zettlemoyer. 2018. Deep contextualized word representations. In NAACL.
[18] Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever. 2018. Improving language understanding with unsupervised learning. Technical report, OpenAI.
[19] Dou Shen, Qiang Yang, Jian-Tao Sun, and Zheng Chen. 2006. Thread detection in dynamic text message streams. In SIGIR'06. ACM, pages 35–42.
[20] Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need. In Advances in Neural Information Processing Systems, pages 6000–6010.
[21] Lidan Wang and Douglas W. Oard. 2009. Context-based message expansion for disentanglement of interleaved text conversations. In NAACL'09. ACL, pages 200–208.
[22] TensorFlow. https://www.tensorflow.org/
[23] freenode. https://freenode.net/