Chang, C.-T., Huang, C.-C., Yang, C.-Y., & Hsu, J. Y.-J. (2018). A hybrid word-character approach to abstractive summarization. arXiv preprint arXiv:1802.09968.
Chen, W.-F., Wachsmuth, H., Al Khatib, K., & Stein, B. (2018). Learning to flip the bias of news headlines. Paper presented at the Proceedings of the 11th International Conference on Natural Language Generation.
Chen, Y.-C., & Bansal, M. (2018). Fast abstractive summarization with reinforce-selected sentence rewriting. arXiv preprint arXiv:1805.11080.
Dai, Z., Yang, Z., Yang, Y., Cohen, W. W., Carbonell, J., Le, Q. V., & Salakhutdinov, R. (2019). Transformer-XL: Attentive language models beyond a fixed-length context. arXiv preprint arXiv:1901.02860.
Du, J., Xu, R., He, Y., & Gui, L. (2017). Stance classification with target-specific neural attention networks.
Fan, A., Grangier, D., & Auli, M. (2017). Controllable abstractive summarization. arXiv preprint arXiv:1711.05217.
Fu, Z., Tan, X., Peng, N., Zhao, D., & Yan, R. (2018). Style transfer in text: Exploration and evaluation. Paper presented at the Thirty-Second AAAI Conference on Artificial Intelligence.
Gavrilov, D., Kalaidin, P., & Malykh, V. (2019). Self-attentive model for headline generation. Paper presented at the European Conference on Information Retrieval.
Gehring, J., Auli, M., Grangier, D., Yarats, D., & Dauphin, Y. N. (2017). Convolutional sequence to sequence learning. Paper presented at the Proceedings of the 34th International Conference on Machine Learning - Volume 70.
Hsu, W.-T., Lin, C.-K., Lee, M.-Y., Min, K., Tang, J., & Sun, M. (2018). A unified model for extractive and abstractive summarization using inconsistency loss. arXiv preprint arXiv:1805.06266.
Hu, B., Chen, Q., & Zhu, F. (2015). LCSTS: A large scale Chinese short text summarization dataset. arXiv preprint arXiv:1506.05865.
Kågebäck, M., Mogren, O., Tahmasebi, N., & Dubhashi, D. (2014). Extractive summarization using continuous vector space models. Paper presented at the Proceedings of the 2nd Workshop on Continuous Vector Space Models and their Compositionality (CVSC).
Liao, Y., Bing, L., Li, P., Shi, S., Lam, W., & Zhang, T. (2018). QuaSE: Sequence editing under quantifiable guidance. Paper presented at the Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing.
Liu, L., Lu, Y., Yang, M., Qu, Q., Zhu, J., & Li, H. (2018). Generative adversarial network for abstractive text summarization. Paper presented at the Thirty-Second AAAI Conference on Artificial Intelligence.
Nallapati, R., Zhou, B., Gulcehre, C., & Xiang, B. (2016). Abstractive text summarization using sequence-to-sequence RNNs and beyond. arXiv preprint arXiv:1602.06023.
Nallapati, R., Zhou, B., & Ma, M. (2016). Classify or select: Neural architectures for extractive document summarization. arXiv preprint arXiv:1611.04244.
Paulus, R., Xiong, C., & Socher, R. (2017). A deep reinforced model for abstractive summarization. arXiv preprint arXiv:1705.04304.
Rush, A. M., Chopra, S., & Weston, J. (2015). A neural attention model for abstractive sentence summarization. arXiv preprint arXiv:1509.00685.
See, A., Liu, P. J., & Manning, C. D. (2017). Get to the point: Summarization with pointer-generator networks. arXiv preprint arXiv:1704.04368.
Sutskever, I., Vinyals, O., & Le, Q. V. (2014). Sequence to sequence learning with neural networks. Paper presented at the Advances in Neural Information Processing Systems.
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., . . . Polosukhin, I. (2017). Attention is all you need. Paper presented at the Advances in Neural Information Processing Systems.
Zhu, J.-Y., Park, T., Isola, P., & Efros, A. A. (2017). Unpaired image-to-image translation using cycle-consistent adversarial networks. Paper presented at the Proceedings of the IEEE International Conference on Computer Vision.