References

Bengio, Y. (2009). Learning Deep Architectures for AI. Foundations and Trends® in Machine Learning, 2(1), 1–127.
Bengio, Y., Courville, A., & Vincent, P. (2012). Representation Learning: A Review and New Perspectives. arXiv:1206.5538 [Cs]. Retrieved from http://arxiv.org/abs/1206.5538
Bengio, Y., Simard, P., & Frasconi, P. (1994). Learning long-term dependencies with gradient descent is difficult. IEEE Transactions on Neural Networks, 5(2), 157–166.
Bordes, A., Chopra, S., & Weston, J. (2014). Question Answering with Subgraph Embeddings. arXiv:1406.3676 [Cs]. Retrieved from http://arxiv.org/abs/1406.3676
Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., & Kuksa, P. (2011). Natural Language Processing (Almost) from Scratch. Journal of Machine Learning Research, 12, 2493–2537.
Deng, L., & Yu, D. (2014). Deep Learning: Methods and Applications. Foundations and Trends® in Signal Processing, 7(3–4), 197–387.
Graves, A. (2013). Generating Sequences With Recurrent Neural Networks. arXiv:1308.0850 [Cs]. Retrieved from http://arxiv.org/abs/1308.0850
Graves, A., & Jaitly, N. (2014). Towards End-to-End Speech Recognition with Recurrent Neural Networks. In Proceedings of the 31st International Conference on Machine Learning (ICML-14) (pp. 1764–1772). Retrieved from http://machinelearning.wustl.edu/mlpapers/papers/icml2014c2_graves14
Graves, A., Liwicki, M., Fernández, S., Bertolami, R., Bunke, H., & Schmidhuber, J. (2009). A Novel Connectionist System for Unconstrained Handwriting Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(5), 855–868.
Graves, A., Mohamed, A.-R., & Hinton, G. (2013). Speech Recognition with Deep Recurrent Neural Networks. arXiv:1303.5778 [Cs]. Retrieved from http://arxiv.org/abs/1303.5778
Green, B. F., Jr., Wolf, A. K., Chomsky, C., & Laughery, K. (1961). Baseball: An Automatic Question-Answerer. In Papers Presented at the May 9-11, 1961, Western Joint IRE-AIEE-ACM Computer Conference (pp. 219–224). New York, NY, USA: ACM.
Green, B., Wolf, A., Chomsky, C., & Laughery, K. (1986). Baseball: An Automatic Question-Answerer. In B. J. Grosz, K. Sparck-Jones, & B. L. Webber (Eds.), Readings in Natural Language Processing (pp. 545–549). San Francisco, CA, USA: Morgan Kaufmann Publishers Inc. Retrieved from http://dl.acm.org/citation.cfm?id=21922.24354
Han, L., Yu, Z.-T., Qiu, Y.-X., Meng, X.-Y., Guo, J.-Y., & Si, S.-T. (2008). Research on passage retrieval using domain knowledge in Chinese question answering system. In 2008 International Conference on Machine Learning and Cybernetics (Vol. 5, pp. 2603–2606).
Hao, X., Chang, X., & Liu, K. (2007). A Rule-based Chinese Question Answering System for Reading Comprehension Tests. In Third International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIHMSP 2007) (Vol. 2, pp. 325–329).
Hihi, S. E., & Bengio, Y. (1996). Hierarchical Recurrent Neural Networks for Long-Term Dependencies. In D. S. Touretzky & M. E. Hasselmo (Eds.), Advances in Neural Information Processing Systems 8 (pp. 493–499). MIT Press. Retrieved from http://papers.nips.cc/paper/1102-hierarchical-recurrent-neural-networks-for-long-term-dependencies.pdf
Hochreiter, S., & Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, 9(8), 1735–1780.
Hu, B., Lu, Z., Li, H., & Chen, Q. (2014). Convolutional Neural Network Architectures for Matching Natural Language Sentences. In Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, & K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 27 (pp. 2042–2050). Curran Associates, Inc. Retrieved from http://papers.nips.cc/paper/5550-convolutional-neural-network-architectures-for-matching-natural-language-sentences.pdf
Huang, J., Zhou, M., & Yang, D. (2007). Extracting Chatbot Knowledge from Online Discussion Forums. In Proceedings of the 20th International Joint Conference on Artificial Intelligence (pp. 423–428). San Francisco, CA, USA: Morgan Kaufmann Publishers Inc. Retrieved from http://dl.acm.org/citation.cfm?id=1625275.1625342
Ittycheriah, A., Franz, M., Zhu, W., Ratnaparkhi, A., & Mammone, R. J. (2001). IBM's Statistical Question Answering System. Retrieved from https://www.researchgate.net/publication/2875435_IBM’s_Statistical_Question_Answering_System
Jean, S., Cho, K., Memisevic, R., & Bengio, Y. (2014). On Using Very Large Target Vocabulary for Neural Machine Translation. arXiv:1412.2007 [Cs]. Retrieved from http://arxiv.org/abs/1412.2007
Kiros, R., Salakhutdinov, R., & Zemel, R. S. (2014). Unifying Visual-Semantic Embeddings with Multimodal Neural Language Models. arXiv:1411.2539 [Cs]. Retrieved from http://arxiv.org/abs/1411.2539
Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2012). ImageNet Classification with Deep Convolutional Neural Networks. In F. Pereira, C. J. C. Burges, L. Bottou, & K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 25 (pp. 1097–1105). Curran Associates, Inc. Retrieved from http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf
Meng, Y., Rumshisky, A., & Romanov, A. (2017). Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture. arXiv:1703.05851 [Cs]. Retrieved from http://arxiv.org/abs/1703.05851
Mikolov, T., Deoras, A., Povey, D., Burget, L., & Černocký, J. (2011). Strategies for training large scale neural network language models. In 2011 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU) (pp. 196–201).
Mikolov, T., Sutskever, I., Chen, K., Corrado, G., & Dean, J. (2013). Distributed Representations of Words and Phrases and their Compositionality. arXiv:1310.4546 [Cs, Stat]. Retrieved from http://arxiv.org/abs/1310.4546
Nguyen, M.-T., Phan, V.-A., Nguyen, T.-S., & Nguyen, M.-L. (2016). Learning to Rank Questions for Community Question Answering with Ranking SVM. arXiv:1608.04185. Retrieved from https://arxiv.org/abs/1608.04185
Pascanu, R., Mikolov, T., & Bengio, Y. (2012). On the difficulty of training Recurrent Neural Networks. arXiv:1211.5063 [Cs]. Retrieved from http://arxiv.org/abs/1211.5063
Ravichandran, D., & Hovy, E. (2002). Learning Surface Text Patterns for a Question Answering System. In Proceedings of the 40th Annual Meeting on Association for Computational Linguistics (pp. 41–47). Stroudsburg, PA, USA: Association for Computational Linguistics.
Riloff, E., & Thelen, M. (2000). A Rule-based Question Answering System for Reading Comprehension Tests. In Proceedings of the 2000 ANLP/NAACL Workshop on Reading Comprehension Tests as Evaluation for Computer-based Language Understanding Systems - Volume 6 (pp. 13–19). Stroudsburg, PA, USA: Association for Computational Linguistics.
Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning Internal Representations by Error Propagation. In D. E. Rumelhart, J. L. McClelland, & the PDP Research Group (Eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1 (pp. 318–362). Cambridge, MA, USA: MIT Press. Retrieved from http://dl.acm.org/citation.cfm?id=104279.104293
Sainath, T. N., Mohamed, A.-R., Kingsbury, B., & Ramabhadran, B. (2013). Deep convolutional neural networks for LVCSR. In 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 8614–8618).
Schmidhuber, J. (2015). Deep Learning in Neural Networks: An Overview. Neural Networks, 61, 85–117.
Song, H. A., & Lee, S.-Y. (n.d.). Hierarchical Representation Using NMF (pp. 466–473). Springer Berlin Heidelberg.
SPARQL Query Language for RDF. (n.d.). Retrieved November 19, 2016, from https://www.w3.org/TR/rdf-sparql-query/
Sutskever, I. (2013). Training Recurrent Neural Networks (Doctoral dissertation). University of Toronto. Retrieved from https://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
Sutskever, I., Martens, J., & Hinton, G. E. (2011). Generating Text with Recurrent Neural Networks. In Proceedings of the 28th International Conference on Machine Learning (ICML-11) (pp. 1017–1024). Retrieved from https://www.researchgate.net/publication/221345823_Generating_Text_with_Recurrent_Neural_Networks
Sutskever, I., Vinyals, O., & Le, Q. V. (2014). Sequence to Sequence Learning with Neural Networks. In Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, & K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 27 (pp. 3104–3112). Curran Associates, Inc. Retrieved from http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf
Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., … Rabinovich, A. (2014). Going Deeper with Convolutions. arXiv:1409.4842 [Cs]. Retrieved from http://arxiv.org/abs/1409.4842
Unger, C., Bühmann, L., Lehmann, J., Ngonga Ngomo, A.-C., Gerber, D., & Cimiano, P. (2012). Template-based Question Answering over RDF Data. In Proceedings of the 21st International Conference on World Wide Web (pp. 639–648). New York, NY, USA: ACM.
Wang, B., Liu, K., & Zhao, J. (2016). Inner Attention Based Recurrent Neural Networks for Answer Selection. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Retrieved from http://www.aclweb.org/anthology/P/P16/P16-1122.pdf
Woods, W. A. (1973). Progress in Natural Language Understanding: An Application to Lunar Geology. In Proceedings of the June 4-8, 1973, National Computer Conference and Exposition (pp. 441–450). New York, NY, USA: ACM.
Yih, W., Chang, M.-W., Meek, C., & Pastusiak, A. (2013). Question Answering Using Enhanced Lexical Semantic Models. Microsoft Research. Retrieved from https://www.microsoft.com/en-us/research/publication/question-answering-using-enhanced-lexical-semantic-models/
Yu, L., Hermann, K. M., Blunsom, P., & Pulman, S. (2014). Deep Learning for Answer Sentence Selection. arXiv:1412.1632 [Cs]. Retrieved from http://arxiv.org/abs/1412.1632
Zhang, K., & Zhao, J. (2010). A Chinese question-answering system with question classification and answer clustering. In 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD) (Vol. 6, pp. 2692–2696).