臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Detailed Record

Author: 吳旻誠 (Min-Cheng Wu)
Title: 使用類神經網路增進自動郵件摘要效能之研究 (Improving Email Summarization By Neural Networks)
Advisor: 劉長遠 (Cheng-Yuan Liou)
Committee: 呂育道 (Yuh-Dauh Lyuu), 劉俊緯 (Jiun-Wei Liou)
Oral defense date: 2015-06-24
Degree: Master's
Institution: National Taiwan University (國立臺灣大學)
Department: Graduate Institute of Computer Science and Information Engineering (資訊工程學研究所)
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year published: 2015
Graduating academic year: 103 (ROC calendar)
Language: English
Pages: 41
Keywords (Chinese): 自動郵件摘要, 類神經網路, 遞迴式類神經網路, 機器學習, 自然語言處理
Keywords (English): Email Summarization, Neural Network, Recurrent Neural Network, Machine Learning, Natural Language Processing
Times cited: 0
Views: 135
Downloads: 0
Bookmarked: 0
Abstract (Chinese, translated):
This thesis proposes two models for automatic email summarization. The first extends an existing machine-learning-based email summarization model: we use neural networks to improve its feature extraction and propose new email summarization features. In experiments that use the same support vector regression model for the regression step, this model effectively improves on the performance of the existing model.
We also propose a model that performs email summarization entirely with neural networks. It builds on sentence features extracted automatically by a neural network and uses a recurrent neural network to imitate the actions a human takes when selecting summary sentences. Although this model performed worse in our experiments, it requires no human-defined features and carries out automatic email summarization entirely with neural networks.
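To make the first model's pipeline concrete, a minimal sketch is given below. It is an illustration under assumptions, not the thesis's actual setup: gensim's Doc2Vec stands in for the paragraph-vector sentence features and scikit-learn's SVR for the regression step, and the corpus, gold scores, and the single hand-crafted feature are invented placeholders.

```python
# Minimal sketch of the first model: neural sentence features plus
# hand-crafted features, scored with support vector regression.
# Corpus, gold scores, and the length feature are placeholders.
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from sklearn.svm import SVR
import numpy as np

sentences = [["please", "review", "the", "attached", "draft"],
             ["thanks", "for", "the", "quick", "reply"]]
gold_scores = [0.8, 0.1]  # stand-in annotator importance scores

# 1. Learn a fixed-length paragraph vector for each sentence.
docs = [TaggedDocument(words, [i]) for i, words in enumerate(sentences)]
d2v = Doc2Vec(docs, vector_size=50, min_count=1, epochs=40)

# 2. Concatenate the neural vector with hand-crafted linguistic
#    features (title similarity, cue phrases, ...); one dummy here.
def features(i, words):
    handcrafted = [len(words) / 10.0]  # placeholder linguistic feature
    return np.concatenate([d2v.dv[i], handcrafted])

X = np.array([features(i, w) for i, w in enumerate(sentences)])

# 3. Regress sentence importance; the top-scored sentences would
#    then be extracted to form the summary.
svr = SVR(kernel="rbf").fit(X, gold_scores)
print(svr.predict(X))
```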

Abstract (English):
In this thesis, we use recent neural network sentence representation techniques and propose new features for email summarization to improve performance on the email summarization task. We also propose a new neural network summarization model that imitates the procedure a human follows when summarizing a document, and we compare the results of these models.
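As a rough illustration of the second, fully neural model, the sketch below runs an Elman-style recurrent network over a sequence of sentence vectors and emits an inclusion score for each sentence in reading order, imitating sequential summary selection. The weights are untrained random placeholders; the thesis's actual architecture and training procedure are not reproduced here.

```python
# Rough illustration of the second model: an Elman-style recurrent
# network reads sentence vectors in order and emits an inclusion
# score per sentence. Weights are untrained random placeholders.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 50, 32  # sentence-vector size, hidden-state size

W_xh = rng.normal(0, 0.1, (d_hid, d_in))   # input -> hidden
W_hh = rng.normal(0, 0.1, (d_hid, d_hid))  # hidden -> hidden recurrence
w_out = rng.normal(0, 0.1, d_hid)          # hidden -> score

def score_sentences(sentence_vectors):
    """Return one inclusion score per sentence, in reading order."""
    h = np.zeros(d_hid)
    scores = []
    for x in sentence_vectors:
        h = np.tanh(W_xh @ x + W_hh @ h)                   # update thread context
        scores.append(1.0 / (1.0 + np.exp(-(w_out @ h))))  # sigmoid score
    return scores

email = rng.normal(size=(4, d_in))  # four stand-in sentence vectors
print(score_sentences(email))
```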

Table of Contents:
Thesis Committee Certification (口試委員審定書)
Abstract (Chinese)
Abstract (English)
Contents
List of Figures
List of Tables
1 Introduction
  1.1 Motivation
  1.2 Email Summarization
  1.3 Neural Network Sentence Feature Extraction
  1.4 Evaluation Metrics for the Email Summarization Task
    1.4.1 ROUGE
    1.4.2 Weighted Recall
2 Improving Regression-Based Email Summarization by Neural Networks
  2.1 Paragraph Vector
    2.1.1 Hierarchical Softmax
    2.1.2 Negative Sampling
    2.1.3 Continuous Bag-of-Words Model (CBOW)
    2.1.4 Continuous Skip-Gram Model
  2.2 Linguistic Features
    2.2.1 Follow-Quote Feature
    2.2.2 Cue Phrases
    2.2.3 Title Similarity
    2.2.4 Topic Similarity
    2.2.5 Similarity to Neighboring Sentences
  2.3 Modified Clue Word Summarization
    2.3.1 Constructing the Fragment Quotation Graph
    2.3.2 Constructing the Sentence Graph
  2.4 Question Description Feature
    2.4.1 Question Detection
    2.4.2 Question-Answer Feature
    2.4.3 Sentiment Feature
  2.5 Regression Model
  2.6 Recurrent Neural Network Summarization Model
3 Experiments
  3.1 The BC3 Corpus
  3.2 Experimental Results
4 Conclusion
5 Summary of the Three Models
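Section 1.4.1 above names ROUGE as the thesis's evaluation metric. For orientation only, a stripped-down ROUGE-1 recall can be computed as in the sketch below; it omits stemming, stopword handling, and multiple references, and the example sentences are invented.

```python
# Minimal ROUGE-1 recall: unigram overlap between a candidate summary
# and a reference, divided by the reference's unigram count.
# Simplified: no stemming, single reference, whitespace tokenization.
from collections import Counter

def rouge1_recall(candidate: str, reference: str) -> float:
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum(min(cand[w], ref[w]) for w in ref)
    return overlap / sum(ref.values())

print(rouge1_recall("the meeting moved to friday",
                    "meeting was moved to friday afternoon"))  # 4/6
```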

