臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Author: 簡朝宗 (Chau-Tzung Chien)
Title: The Study of Plastic Perceptrons Applied in the Classification of Telecom Customers
Advisor: 周義昌 (I-Chang Jou)
Degree: Master's
Institution: National Kaohsiung First University of Science and Technology
Department: Department of Computer and Communication Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year of publication: 2002
Graduation academic year: 90 (2001-02)
Language: Chinese
Pages: 105
Keywords: Automatic cluster detection; Data mining; Decision tree; Self-Organizing Feature Map; Back-propagation network; Perceptron network; Artificial neural network; Plastic perceptron network
Usage statistics:
  • Cited by: 14
  • Views: 467
  • Downloads: 143
  • Bookmarked: 3
The Study of Plastic Perceptrons Applied in the Classification of Telecom Customers

Abstract

Many neural-network models have a natural ability to classify well-defined data into distinct classes, making the results easy for data analysts to interpret.
This thesis proposes using a plastic perceptron network to classify a telecom operator's churned customers. The goal is to identify the high-spending customers who have terminated their subscriptions with Chunghwa Telecom, analyze their consumption characteristics, and use those findings to improve service to the existing high-spending customers. We adopt the plastic perceptron network chiefly for the following properties:
First, the plastic perceptron network decomposes a traditional perceptron network so that each sub-network represents one class. Each sub-network therefore has only one output node, its hidden layer has far fewer nodes than a traditional perceptron network, and all sub-networks are mutually independent and run in parallel. Second, the network is plastic: after all sub-networks of a system have been trained, or even after the system has been deployed for recognition, new sub-networks can be trained as needed and added to the network or substituted for existing ones. A traditional neural network, by contrast, must abandon its painstakingly trained weights and retrain from scratch under the new architecture. Finally, regarding learning speed, the plastic perceptron network selectively updates its weights during training, which accelerates convergence, a property analogous to the selective-update variants of back-propagation networks.
The simulation experiments yield the following classification results:
1. The recognition rate in inside testing is 100%.
2. The recognition rate in outside testing is 99.75%.
3. The classification accuracy of the system is 97.5%.
Based on the analysis of the churned key customers, we recommend that Chunghwa Telecom take the following actions toward its existing key customers:
1. Maintain its position as market leader.
2. Adopt a single tariff.
3. Strengthen promotional marketing of mobile and international calling to major customers.
4. Strengthen marketing of mobile virtual private network (MVPN) communication.
5. Actively develop the data-customer market.
The Study of Plastic Perceptrons Applied in the Classification of Telecom Customers


Student: Chau-Tzung Chien
Advisor: Dr. I-Chang Jou



Department of Computer and Communication Engineering
National Kaohsiung First University of Science and Technology

ABSTRACT

Some neural-network models have a natural propensity for classifying well-defined data into visually distinct classes that data analysts can then easily interpret.
In this thesis, we propose using a plastic perceptron (PP) neural network to classify the churned customers of Chunghwa Telecom (CHT). The purpose is to group the churns into several clusters and analyze the characteristics of the most important cluster, the customers with high consumption and high traffic, so that strategies can be adopted to keep CHT's remaining customers from churning. We use the PP because it has the following features:
First, the PP decomposes a traditional perceptron network into several sub-networks, each with only one output node; the total number of nodes across all sub-networks is smaller than in a traditional perceptron network. Second, each sub-network can be trained independently and in parallel during the training phase; in the retrieval phase each sub-network likewise operates independently, and every input pattern can be processed by all sub-networks in parallel. Third, when a new class is added, an additional sub-network is trained to determine its connection weights and is then joined to the PP; similarly, when a class is removed, only the corresponding sub-network needs to be deleted. A traditional perceptron network, in contrast, would have to discard its trained connection weights and be retrained. Finally, because the PP updates its weights selectively during training, it speeds up the system's convergence, much like the selective-update back-propagation algorithm.
Our simulation results are as follows:
1. The classification rate in inside testing is 100%.
2. The classification rate in outside testing is 99.75%.
3. The overall classification accuracy of the system is 97.5%.
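The inside- and outside-testing rates quoted above correspond to two standard evaluation settings: scoring the trained classifier on its own training patterns (inside testing) versus on patterns held out from training (outside testing). The following is a minimal sketch of that protocol on synthetic toy data; the data, the 80/20 split, and the stand-in `predict` function are illustrative assumptions, not the thesis's actual CHT dataset or model.

```python
import numpy as np

def classification_rate(predict, X, y):
    """Fraction of patterns whose predicted label matches the true label."""
    return float(np.mean([predict(x) == t for x, t in zip(X, y)]))

# Toy data standing in for the customer feature vectors (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(int)

train, test = slice(0, 160), slice(160, 200)   # simple 80/20 holdout split

# Stand-in for a classifier already trained on X[train].
predict = lambda x: int(x[0] > 0)

inside_rate = classification_rate(predict, X[train], y[train])   # inside testing
outside_rate = classification_rate(predict, X[test], y[test])    # outside testing
```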
After analyzing the churn data, we present the following recommendations to CHT:
1. Keep its leadership position in the telecom market.
2. Adopt a single tariff for voice communication services.
3. Market mobile and international voice-phone services preferentially to VIP customers.
4. Market MVPN (Mobile Virtual Private Network) services.
5. Broaden the data-communication services.
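The sub-network decomposition described above can be sketched in code. This is a hedged illustration, not the thesis's implementation: each class gets an independent single-output perceptron (single-layer here for brevity, where the thesis uses small multi-layer sub-networks), weights are updated only on misclassified patterns (the selective-update idea), and classes can be added or removed without retraining the others.

```python
import numpy as np

class SubNetwork:
    """Single-output perceptron responsible for recognizing one class."""
    def __init__(self, n_inputs, lr=0.1):
        self.w = np.zeros(n_inputs)
        self.b = 0.0
        self.lr = lr

    def score(self, x):
        return float(self.w @ x + self.b)

    def train(self, X, targets, epochs=50):
        """One-vs-rest targets: targets[i] is 1 if pattern i belongs to this class."""
        for _ in range(epochs):
            for xi, ti in zip(X, targets):
                out = 1.0 if self.score(xi) > 0 else 0.0
                err = ti - out
                if err != 0.0:                     # selective update: only on error
                    self.w = self.w + self.lr * err * xi
                    self.b += self.lr * err

class PlasticPerceptron:
    """A set of independent sub-networks, one per class."""
    def __init__(self, n_inputs):
        self.n_inputs = n_inputs
        self.subnets = {}

    def add_class(self, label, X, targets):
        """Train a new sub-network and plug it in; the others are untouched."""
        net = SubNetwork(self.n_inputs)
        net.train(X, targets)
        self.subnets[label] = net

    def remove_class(self, label):
        del self.subnets[label]                    # only this sub-network goes away

    def classify(self, x):
        """Each sub-network scores x independently; the highest score wins."""
        return max(self.subnets, key=lambda c: self.subnets[c].score(x))
```

In this sketch, adding a new customer cluster later only requires one extra `add_class` call; the existing sub-networks keep their trained weights, which is the "plasticity" the abstract contrasts with retraining a monolithic network.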
Table of Contents
Chinese Abstract i
English Abstract ii
Acknowledgments iv
Table of Contents v
List of Tables ix
List of Figures x
1. Introduction 1
1.1 Research motivation 1
1.2 Thesis organization 1
2. Classification Techniques 3
2.1 Characteristics of classification 3
2.1.1 General characteristics of classification 3
2.1.2 Current data-classification methods 4
2.2 Automatic cluster detection 4
2.2.1 The K-means algorithm 4
2.2.2 Strengths of automatic cluster detection 6
2.2.3 Weaknesses of automatic cluster detection 6
2.3 Decision trees 8
2.3.1 The decision-tree method 8
2.3.2 Strengths of decision trees 9
2.3.3 Weaknesses of decision trees 10
2.4 Artificial neural networks 12
2.4.1 Definition and principles 12
2.4.2 Mathematical formulation and the layer concept 13
2.4.3 Network operation 15
2.4.4 Learning rules: supervised and unsupervised learning 16
2.4.5 Comparison with statistical empirical models 16
2.4.6 Common transfer functions 17
2.4.7 Strengths of neural networks 19
2.4.8 Weaknesses of neural networks 19
2.4.9 When to apply neural networks 20
3. Common Neural-Network Models 22
3.1 Self-organizing feature map networks 23
3.1.1 Network architecture 23
3.1.2 Network algorithm 27
3.2 Perceptron networks 29
3.2.1 The McCulloch-Pitts (MP) model 29
3.2.2 Perceptron networks 30
3.2.3 Network algorithm 31
3.3 Back-propagation networks 32
3.3.1 Architecture and characteristics 33
3.3.2 Network algorithm 34
3.3.3 Choosing layer parameters and adjusting the learning rate 36
4. Plastic Perceptron Networks 46
4.1 Network architecture 46
4.2 Network algorithm 47
4.3 Inside testing and outside testing 56
5. Telecom Market Analysis, Simulation Environment, and Data 58
5.1 Domestic and international telecom market analysis 58
5.1.1 Market share of new fixed-network carriers abroad (US, UK, Japan) 58
5.1.2 Overview and strategies of Taiwan's new fixed-network carriers 58
5.1.3 Home Pass penetration analysis 61
5.1.4 Chunghwa Telecom's countermeasures 63
5.2 Simulation environment 65
5.2.1 Hardware 65
5.2.2 Software 65
5.3 Data sources and processing flow 65
5.3.1 Data sources 65
5.3.2 Class definitions and training-sample preparation 66
5.3.3 Feature-vector analysis 69
5.3.4 Data-processing flow 70
6. Results and Discussion 73
6.1 Experimental results 73
6.1.1 Training results (plots) 73
6.1.2 Inside recognition tests 73
6.1.3 Outside recognition tests 75
6.1.4 Classification output 75
6.1.5 Consumption analysis of churned key customers 78
6.2 Discussion 80
7. Conclusions and Future Research 85
7.1 Conclusions 85
7.1.1 Maintain market leadership 86
7.1.2 Adopt a single tariff 86
7.1.3 Strengthen promotional marketing of mobile and international calling to major customers 86
7.1.4 Strengthen marketing of mobile group communication (MVPN) 86
7.1.5 Actively develop the data-customer market 87
7.2 Future research directions 87
References 88
Appendix 92