Please use this permanent URL to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/90597

    Title: Chinese text classification by the Naïve Bayes Classifier and the associative classifier with multiple confidence threshold values
    Authors: Lu, Shing-Hwa;Chiang, Ding-An;Keh, Huan-Chao;Huang, Hui-Hua
    Contributors: Department of Computer Science and Information Engineering, Tamkang University
    Keywords: Association classification;Text classification;Text mining;Text categorization
    Date: 2010-08
    Uploaded: 2013-07-03 09:53:33 (UTC+8)
    Publisher: Amsterdam: Elsevier BV
    Abstract: Each type of classifier has its own advantages as well as certain shortcomings. In this paper, we leverage the strengths of the associative classifier and the Naïve Bayes Classifier to compensate for each other's shortcomings, thereby improving the accuracy of text classification. We first classify the training cases with the Naïve Bayes Classifier, and then set a different confidence threshold for the class association rules (CARs) of each class, based on the classification accuracy that the Naïve Bayes Classifier achieves on that class. Since the accuracy of every selected CAR of a class is higher than that obtained by the Naïve Bayes Classifier, the classification result can be further improved through these selected CARs. Cases left unclassified by the CARs are then classified with the Naïve Bayes Classifier. The experimental results show that combining the advantages of these two different classifiers yields better classification results than either classifier alone.
    Relation: Knowledge-Based Systems 23(6), pp. 598–604
    DOI: 10.1016/j.knosys.2010.04.004
    Appears in Collections: [Department of Computer Science and Information Engineering] Journal Articles
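The thresholding scheme described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `NaiveBayes`, `per_class_thresholds`, and `classify` names, the tokenized-document input format, and the rule representation `(antecedent set, class, confidence)` are all assumptions made for the sketch. The per-class accuracy of the Naïve Bayes Classifier on the training cases becomes the confidence threshold a CAR must exceed before it is allowed to classify a case; anything the rules leave unclassified falls back to Naïve Bayes.

```python
from collections import Counter
import math

class NaiveBayes:
    """Minimal multinomial Naive Bayes over token lists (Laplace smoothing)."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = Counter(labels)          # class -> document count
        self.word_counts = {c: Counter() for c in self.classes}
        for doc, y in zip(docs, labels):
            self.word_counts[y].update(doc)
        self.totals = {c: sum(self.word_counts[c].values()) for c in self.classes}
        self.vocab = {w for c in self.classes for w in self.word_counts[c]}
        self.n = len(labels)
        return self

    def predict(self, doc):
        best, best_lp = None, float("-inf")
        for c in self.classes:
            lp = math.log(self.prior[c] / self.n)
            for w in doc:
                # Laplace-smoothed per-class word likelihood
                lp += math.log((self.word_counts[c][w] + 1)
                               / (self.totals[c] + len(self.vocab)))
            if lp > best_lp:
                best, best_lp = c, lp
        return best

def per_class_thresholds(nb, docs, labels):
    """NB accuracy per class -> confidence threshold a CAR of that class must meet."""
    correct, total = Counter(), Counter()
    for doc, y in zip(docs, labels):
        total[y] += 1
        if nb.predict(doc) == y:
            correct[y] += 1
    return {c: correct[c] / total[c] for c in total}

def classify(doc, cars, thresholds, nb):
    """Apply CARs whose confidence beats the NB accuracy of their class;
    cases no rule covers fall back to the Naive Bayes Classifier."""
    tokens = set(doc)
    for antecedent, cls, confidence in cars:
        if confidence >= thresholds.get(cls, 1.0) and antecedent <= tokens:
            return cls
    return nb.predict(doc)
```

The key design point the abstract relies on is the threshold choice: a rule is only trusted when its confidence exceeds what Naïve Bayes already achieves on that class, so the rules can only raise, never lower, the expected per-class accuracy.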




