    Please use this identifier to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/87467

    Title: 吉尼係數離散化演算法
    Other Titles: A discretization algorithm based on class-attribute Gini index
    Authors: 黃鈺傑;Huang, Yu-Chieh
Contributors: Master's Program, Department of Mathematics, Tamkang University
Keywords: Classification; Decision tree; Discretization; Gini index
    Date: 2012
    Issue Date: 2013-04-13 11:10:31 (UTC+8)
Abstract: With the arrival of the information age and the rapid growth of the Internet, the volume of data is increasing at a remarkable rate, so current research focuses on how to efficiently extract hidden, valuable information from large amounts of data. Many machine learning algorithms can handle only discrete numerical or nominal attributes, yet continuous attributes are among the most common forms of data. A discretization algorithm partitions a continuous attribute's values into a finite number of intervals, reducing the complexity of the data; this not only makes the distribution and characteristics of the data easier to understand, but also removes the restriction that such algorithms cannot readily process continuous attributes. This thesis proposes the CAGI (Class-Attribute Gini Index) discretization algorithm and compares it with the CAIM (Class-Attribute Interdependence Maximization) and CACC (Class-Attribute Contingency Coefficient) discretization algorithms. Experiments show that on some datasets the CAGI algorithm not only discretizes continuous attributes more accurately but also improves the predictive accuracy of the resulting classifiers.
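The abstract describes class-attribute discretization: partitioning a continuous attribute's values into intervals so that the class labels inside each interval are as homogeneous as possible. The thesis's exact CAGI procedure is not reproduced on this record page, so the sketch below is only a generic greedy, top-down discretizer driven by Gini impurity; the function names, the midpoint candidate cuts, and the stopping rule are illustrative assumptions, not the author's algorithm.

```python
def gini(labels):
    """Gini impurity of a collection of class labels: 1 - sum(p_c^2)."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return 1.0 - sum((k / n) ** 2 for k in counts.values())

def weighted_gini(values, labels, cuts):
    """Size-weighted average Gini impurity over the intervals induced by `cuts`."""
    n = len(values)
    boundaries = [float("-inf")] + sorted(cuts) + [float("inf")]
    total = 0.0
    for lo, hi in zip(boundaries, boundaries[1:]):
        bucket = [c for v, c in zip(values, labels) if lo < v <= hi]
        total += len(bucket) / n * gini(bucket)
    return total

def discretize(values, labels, max_intervals=4):
    """Greedily add the midpoint cut that most reduces weighted Gini impurity,
    stopping when no cut improves it or `max_intervals` is reached (assumed rule)."""
    pairs = sorted(zip(values, labels))
    vs = [v for v, _ in pairs]
    # Candidate cut points: midpoints between consecutive distinct values.
    candidates = sorted({(a + b) / 2 for a, b in zip(vs, vs[1:]) if a != b})
    cuts = []
    while len(cuts) + 1 < max_intervals and candidates:
        best = min(candidates, key=lambda c: weighted_gini(values, labels, cuts + [c]))
        if weighted_gini(values, labels, cuts + [best]) >= weighted_gini(values, labels, cuts):
            break  # no candidate reduces impurity further
        cuts.append(best)
        candidates.remove(best)
    return sorted(cuts)

# Two well-separated classes: a single cut between 3 and 10 suffices.
print(discretize([1, 2, 3, 10, 11, 12], ["a", "a", "a", "b", "b", "b"]))  # → [6.5]
```

The actual CAGI, CAIM, and CACC algorithms each use their own class-attribute criterion and stopping condition over a quanta (interval-by-class contingency) matrix; this sketch only conveys the shared idea of scoring candidate interval schemes against the class distribution.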
Appears in Collections: [Department of Mathematics & Graduate Institute] Theses
