Tamkang University Institutional Repository: Item 987654321/95944
    Please use this identifier to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/95944


    Title: Speech Classification Based on Fuzzy Adaptive Resonance Theory
    Authors: Hsieh, Ching-Tang;Hsu, Chih-Hsu
    Contributors: Department of Electrical Engineering, Tamkang University
    Keywords: Speech classification;Neuro-fuzzy system;Fuzzy ART
    Date: 2006-10
    Issue Date: 2014-02-13 11:24:45 (UTC+8)
    Abstract: This paper presents a neuro-fuzzy system for speech classification. We propose a multi-resolution feature extraction technique to handle adaptive frame sizes, and we use fuzzy adaptive resonance theory (FART) to cluster each frame. FART is an extension of ART that clusters its inputs via unsupervised learning; ART describes a family of self-organizing neural networks capable of clustering arbitrary sequences of input patterns into stable recognition codes. In our experiments, the TIMIT database is used and features are extracted for each phoneme. The speech classification rate is 88.66%, demonstrating the effectiveness of the proposed system.
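    The abstract describes Fuzzy ART clustering only at a high level. For readers unfamiliar with the technique, the sketch below implements the standard Fuzzy ART operations (complement coding, the choice function, the vigilance test, and the fast-learning update) following Carpenter, Grossberg and Rosen's 1991 formulation. It is a generic illustration, not the authors' implementation, and the parameter values (alpha, beta, rho) are illustrative assumptions, not values taken from the paper.

        import numpy as np

        class FuzzyART:
            """Minimal Fuzzy ART clusterer for inputs scaled into [0, 1]."""

            def __init__(self, alpha=0.001, beta=1.0, rho=0.75):
                self.alpha = alpha   # choice parameter
                self.beta = beta     # learning rate (1.0 = fast learning)
                self.rho = rho       # vigilance threshold
                self.w = []          # category weight vectors

            def _complement_code(self, x):
                # Complement coding: I = [x, 1 - x], so |I| is constant.
                return np.concatenate([x, 1.0 - x])

            def train_pattern(self, x):
                i = self._complement_code(np.asarray(x, dtype=float))
                if not self.w:
                    self.w.append(i.copy())          # commit first category
                    return 0
                # Choice function T_j = |I ^ w_j| / (alpha + |w_j|),
                # where ^ is the fuzzy AND (element-wise minimum).
                matches = [np.minimum(i, wj) for wj in self.w]
                T = [m.sum() / (self.alpha + wj.sum())
                     for m, wj in zip(matches, self.w)]
                for j in np.argsort(T)[::-1]:        # search best-first
                    # Vigilance test: |I ^ w_j| / |I| >= rho means resonance.
                    if matches[j].sum() / i.sum() >= self.rho:
                        # Learning: w_j <- beta*(I ^ w_j) + (1 - beta)*w_j
                        self.w[j] = (self.beta * matches[j]
                                     + (1.0 - self.beta) * self.w[j])
                        return j
                self.w.append(i.copy())              # no resonance: new category
                return len(self.w) - 1

        # Toy usage: cluster 200 random 13-dimensional feature vectors.
        # Real speech frames (e.g., MFCCs) would first be min-max
        # normalized into [0, 1] before complement coding.
        rng = np.random.default_rng(0)
        art = FuzzyART(rho=0.8)
        labels = [art.train_pattern(frame) for frame in rng.random((200, 13))]
        print(len(art.w), "categories formed")

    The vigilance parameter rho controls cluster granularity: higher values force tighter matches and therefore more categories, which is the knob a frame-level phoneme clustering system of this kind would tune.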
    Relation: Proceedings of the 9th Joint Conference on Information Sciences, 4 pages
    Appears in Collections:[Graduate Institute & Department of Electrical Engineering] Proceeding

    Files in This Item:

    File: Speech Classification Based on Fuzzy Adaptive Resonance Theory_英文摘要.docx
    Size: 21Kb
    Format: Microsoft Word

    All items in the Tamkang University Institutional Repository are protected by copyright, with all rights reserved.

