    Please use this identifier to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/103467


    Title: Speech Enhancement Based on Sparse Theory under Noisy Environment
    Authors: Hsieh, Ching-Tang;Chen, Yan-heng;Chen, Ting-Wen;Chen, Li-Ming
    Contributors: Department of Electrical Engineering (電機工程學系暨研究所)
    Keywords: Speech enhancement, sparse representations, K-SVD, discrete cosine transform (DCT), orthogonal matching pursuit (OMP)
    Date: 2015-07-18
    Issue Date: 2015-07-27 14:05:46 (UTC+8)
    Publisher: Academy of Taiwan Information Systems Research (ATISR)
    Abstract: Recently, sparse algorithms for speech enhancement have attracted growing attention. In this paper, we divide the sparse-theory-based speech enhancement process into two parts: dictionary training and signal reconstruction. We focus on white Gaussian noise. A clean-speech dictionary D is trained with the K-SVD algorithm, and the orthogonal matching pursuit (OMP) algorithm is used to obtain the sparse coefficients X over the clean-speech dictionary D. Denoising experiments show that the proposed method is superior to other methods in terms of SNR, LLR, SNRseg, and PESQ.
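    The reconstruction step the abstract describes (OMP selecting sparse coefficients X over a dictionary D) can be sketched with a minimal NumPy implementation. This is an illustrative sketch, not the authors' code: the dictionary below is random rather than K-SVD-trained, and all names (`omp`, `D`, `x_true`) are assumptions for the demo.

    ```python
    import numpy as np

    def omp(D, y, k):
        """Orthogonal Matching Pursuit: greedily pick up to k atoms
        (unit-norm columns of D) to approximate the signal y."""
        residual = y.copy()
        support = []
        x = np.zeros(D.shape[1])
        for _ in range(k):
            # atom most correlated with the current residual
            idx = int(np.argmax(np.abs(D.T @ residual)))
            if idx not in support:
                support.append(idx)
            # least-squares refit on all selected atoms
            coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
            residual = y - D[:, support] @ coef
        x[support] = coef
        return x

    # Toy demo: recover a 2-sparse coefficient vector over a random dictionary
    # (a K-SVD-trained dictionary would replace D in the actual pipeline).
    rng = np.random.default_rng(0)
    D = rng.standard_normal((64, 128))
    D /= np.linalg.norm(D, axis=0)   # unit-norm atoms, as K-SVD assumes
    x_true = np.zeros(128)
    x_true[[5, 40]] = [1.0, -0.7]
    y = D @ x_true                   # noiseless observation for the demo
    x_hat = omp(D, y, k=2)
    ```

    In the paper's setting, `y` would be a frame of noisy speech and the enhanced frame would be `D @ x_hat`, with the sparsity level (or a residual-norm stopping rule) acting as the denoising knob.
    
    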
    Relation: International Conference on Internet Studies (NETs 2015)
    Appears in Collections: [Department of Electrical Engineering (電機工程學系暨研究所)] Conference Papers

    Files in This Item:

    File: NETs2015_127_SPEECH_ENHANCEMENT.pdf
    Description: Conference paper
    Size: 1012 KB
    Format: Adobe PDF

    All items in the institutional repository (機構典藏) are protected by copyright, with all rights reserved.

