    Please use this identifier to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/92287


    Title: SIGNER-INDEPENDENT SIGN LANGUAGE RECOGNITION BASED ON HMMs AND DEPTH INFORMATION
    Authors: 謝景棠;林芃翔;陳力銘;王蕙君;謝杰甫
    Contributors: Department of Electrical Engineering, Tamkang University
    Keywords: Hidden Markov Models;Sign Language Recognition;Depth information
    Date: 2013-08
    Issue Date: 2013-09-25 19:28:09 (UTC+8)
    Publisher: Yilan: National Ilan University
    Abstract: In this paper, we use depth information to effectively locate the 3D position of the hands in a sign language recognition system. However, the absolute positions vary from signer to signer, which degrades recognition. To address this, we use the incremental change of the three-dimensional coordinates per unit time as the feature set. We then use hidden Markov models (HMMs) as a time-varying classifier to recognize the motion of sign language in the time domain, and we incorporate a scaling factor into the HMMs to avoid their numerical underflow. Experiments verify that the proposed method is superior to the traditional one.
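The two numerical ideas in the abstract, motion-delta features and a scaled HMM forward pass, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names, the toy model, and the use of NumPy are all assumptions; the scaling scheme is the standard per-frame normalization of the forward variables, whose accumulated log normalizers give the log-likelihood without underflow.

```python
import numpy as np

def delta_features(traj):
    """Frame-to-frame change of 3D hand coordinates: (T, 3) -> (T-1, 3).

    Using motion deltas per unit time instead of absolute positions makes
    the features less dependent on where a particular signer's hands sit
    in space, which is the signer-independence idea from the abstract.
    """
    return np.diff(traj, axis=0)

def scaled_forward(A, B, pi, obs):
    """HMM forward algorithm with per-step scaling to prevent underflow.

    A   : (N, N) state-transition matrix
    B   : (N, M) emission-probability matrix
    pi  : (N,)   initial state distribution
    obs : (T,)   observation index sequence
    Returns log P(obs | model) as a float.
    """
    log_likelihood = 0.0

    # Initialization, then rescale so the forward vector sums to 1.
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    alpha = alpha / c
    log_likelihood += np.log(c)

    # Induction: propagate, emit, and rescale at every time step.
    # Without the rescaling, alpha shrinks geometrically and underflows
    # for long observation sequences.
    for t in range(1, len(obs)):
        alpha = (alpha @ A) * B[:, obs[t]]
        c = alpha.sum()
        alpha = alpha / c
        log_likelihood += np.log(c)

    return log_likelihood
```

In a recognizer of this kind, one HMM per sign vocabulary word would score the delta-feature sequence, and the model with the highest log-likelihood wins; working in log space is what makes that comparison stable for long sequences.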
    Relation: The 26th Conference on Computer Vision, Graphics, and Image Processing, held jointly with the National Science Council Intelligent Computing Program achievement presentation and new-researcher workshop, 5p.
    Appears in Collections: [Department and Graduate Institute of Electrical Engineering] Conference Papers

    Files in This Item:

    File: F1-3 SIGNER-INDEPENDENT SIGN LANGUAGE RECOGNITION BASED ON HMMs AND DEPTH INFORMATION.pdf
    Description: Full text of the paper
    Size: 532 KB
    Format: Adobe PDF
    Views: 513

    All items in this institutional repository are protected by copyright, with all rights reserved.

