    Please use this identifier to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/103469

    Authors: Hsieh, Ching-Tang; Liou, Hsing-Che; Chen, Li-Ming; Chen, Ting-Wen
    Contributors: Graduate Institute & Department of Electrical Engineering (電機工程學系暨研究所)
    Keywords: Taiwan Sign Language (TSL), Sparse Coding, Occlusion
    Date: 2015-07-18
    Issue Date: 2015-07-27 14:12:05 (UTC+8)
    Publisher: Academy of Taiwan Information Systems Research (ATISR)
    Abstract: Sign language plays an important role in communication for the
    hearing impaired. However, sign languages differ across countries and regions,
    and automatic recognition has become an active research topic in recent years.
    In this paper, we devise a novel method for occlusion processing in a Taiwan
    Sign Language recognition system. Our method employs an ADXL345 accelerometer
    and a Kinect sensor to extract features from the signer. The features are then
    regularized using a sparse-coding dictionary, and finally an HMM recognizes
    the signs from the features corrected by our method. In the experimental
    results, we present the data we employed, then describe the closed-test
    results and future work.
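    The occlusion-correction step the abstract describes — regularizing a partially
    occluded feature vector against a sparse-coding dictionary before it reaches the
    HMM — can be sketched roughly as follows. This is a minimal illustration, not the
    paper's actual implementation: the dictionary `D`, the observed/occluded split,
    and the sparsity level `k` are all assumed for the demo, and orthogonal matching
    pursuit stands in for whatever sparse solver the authors used.

    ```python
    import numpy as np

    def omp(D, x, k):
        """Orthogonal matching pursuit: find a k-sparse code a with D @ a ~ x.
        D: (m, n) dictionary with unit-norm columns (atoms); x: (m,) signal."""
        residual = x.astype(float).copy()
        support = []
        a = np.zeros(D.shape[1])
        for _ in range(k):
            # pick the atom most correlated with the current residual
            j = int(np.argmax(np.abs(D.T @ residual)))
            if j not in support:
                support.append(j)
            # least-squares refit over all selected atoms
            coefs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
            a[:] = 0.0
            a[support] = coefs
            residual = x - D @ a
        return a

    def correct_occlusion(D, x, observed, k=2):
        """Sparse-code x using only the observed dimensions, then
        reconstruct the full (occlusion-corrected) feature vector."""
        a = omp(D[observed], x[observed], k)
        return D @ a, a

    # toy demo: a feature vector built from 2 atoms, one dimension occluded
    rng = np.random.default_rng(0)
    D = rng.standard_normal((8, 20))
    D /= np.linalg.norm(D, axis=0)     # normalize atoms to unit norm
    x_clean = 0.8 * D[:, 3] + 0.5 * D[:, 11]
    observed = np.ones(8, dtype=bool)
    observed[2] = False                # dimension 2 is occluded (unreliable)
    x_hat, a = correct_occlusion(D, x_clean, observed, k=2)
    ```

    The corrected vector `x_hat` (rather than the raw, occluded measurement) would
    then be passed to the HMM recognition stage described in the abstract.
    
    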
    Relation: International Conference on Internet Studies (NETs 2015)
    Appears in Collections:[Graduate Institute & Department of Electrical Engineering] Proceeding

    Files in This Item:

    File: NETs2015_129_A_NOVEL.pdf — conference paper (會議文獻), 433 KB, Adobe PDF

    All items in the institutional repository (機構典藏) are protected by copyright, with all rights reserved.
