RC Version 7.0 © Powered By DSPACE, MIT. Enhanced by NTU Library & TKU Library IR team.
    Please use the permanent URL to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/110050

    Title: Non-Visual Document Recognition for Blind Reading Assistant System
    Authors: Hsieh, Ching-Tang;Yeh, Cheng-Hsiang;Liu, Tsun-Te;Huang, Ku-Chen
    Keywords: Assistive Device;Braille;Document Recognition
    Date: 2013-06-18
    Uploaded: 2017-03-22 02:10:42 (UTC+8)
    Publisher: AICIT, Korea(Rep. of)
    Abstract: As knowledge accumulates, e-books have been adopted for paper reduction and environmental protection, yet many historical documents exist only on paper. Research on blind reading aid devices has gradually become a popular topic, but existing methods have drawbacks. For example, many older historical documents cannot be read because these methods are limited to electronic document formats, and users cannot read an arbitrarily specified region of a document as they wish. A novel blind reading aid device is proposed that requires no e-document: the user only needs to point at the document with a finger. The system is composed of three parts. First, a rectangle detection method locates the document region under the Microsoft Kinect. Next, dilation and projection profile methods are used to extract the text and construct a coordinate database. Finally, when the user wishes to read the document, skin detection, the BEA method, and the depth image of the Microsoft Kinect obtain the coordinates of the user's finger, which are matched against the constructed coordinate database to retrieve the corresponding character. The system output is then produced via text-to-speech.
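    The abstract's text-extraction step relies on projection profile methods to locate text regions in the binarized document image. A minimal sketch of horizontal projection-profile line segmentation is shown below; this is an illustrative reconstruction under the assumption of a pre-binarized image (1 = text pixel), and the function names are hypothetical, not taken from the paper:

    ```python
    import numpy as np

    def horizontal_projection_profile(binary_img):
        # Sum foreground pixels along each row of the binarized image.
        return binary_img.sum(axis=1)

    def segment_lines(binary_img):
        # Return (start, end) row ranges of text lines: a line is a maximal
        # run of consecutive rows whose projection profile is non-zero.
        profile = horizontal_projection_profile(binary_img)
        rows_with_text = profile > 0
        lines, start = [], None
        for i, has_text in enumerate(rows_with_text):
            if has_text and start is None:
                start = i                      # text line begins
            elif not has_text and start is not None:
                lines.append((start, i))       # text line ends
                start = None
        if start is not None:                  # line runs to the last row
            lines.append((start, len(rows_with_text)))
        return lines
    ```

    Applying the same idea column-wise within each detected line would yield per-character bounding boxes, which could then populate the coordinate database the abstract describes.
    
    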
    Relation: Research Notes in Information Science (RNIS), pp.453-458
    Appears in Collections: [體育事務處] 會議論文





    DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard / Enhanced by NTU Library & TKU Library IR teams.