
    Please use this permanent URL to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/81419

    Title: A Kinect-Based People-flow Counting System
    Authors: Hsieh, Ching-Tang; Wang, Hui-Chun; Wu, Yeh-Kuang; Chang, Liung-Chun; Kuo, Tai-Ku
    Contributors: 淡江大學電機工程學系
    Keywords: Kinect; 8-Connected Components; People-flow counting; ROI
    Date: 2012-11
    Uploaded: 2013-03-11 20:34:29 (UTC+8)
    Abstract: This paper proposes a bi-directional people-flow counting system based on Kinect; it can also be extended to multi-flow scenarios to meet the demands of practical applications. First, the Kinect is mounted above a doorway to capture the pedestrian flow. The system then detects people in the covered area using the depth image from the Kinect. Morphological processing, such as erosion, is applied to the detected objects, and the region of interest (ROI) is located with a mapping-based detection approach. After these steps, the system sets a counting line and counts people as they pass through it. In the multi-flow case, occlusion arises, so the depth information is used to distinguish occluded targets. Finally, the experimental results are compared with manual counts and with other research. Under normal circumstances, the system achieves nearly 100% accuracy for bi-directional counting while also meeting real-time requirements.
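    The core of the pipeline described in the abstract — grouping foreground pixels into people with 8-connected component labeling, then counting direction of travel as each person's centroid crosses a counting line — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a binary foreground mask has already been extracted from the Kinect depth image, and all function names are hypothetical.

    ```python
    from collections import deque

    def label_components(mask):
        """Label 8-connected components in a binary mask (list of 0/1 rows)
        via breadth-first search; returns a list of pixel lists, one per blob."""
        h, w = len(mask), len(mask[0])
        seen = [[False] * w for _ in range(h)]
        components = []
        for y in range(h):
            for x in range(w):
                if mask[y][x] and not seen[y][x]:
                    # Flood-fill one blob, including diagonal neighbors (8-connectivity).
                    queue = deque([(y, x)])
                    seen[y][x] = True
                    pixels = []
                    while queue:
                        cy, cx = queue.popleft()
                        pixels.append((cy, cx))
                        for dy in (-1, 0, 1):
                            for dx in (-1, 0, 1):
                                ny, nx = cy + dy, cx + dx
                                if 0 <= ny < h and 0 <= nx < w \
                                        and mask[ny][nx] and not seen[ny][nx]:
                                    seen[ny][nx] = True
                                    queue.append((ny, nx))
                    components.append(pixels)
        return components

    def centroid(pixels):
        """Mean (row, col) position of a blob, used to track one person."""
        n = len(pixels)
        return (sum(p[0] for p in pixels) / n, sum(p[1] for p in pixels) / n)

    def count_crossings(track, line_y):
        """Count directional crossings of a horizontal counting line by a
        centroid trajectory (list of (row, col) positions over frames)."""
        up = down = 0
        for (y0, _), (y1, _) in zip(track, track[1:]):
            if y0 < line_y <= y1:
                down += 1      # moved past the line in the +y direction
            elif y1 < line_y <= y0:
                up += 1        # moved past the line in the -y direction
        return up, down
    ```

    In practice the binary mask would come from thresholding the depth image (the paper also applies morphological erosion first), and matching blobs between frames to build each trajectory would use the depth values to separate occluded targets.
    
    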
    Relation: IEEE International Symposium on Intelligent Signal Processing and Communication System, pp. 146-150
    Appears in Collections: [電機工程學系暨研究所] Conference Papers


    File          Description  Size   Format     Views
    06473470.pdf  Paper text   385KB  Adobe PDF  409    View/Open


