

    Please use this permanent URL to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/70551


    Title: Robust Multiple Targets Tracking Using Object Segmentation and Trajectory Estimation in Video
    Authors: Hsiao, Ying-Tung;Chuang, Cheng-Long;Jiang, Joe-Air;Chien, Cheng-Chih
    Contributors: 淡江大學電機工程學系
    Date: 2005-10-10
    Uploaded: 2011-10-23 21:26:01 (UTC+8)
    Publisher: N.Y.: IEEE (Institute of Electrical and Electronics Engineers)
    Abstract: In this paper, a novel robust unsupervised video object tracking algorithm is proposed. The proposed algorithm combines several techniques (mathematical morphology, region growing, region merging, and trajectory estimation) to track several predetermined video objects simultaneously. A modified mathematical morphological edge detector is employed to sketch the contours of each video frame, and an edge-based object segmentation algorithm is applied to the contours to partition the predetermined objects; moreover, based on the motion of the objects, the proposed algorithm can estimate and partition the objects in subsequent video frames automatically. The proposed algorithm is also robust against mobile cameras. The experimental results show that the proposed algorithm can precisely partition and track multiple video objects.
    Relation: 2005 IEEE International Conference on Systems, Man and Cybernetics, Hawaii, USA, pp. 1080-1085
    DOI: 10.1109/ICSMC.2005.1571289
    Appears in Collections: [電機工程學系暨研究所] 會議論文 (Conference Papers)
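
    The abstract's first step, a morphological edge detector applied to each frame, can be illustrated with a minimal sketch. The paper's modified operator is not specified in the abstract, so the code below assumes the classic morphological gradient (dilation minus erosion over a k x k structuring element) as a stand-in, applied to a small synthetic frame:

    ```python
    import numpy as np

    def morphological_gradient(img, k=3):
        """Morphological edge detector: per-pixel max (dilation) minus
        per-pixel min (erosion) over a k x k neighborhood. A simplified
        stand-in for the paper's modified morphological edge detector."""
        pad = k // 2
        padded = np.pad(img, pad, mode="edge")
        h, w = img.shape
        # Stack all k*k shifted views; max over the stack is the dilation,
        # min is the erosion.
        windows = np.stack([padded[i:i + h, j:j + w]
                            for i in range(k) for j in range(k)])
        return windows.max(axis=0) - windows.min(axis=0)

    # Tiny synthetic frame: a bright square object on a dark background.
    frame = np.zeros((8, 8), dtype=np.uint8)
    frame[2:6, 2:6] = 255
    edges = morphological_gradient(frame)
    # The gradient is nonzero only along the square's contour; object
    # interior and far background stay zero.
    ```

    On a real video frame the resulting edge map would then feed the edge-based segmentation stage described in the abstract.
    
    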

    Files in This Item:

    File                                                                                               Size   Format     Views
    index.html                                                                                         0Kb    HTML       201
    Robust Multiple Targets Tracking Using Object Segmentation and Trajectory Estimation in Video.pdf  727Kb  Adobe PDF  201

    All items in this institutional repository are protected by copyright, with all rights reserved.

