Tamkang University Institutional Repository: Item 987654321/70551
    Please use this permanent URL to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/70551


    Title: Robust Multiple Targets Tracking Using Object Segmentation and Trajectory Estimation in Video
    Authors: Hsiao, Ying-Tung;Chuang, Cheng-Long;Jiang, Joe-Air;Chien, Cheng-Chih
    Contributors: Department of Electrical Engineering, Tamkang University
    Date: 2005-10-10
    Uploaded: 2011-10-23 21:26:01 (UTC+8)
    Publisher: N.Y.: IEEE (Institute of Electrical and Electronics Engineers)
    Abstract: In this paper, a novel robust unsupervised video object tracking algorithm is proposed. The proposed algorithm combines several techniques, namely mathematical morphology, region growing, region merging, and trajectory estimation, to track several predetermined video objects simultaneously. A modified mathematical morphological edge detector is employed to sketch the contours of each video frame, and an edge-based object segmentation algorithm is applied to the contours to partition the predetermined objects. According to the motion of the objects, the proposed algorithm can then automatically estimate and partition the objects in subsequent video frames. The proposed algorithm is also robust against moving cameras. Experimental results show that the proposed algorithm can precisely partition and track multiple video objects.
    Relation: Systems, Man and Cybernetics, 2005 IEEE International Conference on, Hawaii, USA, pp.1080-1085
    DOI: 10.1109/ICSMC.2005.1571289
    Appears in Collections: [Department of Electrical Engineering] Conference Papers
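The abstract describes a pipeline that starts from a morphological edge detector and carries objects forward between frames via trajectory estimation. As a rough illustrative sketch only (the paper's modified detector and estimator are not specified in the abstract; all function names and parameters below are assumptions), the classic morphological gradient, dilation minus erosion, outlines object contours, and a constant-velocity extrapolation predicts an object's centroid in the next frame:

```python
import numpy as np
from scipy import ndimage

def morphological_gradient(frame, size=3):
    """Sketch contours as dilation minus erosion (morphological gradient).

    A rough stand-in for the paper's modified morphological edge
    detector, whose exact construction is not given in the abstract.
    """
    dilated = ndimage.grey_dilation(frame, size=(size, size))
    eroded = ndimage.grey_erosion(frame, size=(size, size))
    return dilated - eroded  # large values on object boundaries

def predict_centroid(prev_centroid, curr_centroid):
    """Constant-velocity trajectory estimate (illustrative assumption):
    extrapolate an object's centroid one frame ahead."""
    py, px = prev_centroid
    cy, cx = curr_centroid
    return (2 * cy - py, 2 * cx - px)

# Toy frame: a bright square on a dark background.
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:6, 2:6] = 255
edges = morphological_gradient(frame)
# The gradient fires on the square's boundary, and stays zero both in
# the square's interior and in the flat background.
```

In a full tracker, the predicted centroid would seed the edge-based segmentation in the following frame, which is how the abstract's "estimate and partition the objects in following video frames" step could avoid re-detecting every object from scratch.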

    Files in This Item:

    File | Size | Format | Views
    index.html | 0Kb | HTML | 284
    Robust Multiple Targets Tracking Using Object Segmentation and Trajectory Estimation in Video.pdf | 727Kb | Adobe PDF | 275

    All items in this institutional repository are protected by the original copyright.
