Please use this permanent URL to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/80814

    Title: A new video object segmentation algorithm using the morphological technique
    Authors: Chiang, Jen-Shiun;Chang, Chih-Yue;Hsia, Chih-Hsien;Chang, Wei-Hsuan
    Contributors: Department of Electrical Engineering, Tamkang University
    Keywords: Edge detection;mathematical morphology;video object segmentation
    Date: 2008
    Uploaded: 2013-03-07 14:01:13 (UTC+8)
    Publisher: New York: Institute of Electrical and Electronics Engineers (IEEE)
    Abstract: In this paper, we propose a new video object segmentation algorithm based on the morphological technique. Several video object segmentation algorithms use mathematical morphology to generate object masks; however, morphological operations have two drawbacks: (1) high computational complexity, and (2) the quality of the object masks depends on the chosen morphological structuring element. Many techniques speed up morphological operations through hardware implementation, but they do not address reducing the influence of the choice of structuring element. By adding a pre-processing mechanism, the proposed algorithm effectively reduces the influence of the chosen structuring element, exploiting the continuity of shape features and the number of morphological operations. Experimental results show that our algorithm improves both the speed of the mask-filling operations and the accuracy of segmentation.
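For readers unfamiliar with the operations the abstract refers to, the following minimal Python sketch shows binary erosion, dilation, and closing (dilation followed by erosion), and how closing fills a hole in an object mask. The function names and the 3x3 structuring element are illustrative assumptions; this is not the paper's implementation and does not include its pre-processing mechanism.

```python
# Minimal sketch of binary mathematical morphology on an object mask.
# Pure Python, no dependencies; illustrative only, NOT the paper's code.

def _probe(mask, y, x, se, require_all):
    # Test the structuring element `se` (a list of (dy, dx) offsets)
    # centered at (y, x); out-of-bounds pixels count as background.
    h, w = len(mask), len(mask[0])
    hits = (
        0 <= y + dy < h and 0 <= x + dx < w and mask[y + dy][x + dx] == 1
        for dy, dx in se
    )
    return all(hits) if require_all else any(hits)

def erode(mask, se):
    # A pixel survives only if the SE fits entirely inside the foreground.
    return [[1 if _probe(mask, y, x, se, True) else 0
             for x in range(len(mask[0]))] for y in range(len(mask))]

def dilate(mask, se):
    # A pixel turns on if the SE hits any foreground pixel.
    return [[1 if _probe(mask, y, x, se, False) else 0
             for x in range(len(mask[0]))] for y in range(len(mask))]

def close_mask(mask, se):
    # Closing = dilation then erosion; fills holes smaller than the SE.
    return erode(dilate(mask, se), se)

# Example: a 3x3 square structuring element (offsets from the center).
se3 = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

# A ring-shaped object mask with a one-pixel hole in the middle.
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

closed = close_mask(mask, se3)
print(closed[2][2])  # 1: the hole has been filled by the closing
```

A larger or differently shaped structuring element changes which holes are filled and how much the object boundary is distorted; this sensitivity of the mask quality to the chosen element is exactly what the abstract's pre-processing mechanism aims to reduce.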
    Relation: Ubi-Media Computing, 2008 First IEEE International Conference on, pp. 255-260
    DOI: 10.1109/UMEDIA.2008.4570899
    Appears in Collections: [Department of Electrical Engineering] Conference Papers


    File: A New Video Object Segmentation Algorithm using the Morphological Technique.pdf (214 KB, Adobe PDF, 216 views)


