    Please use this permanent URL to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/81366

    Title: Memory-efficient hardware architecture of 2-D dual-mode lifting-based discrete wavelet transform
    Authors: Hsia, Chih-Hsien; Chiang, Jen-Shiun; Guo, Jing-Ming
    Contributors: Department of Electrical Engineering, Tamkang University
    Keywords: 2-D dual-mode lifting-based discrete wavelet transform; JPEG2000; low-transpose memory
    Date: 2013-04
    Upload time: 2013-03-11 13:43:17 (UTC+8)
    Publisher: Piscataway: Institute of Electrical and Electronics Engineers
    Abstract: Memory requirements (for storing intermediate signals) and critical-path delay are essential issues for two-dimensional (2-D) (or multi-dimensional) transforms. This work presents new algorithms and hardware architectures to address these issues in the 2-D dual-mode (supporting 5/3 lossless and 9/7 lossy coding) Lifting-based Discrete Wavelet Transform (LDWT). The proposed 2-D dual-mode LDWT architecture has the merits of low transpose memory, low latency, and a regular signal flow, making it well suited for VLSI implementation. The transpose memory requirements of the N×N 2-D 5/3-mode LDWT and 2-D 9/7-mode LDWT are 2N and 4N, respectively. Comparison results indicate that the proposed hardware architecture has a lower transpose memory requirement than previous architectures. As a result, it can be applied to real-time visual applications such as JPEG2000, Motion-JPEG2000, MPEG-4 still texture object decoding, and wavelet-based scalable video coding (SVC).
    Relation: IEEE Transactions on Circuits and Systems for Video Technology 23(4), pp. 671-683
    DOI: 10.1109/TCSVT.2012.2211953
    Appears in Collections: [Department of Electrical Engineering] Journal Articles
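    For context (this is background on the standard, not the paper's proposed architecture): the "5/3 lossless" mode named in the abstract is the reversible integer lifting scheme of JPEG2000, built from a predict step d[k] = x[2k+1] - floor((x[2k] + x[2k+2])/2) and an update step s[k] = x[2k] + floor((d[k-1] + d[k] + 2)/4). A minimal 1-D sketch, assuming an even-length input and the standard symmetric boundary extension:

    ```python
    def dwt53_forward(x):
        """One level of the reversible 5/3 lifting DWT on an even-length
        integer signal, with symmetric boundary extension (x[n] := x[n-2],
        d[-1] := d[0]). Returns (lowpass s, highpass d) subbands.
        """
        n = len(x)
        assert n >= 2 and n % 2 == 0
        half = n // 2
        # Predict step: d[k] = x[2k+1] - floor((x[2k] + x[2k+2]) / 2)
        d = [x[2*k + 1] - (x[2*k] + x[min(2*k + 2, n - 2)]) // 2
             for k in range(half)]
        # Update step: s[k] = x[2k] + floor((d[k-1] + d[k] + 2) / 4)
        s = [x[2*k] + (d[max(k - 1, 0)] + d[k] + 2) // 4
             for k in range(half)]
        return s, d

    def dwt53_inverse(s, d):
        """Exact integer inverse of dwt53_forward (lossless round trip)."""
        half = len(s)
        n = 2 * half
        x = [0] * n
        for k in range(half):   # undo the update step: recover even samples
            x[2*k] = s[k] - (d[max(k - 1, 0)] + d[k] + 2) // 4
        for k in range(half):   # undo the predict step: recover odd samples
            x[2*k + 1] = d[k] + (x[2*k] + x[min(2*k + 2, n - 2)]) // 2
        return x
    ```

    A 2-D transform applies this 1-D lifting first along rows and then along columns; the intermediate row-transformed coefficients are what the "transpose memory" discussed in the abstract must buffer, which is the quantity the paper's architecture reduces to 2N (5/3 mode) and 4N (9/7 mode).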




