
    Please use this permanent URL to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/81366

    Title: Memory-efficient hardware architecture of 2-D dual-mode lifting-based discrete wavelet transform
    Authors: Hsia, Chih-Hsien;Chiang, Jen-Shiun;Guo, Jing-Ming
    Contributors: Department of Electrical Engineering, Tamkang University
    Keywords: 2-D dual-mode lifting-based discrete wavelet transform;JPEG2000;low-transpose memory
    Date: 2013-04
    Uploaded: 2013-03-11 13:43:17 (UTC+8)
    Publisher: Piscataway: Institute of Electrical and Electronics Engineers
    Abstract: Memory requirements (for storing intermediate signals) and critical path delay are essential issues for two-dimensional (2-D) (or multi-dimensional) transforms. This work presents new algorithms and hardware architectures to address these issues in a 2-D dual-mode (supporting 5/3 lossless and 9/7 lossy coding) Lifting-based Discrete Wavelet Transform (LDWT). The proposed 2-D dual-mode LDWT architecture has the merits of low transpose memory, low latency, and regular signal flow, making it well suited to VLSI implementation. The transpose memory requirements of the N×N 2-D 5/3 mode LDWT and 2-D 9/7 mode LDWT are 2N and 4N, respectively. Comparison results indicate that the proposed hardware architecture requires a smaller transpose memory than previous lifting-based architectures. As a result, it can be applied to real-time visual operations such as JPEG2000, Motion-JPEG2000, MPEG-4 still texture object decoding, and wavelet-based scalable video coding (SVC) applications.
    Relation: IEEE Transactions on Circuits and Systems for Video Technology 23(4), pp.671-683
    DOI: 10.1109/TCSVT.2012.2211953
    Appears in Collections: [Department of Electrical Engineering] Journal Articles
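The abstract's 5/3 lossless mode refers to the reversible integer lifting scheme standardized in JPEG2000 (predict then update). As background, here is a minimal Python sketch of one 1-D decomposition level of those lifting steps with symmetric boundary extension; it illustrates the transform the paper's hardware computes, not the paper's memory-efficient 2-D architecture itself. Function names and the even-length-input assumption are ours.

```python
def lifting_53_forward(x):
    """One level of the reversible 5/3 lifting DWT (JPEG2000-style),
    with symmetric boundary extension; assumes even-length input."""
    n = len(x)
    # Predict step: detail (high-pass) coefficients from odd samples
    d = []
    for i in range(n // 2):
        left = x[2 * i]
        right = x[2 * i + 2] if 2 * i + 2 < n else x[2 * i]  # mirror at edge
        d.append(x[2 * i + 1] - (left + right) // 2)
    # Update step: approximation (low-pass) coefficients from even samples
    s = []
    for i in range(n // 2):
        dl = d[i - 1] if i > 0 else d[i]  # mirror at edge
        s.append(x[2 * i] + (dl + d[i] + 2) // 4)
    return s, d

def lifting_53_inverse(s, d):
    """Undo the lifting steps in reverse order; exactly lossless."""
    n = 2 * len(s)
    x = [0] * n
    for i in range(len(s)):  # undo update
        dl = d[i - 1] if i > 0 else d[i]
        x[2 * i] = s[i] - (dl + d[i] + 2) // 4
    for i in range(len(d)):  # undo predict
        left = x[2 * i]
        right = x[2 * i + 2] if 2 * i + 2 < n else x[2 * i]
        x[2 * i + 1] = d[i] + (left + right) // 2
    return x
```

Because each lifting step adds or subtracts the same integer quantity in forward and inverse directions, the round trip reconstructs the input exactly, which is what makes the 5/3 mode suitable for lossless coding.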




