    Please use this identifier to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/94445

    Title: 以樣本為基礎之超解析技術
    Other Titles: Example-based super resolution technique
    Authors: 鐘文德;Chung, Wen-Te
    Contributors: Master's Program, Department of Computer Science and Information Engineering, Tamkang University
    Keywords: super-resolution;neighbor embedding;PSNR;SSIM
    Date: 2013
    Issue Date: 2014-01-23 14:39:19 (UTC+8)
    Abstract: This thesis proposes an improved version of the neighbor-embedding-based super-resolution algorithm of Chang et al. Instead of using the Euclidean distance, as Chang et al. do, to find the K nearest neighbors, we define a similarity measure and use it to find the K most similar neighbors. Because the similarity measure incorporates the standard deviations of the derivatives within the two patches, it finds more suitable neighbors. In addition, we improve the way the linear-combination coefficients are chosen, which effectively improves the overall super-resolution result.
    In this paper, we propose an improved version of the neighbor-embedding-based super-resolution algorithm of Chang et al. Unlike Chang's method, which uses the Euclidean distance to find the K nearest neighbors of a low-resolution patch, we define a similarity function and use it to find the K most similar neighbors. In addition, we use a different set of weights when taking a linear combination of the high-resolution patches corresponding to the selected K neighbors. Although the weights used by Chang minimize the error they defined, the high-resolution images reconstructed by our method have higher PSNR and SSIM than those produced by Chang's method.
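    The abstract describes the standard neighbor-embedding pipeline: find the K nearest (or most similar) low-resolution training patches, compute linear-combination weights that best reconstruct the input LR patch from those neighbors, and apply the same weights to the paired high-resolution patches. The sketch below shows only the baseline Chang et al. step, assuming Euclidean-distance neighbor search and LLE-style sum-to-one least-squares weights; the thesis itself replaces the distance with a similarity measure involving derivative standard deviations and modifies the weights, neither of which is specified here. All function and variable names are illustrative:

    ```python
    import numpy as np

    def neighbor_embedding_sr(lr_patch, lr_dict, hr_dict, k=5):
        """Reconstruct an HR patch from an LR patch via neighbor embedding.

        lr_patch : (d,)  feature vector of the input low-resolution patch
        lr_dict  : (n, d) features of the training low-resolution patches
        hr_dict  : (n, D) corresponding high-resolution patches (flattened)
        """
        # 1. Find the K nearest LR neighbors by Euclidean distance
        #    (the baseline criterion of Chang et al.).
        dists = np.linalg.norm(lr_dict - lr_patch, axis=1)
        idx = np.argsort(dists)[:k]
        neighbors = lr_dict[idx]                     # (k, d)

        # 2. Solve for combination weights that best reconstruct the LR
        #    patch from its neighbors, subject to sum(w) = 1 (as in LLE).
        G = neighbors - lr_patch
        C = G @ G.T                                  # local Gram matrix
        C += np.eye(k) * 1e-6 * np.trace(C)          # regularize if singular
        w = np.linalg.solve(C, np.ones(k))
        w /= w.sum()

        # 3. Apply the same weights to the paired HR patches.
        return w @ hr_dict[idx]
    ```

    In a full implementation this runs over overlapping patches of the input image, with the overlaps averaged; the small diagonal term keeps the Gram matrix invertible when K exceeds the patch feature dimension.
    
    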
    Appears in Collections: [Department of Computer Science and Information Engineering] Theses

