Please use this identifier to cite or link to this item:
https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/119176
Title: A passive approach for effective detection and localization of region-level video forgery with spatio-temporal coherence analysis
Authors: Lin, Cheng-Shian; Tsay, Jyh-Jong
Keywords: Passive video forgery detection; Temporal copy-and-paste; Exemplar-based texture synthesis; Spatio-temporal slice analysis; Region-level inpainting
Date: 2014-06
Issue Date: 2020-09-23 12:10:41 (UTC+8)
Abstract: In this paper, we present a passive approach for effective detection and localization of region-level forgery in video sequences that may contain camera motion. As most digital image/video capture devices do not have modules for embedding watermarks or signatures, passive forgery detection, which aims to detect traces of tampering without embedded information, has become the major focus of recent research. However, most current passive approaches either work only at the frame level and cannot localize region-level forgery, or suffer from high false detection rates when localizing tampered regions. In this paper, we investigate two common region-level inpainting methods for object removal, temporal copy-and-paste and exemplar-based texture synthesis, and propose a new approach based on spatio-temporal coherence analysis for detection and localization of tampered regions. Our approach can handle camera motion and the removal of multiple objects. Experiments show that our approach outperforms previous approaches and can effectively detect and localize regions tampered by temporal copy-and-paste and texture synthesis.
Relation: Digital Investigation 11(2), pp. 120–140
DOI: 10.1016/j.diin.2014.03.016
Appears in Collections: [Department of Computer Science and Information Engineering] Journal Articles
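
The abstract names spatio-temporal coherence analysis as the core of the detection step but, as a record page, gives no implementation details. The following is a minimal, hypothetical sketch (not the authors' code) of how per-block temporal coherence could be measured and thresholded to flag candidate tampered regions; the function names, block size, and robust-z threshold are illustrative assumptions only.

    # Illustrative sketch: flag blocks whose temporal behaviour is anomalously
    # coherent, since temporal copy-and-paste tends to replicate the same
    # pixels across consecutive frames. Assumed helper names and parameters.
    import numpy as np

    def block_temporal_coherence(frames, block=16):
        """frames: (T, H, W) grayscale video as a float array.
        Returns a (T-1, H//block, W//block) map of per-block mean absolute
        frame differences; values near zero mark suspiciously static blocks."""
        frames = np.asarray(frames, dtype=np.float64)
        diffs = np.abs(np.diff(frames, axis=0))          # (T-1, H, W)
        t, h, w = diffs.shape
        hb, wb = h // block, w // block
        diffs = diffs[:, :hb * block, :wb * block]
        # average |frame difference| within each block
        return diffs.reshape(t, hb, block, wb, block).mean(axis=(2, 4))

    def flag_suspicious_blocks(coh, z_thresh=2.0):
        """Mark blocks whose coherence deviates strongly (low side) from the
        per-frame median, a crude stand-in for spatio-temporal analysis."""
        med = np.median(coh, axis=(1, 2), keepdims=True)
        mad = np.median(np.abs(coh - med), axis=(1, 2), keepdims=True) + 1e-8
        z = (coh - med) / (1.4826 * mad)
        return z < -z_thresh   # unusually low motion energy -> candidate forgery

With frames loaded as a (T, H, W) grayscale array, flag_suspicious_blocks(block_temporal_coherence(frames)) yields a per-frame boolean map of candidate regions. The paper's actual method additionally handles camera motion and exemplar-based texture synthesis, which this sketch does not attempt.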
Files in This Item:
index.html (0Kb, HTML)
All items in the institutional repository are protected by copyright, with all rights reserved.