    Please use this identifier to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/34957

    Title: 以亮度變化分析為基礎偵測場景轉換
    Other Titles: Scene change detection based on illumination analysis
    Authors: 戴良光;Dai, Liang-kuang
    Contributors: 淡江大學資訊工程學系碩士班
    林慧珍;Lin, Hwei-jen
    Keywords: Scene change detection;Scene change;Shot detection;Shot boundary
    Date: 2008
    Issue Date: 2010-01-11 05:49:25 (UTC+8)
    Abstract: 在網際網路發達的今日,影像和視訊資料快速地充斥在多媒體資訊網路上。對於多媒體資訊而言,特別是視訊資料,具有最多樣的資訊、最豐富的內容,集最多可能的應用。
    對於視訊資料的組織和索引而言,場景切割是不可或缺的工作。一般而言,場景片段通常被視作組成視訊資料的基本單位,因此場景轉換偵測(video-shot-boundary detection)是提供我們做視訊摘要、檢索、瀏覽和甚至更高階視訊切割的基礎,而場景切割的主要目的在於修補、修改、剪輯等方面。
    In recent years, owing to the rapid progress of computer science, the Internet has been flooded with multimedia data, especially video, which carries the greatest variety of information. For organizing and indexing video, shot-boundary detection is essential. In general, the shot is regarded as the basic unit of video, and video-shot-boundary detection underlies video summarization, retrieval, and browsing. In this system, we use illumination-change data in the HSV color space to detect shot boundaries and to determine the shot-change type. In the experimental results, the most common shot-change types, namely cut, fade, and dissolve, are detected with a high degree of accuracy.
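    The abstract describes detecting shot boundaries from illumination change in the HSV color space. As a minimal sketch only (not the thesis's actual algorithm; frame format, threshold value, and function names are illustrative assumptions), an abrupt cut can be flagged by thresholding the change in mean brightness (the V channel of HSV, where V = max(R, G, B)) between consecutive frames:

    ```python
    # Hedged sketch of HSV-brightness-based cut detection.
    # Frames are assumed to be lists of (R, G, B) tuples with values in [0, 1].

    def mean_v(frame):
        """Mean of the HSV V channel over a frame; V = max(R, G, B) per pixel."""
        return sum(max(pixel) for pixel in frame) / len(frame)

    def detect_cuts(frames, threshold=0.3):
        """Return frame indices where the mean-V jump from the previous
        frame exceeds the threshold, flagging a candidate abrupt cut."""
        vs = [mean_v(f) for f in frames]
        return [i for i in range(1, len(vs))
                if abs(vs[i] - vs[i - 1]) > threshold]

    # Toy example: three dark frames followed by three bright frames.
    dark = [(0.1, 0.1, 0.1)] * 4
    bright = [(0.9, 0.9, 0.9)] * 4
    frames = [dark] * 3 + [bright] * 3
    print(detect_cuts(frames))  # -> [3], one candidate cut at the transition
    ```

    Gradual transitions such as fades and dissolves produce a slow, sustained drift in V rather than a single jump, so distinguishing them (as the thesis does) would require examining the illumination trend over a window of frames rather than a single frame-to-frame difference.
    
    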
    Appears in Collections: [Department of Computer Science and Information Engineering] Theses & Dissertations

    All items in this institutional repository are protected by copyright, with all rights reserved.

    DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard / Enhanced by NTU Library & TKU Library IR teams.