    Please use this identifier to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/123442


    Title: Distilling One-Stage Object Detection Network via Decoupling Foreground and Background Features
    Authors: Tsai, San-Hao;Yen, Shwu-Huey;Lin, Hwei-Jen
    Keywords: Knowledge Distillation;Decoupling Features
    Date: 2022-10-24
    Issue Date: 2023-04-28 18:08:07 (UTC+8)
    Abstract: When deploying an object detection system in an application, the size of the system is one of the key issues. We distill a teacher's knowledge into a small-scale student network, where the teacher is a full-scale architecture with good performance. In addition to the KL-divergence loss, we propose a cosine similarity loss on foreground features that encourages the student to learn the direction of the teacher's features, leading to efficient and robust learning from the teacher model. We also propose an adaptive learning criterion under which the student model learns from the teacher only when the teacher performs better than the student. The proposed student model achieves an improvement of 34.85% on ResNet34 and 39.67% on ResNet18 when the teacher model uses ResNet50.
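    The abstract describes two ingredients: a cosine similarity loss applied only to foreground feature locations, and an adaptive criterion that gates distillation on the teacher outperforming the student. A minimal NumPy sketch of both ideas follows; the function names, tensor shapes, and the exact gating metric are assumptions for illustration, not the paper's implementation.

    ```python
    import numpy as np

    def cosine_fg_loss(student_feat, teacher_feat, fg_mask, eps=1e-8):
        """Cosine-similarity loss restricted to foreground locations (sketch).

        student_feat, teacher_feat: (C, H, W) feature maps.
        fg_mask: (H, W) binary mask, 1 where a ground-truth object appears.
        """
        # Per-location cosine similarity along the channel axis.
        num = (student_feat * teacher_feat).sum(axis=0)
        denom = (np.linalg.norm(student_feat, axis=0)
                 * np.linalg.norm(teacher_feat, axis=0) + eps)
        cos = num / denom  # in [-1, 1]; 1 means directions match exactly
        # (1 - cos) vanishes when the student's feature direction matches the
        # teacher's; average it over foreground locations only.
        return float(((1.0 - cos) * fg_mask).sum() / max(fg_mask.sum(), 1.0))

    def adaptive_weight(teacher_metric, student_metric):
        """Adaptive criterion (assumed form): distill only when the teacher
        currently performs better than the student on some chosen metric."""
        return 1.0 if teacher_metric > student_metric else 0.0
    ```

    In training, the gated distillation term would be added to the usual detection and KL-divergence losses, e.g. `total = det_loss + kl_loss + adaptive_weight(t, s) * cosine_fg_loss(...)`.
    
    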
    Relation: Proceedings of CICET 2022
    Appears in Collections: [Department of Computer Science and Information Engineering] Conference Papers



