Tamkang University Institutional Repository: Item 987654321/123743


    Please use this persistent URL to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/123743


    Title: Eco-feller: Minimizing the Energy Consumption of Random Forest Algorithm by an Eco-pruning Strategy over MLC NVRAM
    Authors: Liang, Yu-Pei;Hsu, Yung-Han;Chen, Tseng-Yi;Chen, Shuo-Han;Wei, Hsin-Wen;Hsu, Tsan-sheng;Shih, Wei-Kuan
    Date: 2021-12-05
    Upload time: 2023-04-28 19:21:33 (UTC+8)
    Abstract: Random forest has been widely used for classifying objects because of its efficiency and accuracy. Meanwhile, non-volatile memory has been regarded as a promising candidate for hybrid memory architectures. To achieve higher accuracy, a random forest typically constructs many decision trees and then applies post-pruning methods that fell low-contribution trees, improving model accuracy and space utilization. However, write operations on non-volatile memory are costly, so writing the to-be-pruned trees into non-volatile memory wastes significant energy and time. This work proposes a framework that mitigates this cost when training a random forest model. The core idea is to evaluate the importance of each tree before constructing it, and then adopt different write modes when writing the trees to the non-volatile memory space (see the sketch after this record). The experimental results show that the proposed framework significantly reduces wasted energy while maintaining high accuracy.
    DOI: 10.1109/DAC18074.2021.9586164
    Appears in Collections: [電機工程學系暨研究所] Conference Papers
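
    The abstract describes evaluating a tree's importance before it is written to NVRAM and then choosing a write mode accordingly. The Python sketch below illustrates that flow under stated assumptions only: the importance proxy (held-out accuracy of a bootstrapped tree), the WriteMode names, and the 0.80 keep threshold are illustrative and are not taken from the paper.

    # A minimal sketch of the pre-construction pruning idea, not the authors' implementation.
    import numpy as np
    from enum import Enum
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    class WriteMode(Enum):
        SLC_LIKE = "cheap, low-energy write for trees unlikely to survive pruning"
        MLC_FULL = "dense, higher-energy write for trees expected to be kept"

    def estimate_tree_importance(X, y, rng, val_fraction=0.3):
        """Train a candidate tree on a bootstrap sample and score it on held-out
        data *before* deciding how (or whether) to persist it to NVM."""
        X_tr, X_val, y_tr, y_val = train_test_split(
            X, y, test_size=val_fraction, random_state=int(rng.integers(1 << 31)))
        idx = rng.integers(0, len(X_tr), len(X_tr))        # bootstrap sample
        tree = DecisionTreeClassifier(max_depth=8,
                                      random_state=int(rng.integers(1 << 31)))
        tree.fit(X_tr[idx], y_tr[idx])
        return tree, tree.score(X_val, y_val)              # accuracy as a proxy

    def build_forest(X, y, n_trees=50, keep_threshold=0.80, seed=0):
        rng = np.random.default_rng(seed)
        forest, write_log = [], []
        for _ in range(n_trees):
            tree, importance = estimate_tree_importance(X, y, rng)
            # Low-contribution trees get the cheap write mode (or are dropped)
            # instead of paying the full MLC write cost and later being felled.
            mode = WriteMode.MLC_FULL if importance >= keep_threshold else WriteMode.SLC_LIKE
            write_log.append(mode)
            if mode is WriteMode.MLC_FULL:
                forest.append(tree)
        return forest, write_log

    if __name__ == "__main__":
        X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
        forest, log = build_forest(X, y)
        kept = sum(m is WriteMode.MLC_FULL for m in log)
        print(f"kept {kept}/{len(log)} trees in full MLC mode")

    In this sketch the energy saving comes from deciding the write mode up front, so the expensive MLC writes are spent only on trees that are expected to remain in the final model.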

    Files in This Item:

    File: index.html (HTML, 0 KB, 86 views)

    All items in the institutional repository are protected by copyright, with all rights reserved.

