Tamkang University Institutional Repository: Item 987654321/118139
    Please use this permanent URL to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/118139


    Title: Visually Guided Picking Control of an Omnidirectional Mobile Manipulator Based on End-to-End Imitation Learning
    Authors: Tsai, Chi-Yi;Huang, Chien-Che;Lai, Yu-Cheng;Chou, Yung-Shan;Wong, Ching-Chang
    Keywords: Omnidirectional mobile manipulator;visually guided picking control;deep learning;multi-task imitation learning;end-to-end control
    Date: 2020-01-06
    Date uploaded: 2020-02-25 12:11:00 (UTC+8)
    Publisher: IEEE
    Abstract: In this paper, a novel deep convolutional neural network (CNN) based high-level multi-task
    control architecture is proposed to address the visual guide-and-pick control problem of an omnidirectional
    mobile manipulator platform based on deep learning technology. The proposed mobile manipulator control
    system only uses a stereo camera as a sensing device to accomplish the visual guide-and-pick control
    task. After the stereo camera captures the stereo image of the scene, the proposed CNN-based high-level
    multi-task controller can directly predict the best motion guidance and picking action of the omnidirectional
    mobile manipulator by using the captured stereo image. In order to collect the training dataset, we manually
    controlled the mobile manipulator to navigate in an indoor environment for approaching and picking
    up an object-of-interest (OOI). In the meantime, we recorded all of the captured stereo images and the
    corresponding control commands of the robot during the manual teaching stage. In the training stage,
    we employed the end-to-end multi-task imitation learning technique to train the proposed CNN model by
    learning the desired motion and picking control strategies from prior expert demonstrations for visually
    guiding the mobile platform and then visually picking up the OOI. Experimental results show that the
    proposed visually guided picking control system achieves a picking success rate of about 78.2% on average.
    Relation: IEEE Access 8, p.1882-1891
    DOI: 10.1109/ACCESS.2019.2962335
    Appears in Collections: [Department of Electrical Engineering] Journal Articles
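
    The abstract describes a shared CNN that maps a captured stereo image to both a motion-guidance command for the omnidirectional base and a picking action, trained end-to-end with multi-task imitation learning from manually recorded expert demonstrations. The sketch below (PyTorch) illustrates that general setup only; the layer sizes, head definitions, command format (vx, vy, yaw rate), and loss weighting are assumptions for illustration, not the architecture or training details reported in the paper.

import torch
import torch.nn as nn

class MultiTaskPickingNet(nn.Module):
    """Illustrative multi-task network: one shared backbone, two task heads."""

    def __init__(self, num_pick_actions: int = 2):
        super().__init__()
        # Shared backbone over a stacked stereo pair (2 RGB images -> 6 channels).
        self.backbone = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head 1: continuous motion guidance (assumed vx, vy, yaw rate) for the omni base.
        self.motion_head = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 3))
        # Head 2: discrete picking decision (e.g., keep approaching vs. trigger pick).
        self.pick_head = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, num_pick_actions))

    def forward(self, stereo_pair: torch.Tensor):
        features = self.backbone(stereo_pair)
        return self.motion_head(features), self.pick_head(features)

def behavior_cloning_step(model, optimizer, stereo_pair, expert_motion, expert_pick, pick_weight=1.0):
    """One imitation-learning update: regress the expert motion command (MSE) and
    classify the expert picking action (cross-entropy) from the same stereo input."""
    pred_motion, pred_pick_logits = model(stereo_pair)
    loss = nn.functional.mse_loss(pred_motion, expert_motion) \
        + pick_weight * nn.functional.cross_entropy(pred_pick_logits, expert_pick)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = MultiTaskPickingNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    # Dummy batch standing in for recorded (stereo image, expert command) pairs.
    stereo = torch.randn(4, 6, 120, 160)
    expert_motion = torch.randn(4, 3)          # expert (vx, vy, yaw rate)
    expert_pick = torch.randint(0, 2, (4,))    # expert pick / no-pick label
    print(behavior_cloning_step(model, optimizer, stereo, expert_motion, expert_pick))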

    Files in this item:

    index.html (0Kb, HTML)
