RC Version 7.0 © Powered By DSPACE, MIT. Enhanced by NTU Library & TKU Library IR team.


    Please use the permanent URL to cite or link to this item: http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/74711


    Title: 基於影像特徵點之全方位視覺里程計的設計
    Other Titles: Image feature-based omni-directional visual odometry design
    Authors: 羅一喬;Luo, I-Chiao
    Contributors: Master's Program, Department of Electrical Engineering, Tamkang University
    翁慶昌;Wong, Ching-Chang
    Keywords: Visual odometry;Omni-directional vision system;Feature matching;SURF
    Date: 2011
    Upload Date: 2011-12-28 19:22:09 (UTC+8)
    Abstract: An image-feature-based omni-directional visual odometry system is designed and implemented in this thesis. The work has four parts: (1) distance-model construction, (2) feature extraction, (3) feature matching, and (4) visual odometry output. For the distance model, a rational-function interpolation method replaces the traditional calibration approach; it reduces the number of sampling points required and yields a pixel-accurate distance model for the omni-directional vision system. For feature extraction, the SURF (Speeded-Up Robust Features) algorithm is used to extract environmental features from each frame, because it detects a large number of robust features. For feature matching, a new method called the main dimensional priority search is proposed; compared with the k-dimensional (k-d) tree search, it removes the tree-building step, so that matching speed increases by 450%. For the odometry output, a motion estimation step computes the robot's relative motion from the matched features. Experimental results show that the overall translation and rotation performance of the proposed visual odometry is better than that of traditional wheeled odometry; moreover, the visual odometry is insensitive to terrain and wheel friction, so it can replace wheeled odometry as a source of sensing information for sensor fusion.
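The thesis's own implementation of the main dimensional priority search is not shown on this page, so the sketch below is only one plausible reading of the idea: pick the descriptor dimension with the largest variance as the "main" dimension, keep reference descriptors sorted along it, and scan outward from a query's position, pruning once the one-coordinate gap alone exceeds the best Euclidean distance found. The variance-based dimension choice and the pruning rule are assumptions, not the author's published algorithm; no tree is built, which is the property the abstract credits for the speedup.

```python
import bisect
import math
import random

def main_dimension(descriptors):
    """Pick the dimension with the largest variance as the 'main' dimension (assumed heuristic)."""
    n = len(descriptors)
    best_dim, best_var = 0, -1.0
    for d in range(len(descriptors[0])):
        vals = [v[d] for v in descriptors]
        mean = sum(vals) / n
        var = sum((x - mean) ** 2 for x in vals) / n
        if var > best_var:
            best_dim, best_var = d, var
    return best_dim

def build_index(descriptors):
    """Sort descriptor indices along the main dimension; no tree is built."""
    d = main_dimension(descriptors)
    order = sorted(range(len(descriptors)), key=lambda i: descriptors[i][d])
    keys = [descriptors[i][d] for i in order]
    return d, order, keys

def match(query, descriptors, index):
    """Exact nearest neighbour via outward scan along the main dimension."""
    d, order, keys = index
    qd = query[d]
    pos = bisect.bisect_left(keys, qd)
    best_i, best = -1, float("inf")
    lo, hi = pos - 1, pos
    while lo >= 0 or hi < len(keys):
        # Advance the frontier that is closer to the query along the main dimension.
        if hi >= len(keys) or (lo >= 0 and qd - keys[lo] <= keys[hi] - qd):
            i, gap, lo = lo, qd - keys[lo], lo - 1
        else:
            i, gap, hi = hi, keys[hi] - qd, hi + 1
        # The one-coordinate gap lower-bounds the Euclidean distance, so once
        # the nearer frontier exceeds the best match, no better match remains.
        if gap > best:
            break
        dist = math.dist(query, descriptors[order[i]])
        if dist < best:
            best, best_i = dist, order[i]
    return best_i, best

# Demo: 200 random 8-D descriptors, 20 random queries.
random.seed(0)
bank = [[random.random() for _ in range(8)] for _ in range(200)]
index = build_index(bank)
queries = [[random.random() for _ in range(8)] for _ in range(20)]
results = [match(q, bank, index) for q in queries]
```

Because the pruning bound is a true lower bound, the search stays exact while skipping the k-d tree construction phase entirely.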
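For the motion-estimation step, a standard way to recover a planar robot's relative motion from matched feature positions is a least-squares 2D rigid alignment (a 2D Kabsch solve). The thesis's exact formulation is not given on this page; the function names and point values below are illustrative only.

```python
import math

def rigid_motion_2d(prev_pts, curr_pts):
    """Least-squares rotation theta and translation (tx, ty) mapping prev_pts onto curr_pts."""
    n = len(prev_pts)
    # Centroids of both point sets.
    cx1 = sum(p[0] for p in prev_pts) / n
    cy1 = sum(p[1] for p in prev_pts) / n
    cx2 = sum(p[0] for p in curr_pts) / n
    cy2 = sum(p[1] for p in curr_pts) / n
    # Accumulate the 2-D cross-covariance terms of the centred points.
    s_cos = s_sin = 0.0
    for (x1, y1), (x2, y2) in zip(prev_pts, curr_pts):
        ax, ay = x1 - cx1, y1 - cy1
        bx, by = x2 - cx2, y2 - cy2
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    # Translation follows from mapping the first centroid onto the second.
    tx = cx2 - (cx1 * math.cos(theta) - cy1 * math.sin(theta))
    ty = cy2 - (cx1 * math.sin(theta) + cy1 * math.cos(theta))
    return theta, tx, ty

# Demo: recover a known motion (theta = 0.3 rad, t = (1.5, -0.7)) from four matches.
theta0, tx0, ty0 = 0.3, 1.5, -0.7
prev = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)]
curr = [(x * math.cos(theta0) - y * math.sin(theta0) + tx0,
         x * math.sin(theta0) + y * math.cos(theta0) + ty0) for x, y in prev]
theta, tx, ty = rigid_motion_2d(prev, curr)
```

Accumulating (theta, tx, ty) frame to frame gives the odometry output; unlike wheel encoders, this estimate does not degrade with wheel slip, which matches the abstract's terrain-insensitivity claim.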
    Appears in Collections: [Department of Electrical Engineering] Theses & Dissertations

    Files in This Item:

    File        Size  Format  Views
    index.html  0Kb   HTML    168   View/Open

    All items in this institutional repository are protected by the original copyright.

