    Please use this permanent URL to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/111398


    Title: 運用快速偵測法於視覺里程計
    Other title: Fast detection algorithms for visual odometry
    Authors: 姚琮獻; Yao, Tsung-Hsien
    Contributors: 淡江大學機械與機電工程學系碩士班 (Master's Program, Department of Mechanical and Electro-Mechanical Engineering, Tamkang University)
    Advisor: 孫崇訓; Sun, Chung-Hsun
    Keywords: visual odometry (VO); upright speeded-up robust features (U-SURF); parameterized perspective-three-point (parameterized P3P); randomized random sample consensus (R-RANSAC); ordinal skip box filter
    Date: 2016
    Uploaded: 2017-08-24 23:51:48 (UTC+8)
    Abstract: This thesis applies several algorithms to improve the real-time performance of a visual odometry (VO) system. First, images are captured with a binocular stereo camera, and landmark features are detected with the upright speeded-up robust features (U-SURF) algorithm. An ordinal skip box filter is added to U-SURF to reduce the number of image convolutions and the computation time. The possible camera positions are then recovered with a parameterized perspective-three-point (parameterized P3P) algorithm, and the randomized random sample consensus (R-RANSAC) algorithm is used to reject erroneous position solutions. Finally, the improvement is demonstrated in ground-truth experiments.
    Appears in Collections: [Department and Graduate Institute of Mechanical and Electro-Mechanical Engineering] Theses and Dissertations
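
    The abstract describes a stereo VO pipeline: U-SURF feature detection, 3D landmark triangulation, pose recovery with a P3P solver, and outlier rejection with RANSAC. The following is a minimal sketch of an analogous pipeline using standard OpenCV calls; it is not the thesis implementation. The function name vo_step, the thresholds, and the calibration inputs are illustrative assumptions, and the thesis-specific components (the ordinal skip box filter, the parameterized P3P solver, and R-RANSAC) are replaced here by OpenCV's stock SURF, SOLVEPNP_P3P, and plain RANSAC.

    # A minimal sketch (assumptions noted above), not the thesis implementation.
    import numpy as np
    import cv2

    def vo_step(left_prev, right_prev, left_curr, K, P_left, P_right):
        """Estimate camera motion between two frames of a calibrated stereo rig.

        left_prev, right_prev, left_curr : grayscale images (np.uint8)
        K                : 3x3 intrinsic matrix of the left camera
        P_left, P_right  : 3x4 projection matrices of the stereo pair
        """
        # U-SURF: SURF with rotation invariance disabled (upright=True), which
        # skips orientation assignment and speeds up detection/description.
        surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400, upright=True)

        kp_l, des_l = surf.detectAndCompute(left_prev, None)
        kp_r, des_r = surf.detectAndCompute(right_prev, None)
        kp_c, des_c = surf.detectAndCompute(left_curr, None)

        matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)

        # Stereo matches in the previous frame -> triangulate 3D landmarks.
        stereo = matcher.match(des_l, des_r)
        pts_l = np.float32([kp_l[m.queryIdx].pt for m in stereo]).T   # 2xN
        pts_r = np.float32([kp_r[m.trainIdx].pt for m in stereo]).T   # 2xN
        X_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)    # 4xN homogeneous
        X = (X_h[:3] / X_h[3]).T                                      # Nx3 landmarks

        # Temporal matches: previous-left descriptors -> current-left frame.
        des_stereo = des_l[[m.queryIdx for m in stereo]]
        temporal = matcher.match(des_stereo, des_c)
        obj = np.float32([X[m.queryIdx] for m in temporal])           # 3D points
        img = np.float32([kp_c[m.trainIdx].pt for m in temporal])     # 2D projections

        # P3P hypotheses scored inside RANSAC (the thesis instead uses a
        # parameterized P3P solver with randomized RANSAC, R-RANSAC).
        ok, rvec, tvec, inliers = cv2.solvePnPRansac(
            obj, img, K, None, flags=cv2.SOLVEPNP_P3P,
            reprojectionError=3.0, iterationsCount=100)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)
        return R, tvec, inliers

    The sketch only covers one frame-to-frame step; a full VO system would chain the per-step poses and, as in the abstract, replace the stock box-filter convolutions in SURF with the ordinal skip scheme to cut computation time.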

    Files in this item:

    File          Description    Size    Format    Views
    index.html                   0Kb     HTML      132
