Stereo matching algorithm based on edge detection and attention mechanism

Affiliation:

School of Mechanical Engineering, Jiangsu University, Zhenjiang 212000

CLC number:

TP751.2

Abstract:

With the continuous progress of deep learning theory, end-to-end stereo matching networks have achieved remarkable results in fields such as autonomous driving and depth sensing. However, even state-of-the-art stereo matching algorithms still fail to accurately recover the edge contours of objects. To improve the accuracy of disparity prediction, this study proposes a stereo matching algorithm based on edge detection and an attention mechanism. The algorithm learns disparity information from stereo image pairs and supports end-to-end multi-task prediction of the disparity map and the edge map. To make full use of the edge information learned by the 2D feature extraction network, a new edge detection branch and a multi-feature fusion matching cost volume are introduced. The results show that the proposed edge detection scheme helps to improve the accuracy of disparity estimation: the resulting disparity maps achieve a mismatch rate of 1.75% on the KITTI 2015 benchmark, and compared with the pyramid stereo matching network (PSMNet), disparity accuracy is improved by 12% while the running time is reduced by about 20%.
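
The abstract only outlines the architecture, so the following is a minimal sketch, in PyTorch, of how components of the kind it describes could be assembled: a cost volume that fuses left features, right features, and edge features; a channel-attention weighting over the fused volume; and standard soft-argmin disparity regression. This is not the authors' code: the function and module names, channel layout, and the PSMNet-style concatenation fusion are assumptions made purely for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F


def build_fusion_cost_volume(left_feat, right_feat, left_edge, max_disp):
    # Concatenation-style cost volume (PSMNet-like, assumed here): for each
    # candidate disparity d, stack the left features, the right features
    # shifted by d, and the left-view edge features, so that the subsequent
    # 3D aggregation can exploit both appearance and edge cues.
    b, c, h, w = left_feat.shape
    e = left_edge.shape[1]
    volume = left_feat.new_zeros(b, 2 * c + e, max_disp, h, w)
    for d in range(max_disp):
        if d == 0:
            volume[:, :c, d] = left_feat
            volume[:, c:2 * c, d] = right_feat
            volume[:, 2 * c:, d] = left_edge
        else:
            volume[:, :c, d, :, d:] = left_feat[..., d:]
            volume[:, c:2 * c, d, :, d:] = right_feat[..., :-d]
            volume[:, 2 * c:, d, :, d:] = left_edge[..., d:]
    return volume


class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style channel attention (one common way to add
    # an attention mechanism): re-weight the channels of the fused 5D volume
    # so that the more informative feature groups dominate matching.
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid())

    def forward(self, x):                              # x: (B, C, D, H, W)
        w = self.fc(x.mean(dim=(2, 3, 4)))             # global average pool
        return x * w.view(x.size(0), -1, 1, 1, 1)


def disparity_regression(cost, max_disp):
    # Standard soft-argmin regression (as in GC-Net / PSMNet): softmax over
    # the disparity dimension, then take the expectation to obtain a
    # sub-pixel disparity map.
    prob = F.softmax(-cost, dim=1)                     # cost: (B, D, H, W)
    disp = torch.arange(max_disp, dtype=cost.dtype, device=cost.device)
    return (prob * disp.view(1, max_disp, 1, 1)).sum(dim=1)

In a full network, the edge features would come from the edge detection branch, the attention-weighted volume would be aggregated by 3D convolutions into a single-channel cost per disparity, and the regressed disparity map would be supervised jointly with the edge map (for example, a smooth L1 loss for disparity plus a binary cross-entropy loss for edges); those choices are likewise assumptions of this sketch rather than details stated in the abstract.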

Cite this article:

Yu Xuefei, Gu Jinan, Huang Zedong, Jing Caixia. Stereo matching algorithm based on edge detection and attention mechanism[J]. Electronic Measurement Technology, 2022, 45(11): 167-172.

History
  • Online publication date: 2024-04-25