Multi-view stereo reconstruction based on a gated recurrent depth range prediction network
Affiliation:

School of Automation and Electrical Engineering, Shenyang University of Technology, Shenyang 110159, China

CLC Number:

TP391

Abstract:

Three-dimensional reconstruction techniques struggle with high-resolution images, and the resulting point clouds often suffer from low accuracy and blurred boundaries. To address these problems, this paper proposes a multi-stage, multi-scale dynamic depth range prediction network based on gated recurrent units. First, a curvature-guided dynamic-scale convolutional network serves as the feature extraction module: by computing surface normal curvature at multiple scales, it obtains optimal per-pixel feature information. These fine features are then combined with a new depth range estimation module that dynamically predicts the depth range hypotheses for the next stage, better fusing information from neighboring pixels and achieving accurate matching between the reference image and the source images. The proposed network is compared with more than ten other methods. On the DTU dataset, its overall performance is 2.2% better than that of the second-best network; on the Tanks and Temples dataset, reconstruction quality on the Lighthouse, M60, and Panther scenes is substantially improved. Comparison and ablation experiments further demonstrate that the proposed dynamic depth prediction network significantly improves the accuracy and completeness of reconstructed point clouds while reducing memory consumption.
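The per-pixel dynamic range update described above can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendition, not the authors' code: it takes the depth probability volume of the current stage, computes the expected depth and its standard deviation, and lets a convolutional GRU (whose hidden state persists across stages) predict a per-pixel scaling of the search interval handed to the next stage. All layer sizes, the `lam` margin, and the DTU-like depth bounds in the usage snippet are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ConvGRUCell(nn.Module):
    """Convolutional GRU cell: carries a spatial hidden state across
    coarse-to-fine stages (hypothetical layer sizes)."""

    def __init__(self, in_ch, hid_ch):
        super().__init__()
        self.conv_zr = nn.Conv2d(in_ch + hid_ch, 2 * hid_ch, 3, padding=1)
        self.conv_h = nn.Conv2d(in_ch + hid_ch, hid_ch, 3, padding=1)

    def forward(self, x, h):
        # Update gate z and reset gate r from the stacked input/state.
        z, r = torch.sigmoid(self.conv_zr(torch.cat([x, h], 1))).chunk(2, dim=1)
        h_new = torch.tanh(self.conv_h(torch.cat([x, r * h], 1)))
        return (1 - z) * h + z * h_new


def next_stage_depth_range(prob_volume, depth_hyps, gru, head, h, lam=3.0):
    """Estimate a per-pixel depth range for the next stage.

    prob_volume: (B, D, H, W) softmax over D depth hypotheses
    depth_hyps:  (B, D, H, W) hypothesised depth values
    """
    # Expected depth and its variance under the probability volume.
    mean = (prob_volume * depth_hyps).sum(1, keepdim=True)            # (B,1,H,W)
    var = (prob_volume * (depth_hyps - mean) ** 2).sum(1, keepdim=True)
    std = var.clamp_min(1e-8).sqrt()

    # The GRU state summarises evidence from earlier stages; the head
    # maps it to a positive per-pixel correction of the +/- lam*std interval.
    h = gru(torch.cat([mean, std], 1), h)
    scale = torch.exp(head(h))                                        # (B,1,H,W)
    half = lam * scale * std
    return mean - half, mean + half, h


# Usage on dummy tensors (batch 1, 32 hypotheses, 64x80 pixels).
B, D, H, W = 1, 32, 64, 80
gru = ConvGRUCell(in_ch=2, hid_ch=8)
head = nn.Conv2d(8, 1, 3, padding=1)
h = torch.zeros(B, 8, H, W)
prob = torch.softmax(torch.randn(B, D, H, W), dim=1)
hyps = torch.linspace(425.0, 935.0, D).view(1, D, 1, 1).expand(B, D, H, W)
d_min, d_max, h = next_stage_depth_range(prob, hyps, gru, head, h)
```

Because the interval shrinks where the probability volume is peaked and stays wide where it is ambiguous, later stages spend their fixed hypothesis budget only where it is needed, which is consistent with the memory savings the abstract reports.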

History
  • Online: April 24, 2024