[1] XU Kui, HAI Yang, LI Xiao-hui, et al. Depth Estimation Method Based on Image Reconstruction[J]. Computer Technology and Development, 2024, 34(05): 73-79. [doi:10.20165/j.cnki.ISSN1673-629X.2024.0043]

Depth Estimation Method Based on Image Reconstruction

《计算机技术与发展》 (Computer Technology and Development) [ISSN:1006-6977/CN:61-1281/TN]

Volume:
34
Issue:
2024, No. 05
Pages:
73-79
Column:
Media Computing
Publication Date:
2024-05-10

Article Info

Title:
Depth Estimation Method Based on Image Reconstruction
Article Number:
1673-629X(2024)05-0073-07
Author(s):
XU Kui¹, HAI Yang¹, LI Xiao-hui², TAO Jun³
1. Communication Office of Baoji Public Security Bureau, Baoji 721014, China; 2. Baoji Chuangtian Qinghang Technology Development Co., Ltd., Baoji 721000, China; 3. School of Cyber Science and Engineering, Southeast University, Nanjing 211189, China
Keywords:
3D object detection; depth estimation; image reconstruction; self-supervised learning; deep neural networks
CLC Number:
TP399
DOI:
10.20165/j.cnki.ISSN1673-629X.2024.0043
Abstract:
Achieving depth estimation with reliable accuracy is key to 3D object detection, and this paper proposes an image depth estimation method. Based on deep learning, depth is estimated by training a deep neural network to reconstruct one image of a stereo pair from the other; during training, disparity-error minimization is replaced with depth-error minimization, and the geometric constraints of the stereo pair are exploited through a left-right view consistency loss to obtain more accurate depth estimates. To address the difficulty of obtaining ground-truth depth and the high cost of dataset construction, a self-supervised depth estimation framework based on image reconstruction is built; it requires no ground-truth depth data and thus reduces dataset costs. To address the problem that depth estimation error grows sharply with increasing depth, minimizing depth error rather than disparity error prevents the network from over-penalizing small depth errors at close range while neglecting errors at long range. In addition, the geometric constraints of stereo pairs are fully exploited by introducing a left-right view consistency loss during training to improve accuracy. Experiments verify that the proposed method outperforms existing methods and performs better when estimating depth for distant regions and small objects.
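As a quick illustration of why depth-error minimization matters here: under the standard pinhole-stereo relation Z = f·B/d, a fixed disparity error costs far more depth error at long range, roughly growing with Z². The sketch below uses illustrative camera parameters (the focal length `F` and baseline `B` are assumptions for demonstration, not values from the paper):

```python
# Why disparity-space losses under-penalize distant errors: Z = f * B / d,
# so a fixed disparity error eps inflates depth error roughly with Z^2.
# F (focal length, px) and B (baseline, m) are illustrative values only.
F, B = 720.0, 0.54

def depth_error(depth, eps=0.5, f=F, b=B):
    """Depth error (m) caused by a fixed disparity error eps at a given depth."""
    d = f * b / depth            # true disparity (px) at this depth
    return abs(f * b / (d - eps) - depth)

near = depth_error(5.0)    # object 5 m away
far = depth_error(50.0)    # object 50 m away
print(near, far)           # the same half-pixel error is ~100x costlier at 50 m
```

A disparity-space loss treats both cases identically, which matches the abstract's point that such a loss over-emphasizes tiny near-range errors while neglecting large far-range ones.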

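The left-right view consistency loss mentioned in the abstract can be sketched under simplifying assumptions (1-D disparity rows and clamped nearest-neighbor sampling; this is an illustration of the geometric constraint, not the paper's actual network or loss implementation):

```python
# Minimal 1-D sketch of a left-right disparity consistency term:
# project each left-view disparity into the right view and compare,
# i.e. mean |d_L(x) - d_R(x - d_L(x))| with clamped integer sampling.
def lr_consistency_loss(disp_left, disp_right):
    w = len(disp_left)
    total = 0.0
    for x in range(w):
        # where this left pixel lands in the right image, clamped to bounds
        xr = min(max(int(round(x - disp_left[x])), 0), w - 1)
        total += abs(disp_left[x] - disp_right[xr])
    return total / w

# Geometrically consistent maps (constant disparity) incur zero loss.
print(lr_consistency_loss([2.0] * 8, [2.0] * 8))  # 0.0
```

Adding such a term to the training objective penalizes disparity maps that cannot have come from a single consistent 3-D scene, which is how the stereo pair's geometry is exploited without ground-truth depth.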

Last Update: 2024-05-10