Learning stereo disparity with feature consistency and confidence
by Liaoying Zhao; Jiaming Li; Jianjun Li; Yong Wu; Shichao Cheng; Zheng Tang; Guobao Hui; Chin-Chen Chang
International Journal of Ad Hoc and Ubiquitous Computing (IJAHUC), Vol. 39, No. 1/2, 2022

Abstract: Most existing stereo matching methods follow a four-stage pipeline: feature extraction (FE), cost calculation (CC), cost aggregation (CA), and disparity refinement (DR). By modifying one or more of these stages, they achieve high-precision results in most regions but still struggle in ill-posed regions. This paper focuses on feature consistency and confidence (FCC), identifies new attributes of the extracted features, and proposes a novel neural network structure for stereo matching that measures the consistency and confidence of features. Based on this method, the paper fuses the cost volume and computes a pixel confidence map for cost calculation and cost aggregation. Experimental results show that the proposed method outperforms most state-of-the-art methods on both the SceneFlow and KITTI benchmarks, lowering the stereo matching estimation error to 1.82% and ranking 7th on the KITTI 2015 leaderboard six months before publication (http://www.cvlibs.net/datasets/kitti/).
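As a rough illustration of the pipeline the abstract describes, the following is a minimal PyTorch sketch of the stages with a confidence-weighted aggregation step. The channel sizes, the confidence head, and the multiplicative weighting scheme are illustrative assumptions; they are not the paper's actual FCC architecture, which is only available in the full text.

# Minimal sketch of a confidence-weighted stereo pipeline (assumptions, not
# the paper's FCC network): shared features -> concatenation cost volume ->
# confidence-weighted aggregation -> soft-argmin disparity regression.
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_cost_volume(feat_l, feat_r, max_disp):
    """GC-Net-style concatenation volume of shape (B, 2C, D, H, W)."""
    b, c, h, w = feat_l.shape
    volume = feat_l.new_zeros(b, 2 * c, max_disp, h, w)
    for d in range(max_disp):
        if d == 0:
            volume[:, :c, d] = feat_l
            volume[:, c:, d] = feat_r
        else:
            # Shift the right features d pixels before pairing with the left.
            volume[:, :c, d, :, d:] = feat_l[:, :, :, d:]
            volume[:, c:, d, :, d:] = feat_r[:, :, :, :-d]
    return volume


class ConfidenceHead(nn.Module):
    """Predicts a per-pixel confidence map in [0, 1] from left features."""

    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid())

    def forward(self, feat):
        return self.net(feat)  # (B, 1, H, W)


def soft_argmin(cost, max_disp):
    """Differentiable disparity regression over the D axis of (B, D, H, W)."""
    prob = F.softmax(-cost, dim=1)
    disp = torch.arange(max_disp, device=cost.device, dtype=cost.dtype)
    return (prob * disp.view(1, -1, 1, 1)).sum(dim=1)  # (B, H, W)


if __name__ == "__main__":
    B, C, H, W, D = 1, 8, 16, 32, 24
    feat_l, feat_r = torch.randn(B, C, H, W), torch.randn(B, C, H, W)
    volume = build_cost_volume(feat_l, feat_r, D)  # cost calculation
    cost = volume.mean(dim=1)                      # stand-in for 3D-conv aggregation
    conf = ConfidenceHead(C)(feat_l)               # (B, 1, H, W)
    # Scale costs by confidence: low-confidence pixels get a flatter
    # matching distribution under the softmax in soft_argmin.
    disparity = soft_argmin(cost * conf, D)
    print(disparity.shape)  # torch.Size([1, 16, 32])

The soft-argmin regression follows common practice in end-to-end stereo networks (e.g., GC-Net); how the paper itself fuses the cost volume and applies the confidence map may differ from this sketch.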

Online publication date: Fri, 18-Feb-2022
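For reference, the 1.82% figure quoted in the abstract is KITTI 2015's D1-all outlier rate. A minimal NumPy sketch of that metric, using KITTI's published thresholds (a pixel counts as an outlier when its disparity error exceeds both 3 px and 5% of the ground-truth disparity):

import numpy as np


def d1_error(disp_est, disp_gt, valid_mask):
    """Percentage of valid pixels whose disparity error exceeds
    both 3 px and 5% of the ground-truth disparity."""
    err = np.abs(disp_est - disp_gt)
    outlier = (err > 3.0) & (err > 0.05 * np.abs(disp_gt))
    return 100.0 * outlier[valid_mask].mean()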
