Temporal And Color Consistent Disparity Estimation In Stereo Videos
In this paper, a novel method for disparity estimation in stereo videos is proposed. Temporal consistency in both the disparity and color spaces of the video is exploited to enhance the disparity estimation algorithm. Outlier detection along the temporal dimension in the color space is used to find areas where the disparity can be temporally enhanced from previous and future frames. Moreover, a forward-backward approach is utilized for consistency checking and error correction in cases of fast motion in the video, which typically introduces disparity artifacts. The method runs in real time even in the demanding case of high-definition (HD) video. Experimental results show that the proposed approach outperforms state-of-the-art methods on a publicly available dataset.
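The forward-backward consistency check mentioned above can be illustrated with a minimal sketch: a pixel is considered temporally reliable only if following the forward motion to the next frame and then the backward motion returns (approximately) to the starting position. The function name, nearest-neighbor sampling, and the threshold `tau` below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def forward_backward_check(flow_fwd, flow_bwd, tau=1.0):
    """Flag pixels whose forward-then-backward motion round trip
    deviates by more than tau pixels (illustrative sketch).

    flow_fwd, flow_bwd: (H, W, 2) arrays of per-pixel (dx, dy) motion,
    frame t -> t+1 and frame t+1 -> t respectively.
    Returns a boolean mask, True where the motion is consistent.
    """
    h, w = flow_fwd.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Land in the next frame by following the forward motion
    # (nearest-neighbor sampling, clipped to the image bounds).
    x2 = np.clip(np.round(xs + flow_fwd[..., 0]).astype(int), 0, w - 1)
    y2 = np.clip(np.round(ys + flow_fwd[..., 1]).astype(int), 0, h - 1)
    # Round-trip residual: forward motion plus the backward motion
    # sampled at the landing position; ideally they cancel out.
    rt_x = flow_fwd[..., 0] + flow_bwd[y2, x2, 0]
    rt_y = flow_fwd[..., 1] + flow_bwd[y2, x2, 1]
    return np.hypot(rt_x, rt_y) <= tau
```

Pixels failing this check (e.g. under fast motion or occlusion) would be excluded from temporal disparity enhancement and passed to the error-correction stage instead.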