US 12,437,409 B2
Method for processing images, electronic device, and storage medium
Jung-Hao Yang, New Taipei (TW); Chin-Pin Kuo, New Taipei (TW); and Chih-Te Lu, New Taipei (TW)
Assigned to HON HAI PRECISION INDUSTRY CO., LTD., New Taipei (TW)
Filed by HON HAI PRECISION INDUSTRY CO., LTD., New Taipei (TW)
Filed on Aug. 26, 2022, as Appl. No. 17/896,842.
Claims priority of application No. 202210369448.5 (CN), filed on Apr. 8, 2022.
Prior Publication US 2023/0326029 A1, Oct. 12, 2023
Int. Cl. G06T 7/11 (2017.01); H04N 13/271 (2018.01)
CPC G06T 7/11 (2017.01) [H04N 13/271 (2018.05)] 20 Claims
OG exemplary drawing
 
1. A method for processing images, implemented in an electronic device, the method comprising:
obtaining images when a vehicle is moving, the images comprising at least one left image and at least one right image;
obtaining instance segmentation images by performing an instance segmentation process on the images, the instance segmentation images comprising instance-segmented left images corresponding to the left images and instance-segmented right images corresponding to the right images;
obtaining a predicted disparity map by reconstructing the left images using a pre-established autoencoder;
generating a first error value of the autoencoder for processing the images according to the at least one left image, the predicted disparity map, and the at least one right image, and generating a second error value of the autoencoder for processing the instance segmentation images according to the instance-segmented left images, the predicted disparity map, and the instance-segmented right images;
establishing an autoencoder model by adjusting the autoencoder according to the first error value and the second error value;
obtaining a test monocular image while the vehicle is moving, and obtaining a target disparity map by reconstructing the test monocular image using the autoencoder model; and
obtaining a depth image corresponding to the test monocular image by converting the target disparity map.
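The instance-segmentation step of claim 1 could be realized with an off-the-shelf detector. Below is a minimal sketch, assuming PyTorch and torchvision's pretrained Mask R-CNN; the function name segment_instances and the 0.5 thresholds are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of the instance-segmentation step of claim 1.
# Assumes torchvision's pretrained Mask R-CNN, not the patent's own model.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()

def segment_instances(image: torch.Tensor, score_thresh: float = 0.5) -> torch.Tensor:
    """Return an integer instance-ID map for a (3, H, W) image in [0, 1]."""
    with torch.no_grad():
        pred = model([image])[0]            # dict of boxes, labels, scores, masks
    masks = pred["masks"][pred["scores"] > score_thresh]  # (N, 1, H, W) soft masks
    instance_map = torch.zeros(image.shape[1:], dtype=torch.int64)
    for i, m in enumerate(masks, start=1):  # paint each instance with its own ID
        instance_map[m[0] > 0.5] = i
    return instance_map
```

Running segment_instances on each left and right image yields the instance-segmented pairs that the claim feeds into the second error value.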
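The reconstruction step relies on an autoencoder that maps a single left image to a predicted disparity map. A minimal sketch, assuming a small convolutional encoder-decoder; the layer sizes and the sigmoid output (normalized disparity, scaled to pixel units at the point of use) are assumptions.

```python
import torch.nn as nn

# Illustrative autoencoder: RGB left image in, one-channel disparity map out.
class DisparityAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
            nn.Sigmoid(),  # normalized disparity in (0, 1)
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))
```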
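In self-supervised stereo training, the first and second error values of the generating step are typically photometric reconstruction errors: the right image (or the instance-segmented right image) is warped by the predicted disparity to synthesize the left view, and the synthesis is compared against the actual left view. A sketch assuming pixel-unit disparity, border padding, and an L1 penalty; the helper names are assumptions.

```python
import torch
import torch.nn.functional as F

def warp_right_to_left(right: torch.Tensor, disp: torch.Tensor) -> torch.Tensor:
    """Synthesize the left view by sampling the right image at x - d.
    right: (B, C, H, W); disp: (B, 1, H, W) in pixels."""
    b, _, h, w = right.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=right.dtype, device=right.device),
        torch.arange(w, dtype=right.dtype, device=right.device),
        indexing="ij",
    )
    xs = xs.expand(b, -1, -1) - disp.squeeze(1)        # shift columns by disparity
    grid_x = 2.0 * xs / (w - 1) - 1.0                  # normalize to [-1, 1]
    grid_y = (2.0 * ys / (h - 1) - 1.0).expand(b, -1, -1)
    grid = torch.stack((grid_x, grid_y), dim=-1)       # (B, H, W, 2)
    return F.grid_sample(right, grid, align_corners=True, padding_mode="border")

def photometric_error(left, right, disp):
    """Mean L1 gap between the left view and its reconstruction from the right."""
    return (left - warp_right_to_left(right, disp)).abs().mean()
```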
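The adjusting step can then be read as ordinary gradient training on a weighted sum of the two error values. A sketch of one optimization step, reusing the helpers above; the weights w1 and w2 are assumptions, and the instance-segmentation images are assumed to be float tensors (e.g., instance IDs cast to float).

```python
def train_step(autoencoder, optimizer, left, right, seg_left, seg_right,
               w1=1.0, w2=1.0):
    # Scale the normalized disparity to pixel units before warping.
    disp = autoencoder(left) * left.shape[-1]
    loss = (w1 * photometric_error(left, right, disp)             # first error value
            + w2 * photometric_error(seg_left, seg_right, disp))  # second error value
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```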
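The final converting step follows the standard stereo relation depth = focal length × baseline / disparity. A sketch, assuming a calibrated focal length in pixels and a baseline in metres; the epsilon guard against zero disparity is an assumption.

```python
import torch

def disparity_to_depth(disp: torch.Tensor, focal_px: float, baseline_m: float,
                       eps: float = 1e-6) -> torch.Tensor:
    """Convert a pixel-unit disparity map to a metric depth map."""
    return focal_px * baseline_m / disp.clamp(min=eps)
```

For example, with focal_px = 700 and baseline_m = 0.54 (a KITTI-like rig), a disparity of 35 px corresponds to a depth of about 10.8 m.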