US 12,069,394 B2
Depth map determination method and electronic device to which same method is applied
Changgon Kim, Gyeonggi-do (KR); and Jeongwon Lee, Gyeonggi-do (KR)
Assigned to Samsung Electronics Co., Ltd., Suwon-si (KR)
Filed by Samsung Electronics Co., Ltd., Gyeonggi-do (KR)
Filed on Nov. 24, 2021, as Appl. No. 17/534,545.
Application 17/534,545 is a continuation of application No. PCT/KR2020/005513, filed on Apr. 27, 2020.
Claims priority of application No. 10-2019-0066424 (KR), filed on Jun. 5, 2019.
Prior Publication US 2022/0086309 A1, Mar. 17, 2022
Int. Cl. H04N 5/222 (2006.01); H04N 5/232 (2006.01); H04N 5/341 (2011.01); H04N 23/63 (2023.01); H04N 25/40 (2023.01)
CPC H04N 5/2226 (2013.01) [H04N 23/633 (2023.01); H04N 25/40 (2023.01)] 20 Claims
OG exemplary drawing
 
1. An electronic device comprising:
at least one lens;
an image sensor including first group pixels receiving a part of light that has passed through the at least one lens through a first optical path group and second group pixels receiving another part of the light that has passed through the at least one lens through a second optical path group;
a memory configured to store first point spread function (PSF) feature information including first asymmetric blur features and first image height-specific blur features of the first group pixels and second PSF feature information including second asymmetric blur features and second image height-specific blur features of the second group pixels with respect to a plurality of focus positions of the at least one lens; and
a processor operatively connected to the image sensor and the memory, wherein the memory stores instructions executable by the processor to cause the electronic device to:
obtain a first image using the first group pixels and a second image using the second group pixels;
generate a first correction candidate image based on the first image and the second PSF feature information;
generate a second correction candidate image based on the second image and the first PSF feature information;
calculate matching costs, on a pixel basis, between a first region of interest (ROI) of the first correction candidate image and a second ROI of the second correction candidate image; and
determine a depth map corresponding to the matching costs between the first ROI and the second ROI.
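The pipeline recited in claim 1 can be illustrated with a minimal sketch. The sketch below is not the patented implementation: the PSF feature information is assumed to be a list of 2-D blur kernels (one per candidate focus position), "correction" is modeled as convolving one group's image with the other group's kernel, and the matching cost is an assumed sum-of-absolute-differences over fixed-size ROIs. All function and parameter names are hypothetical.

```python
# Hypothetical sketch of the claimed depth-map pipeline (assumptions noted above).
import numpy as np
from scipy.signal import convolve2d


def correct(image, psf_kernel):
    """Generate a correction candidate by blurring with the other group's PSF kernel."""
    return convolve2d(image, psf_kernel, mode="same", boundary="symm")


def matching_cost(roi_a, roi_b):
    """Pixel-basis matching cost between two ROIs (SAD, an assumed metric)."""
    return np.abs(roi_a - roi_b).sum()


def depth_map(first_image, second_image, first_psfs, second_psfs, roi_size=8):
    """Pick, per ROI, the focus-position index whose PSF pair gives the lowest cost.

    first_psfs / second_psfs: lists of 2-D kernels, one per candidate focus position.
    Returns an array of winning focus-position indices (a coarse depth map).
    """
    h, w = first_image.shape
    rows, cols = h // roi_size, w // roi_size
    depth = np.zeros((rows, cols), dtype=np.int32)
    costs = np.full((rows, cols), np.inf)

    for d, (psf1, psf2) in enumerate(zip(first_psfs, second_psfs)):
        # Cross-apply the PSF feature information as in the claim:
        # first image with the second PSF, second image with the first PSF.
        cand1 = correct(first_image, psf2)
        cand2 = correct(second_image, psf1)
        for r in range(rows):
            for c in range(cols):
                sl = (slice(r * roi_size, (r + 1) * roi_size),
                      slice(c * roi_size, (c + 1) * roi_size))
                cost = matching_cost(cand1[sl], cand2[sl])
                if cost < costs[r, c]:
                    costs[r, c] = cost
                    depth[r, c] = d
    return depth
```

In this reading, the focus position whose cross-applied PSFs make the two correction candidates most similar within an ROI is taken as that ROI's depth index; the claim itself does not prescribe the cost metric or ROI scheme.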