US 12,436,608 B2
Device and method with gaze estimating
Weiming Li, Beijing (CN); Qiang Wang, Beijing (CN); Hyun Sung Chang, Suwon-si (KR); Jiyeon Kim, Suwon-si (KR); Sunghoon Hong, Suwon-si (KR); and Lin Ma, Beijing (CN)
Assigned to Samsung Electronics Co., Ltd., Suwon-si (KR)
Filed by Samsung Electronics Co., Ltd., Suwon-si (KR)
Filed on Nov. 30, 2022, as Appl. No. 18/072,237.
Claims priority of application No. 202111463213.4 (CN), filed on Dec. 2, 2021; and application No. 10-2022-0135197 (KR), filed on Oct. 19, 2022.
Prior Publication US 2023/0176649 A1, Jun. 8, 2023
Int. Cl. G06T 7/70 (2017.01); G06F 3/01 (2006.01); G06V 10/44 (2022.01); G06V 10/62 (2022.01); G06V 10/74 (2022.01); G06V 10/771 (2022.01); G06V 40/16 (2022.01)
CPC G06F 3/013 (2013.01) [G06T 7/70 (2017.01); G06V 10/454 (2022.01); G06V 10/62 (2022.01); G06V 10/761 (2022.01); G06V 10/771 (2022.01); G06V 40/171 (2022.01); G06T 2207/30201 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method performed by an electronic device, the method comprising:
obtaining target information of an image, the image comprising an eye;
obtaining a target feature map representing information on the eye in the image by extracting features from a first feature map of at least two frame images and the target information, based on an offset between pixels of a face in the image and a first front image, the first front image being obtained by offsetting the pixels of the face in the image and applying, to the image, a facial mask covering a region other than the face in the image; and
performing gaze estimation for the eye in the image based on the target feature map,
wherein the target information comprises attention information on the image, a distance between pixels in the image, or both,
wherein the attention information comprises temporal relationship information between the at least two frame images and frontal facial features of the face or a head, and
wherein the frontal facial features are determined based on obtaining a facial map and the facial mask of the image.
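
Read as a processing pipeline, claim 1 describes: (i) warping face pixels by a per-pixel offset toward a frontal view, (ii) masking out regions other than the face, (iii) extracting a feature map from at least two frame images, (iv) fusing the per-frame features with attention that captures their temporal relationship, and (v) performing gaze estimation from the fused target feature map. The following is a minimal illustrative sketch of one way such a pipeline could be organized in PyTorch; every module name, tensor shape, and the simple warping, attention, and regression choices below are assumptions made for illustration only, not the patented implementation.

# Hypothetical sketch of the claimed gaze-estimation pipeline (illustrative, not the patented code).
# Assumption: per-pixel offsets toward a frontal view and the facial mask are produced upstream
# (e.g., by a face-alignment step), consistent with the claim's facial map and facial mask.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GazeEstimator(nn.Module):
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        # Per-frame feature extractor (the "first feature map" in the claim's terms).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Temporal attention over frame features (the "attention information").
        self.attn = nn.MultiheadAttention(feat_dim, num_heads=4, batch_first=True)
        # Gaze regression head: yaw and pitch angles.
        self.head = nn.Linear(feat_dim, 2)

    def frontalize(self, frame, offsets):
        # Warp face pixels by the given per-pixel offsets to obtain a "front image".
        b, _, h, w = frame.shape
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h, device=frame.device),
            torch.linspace(-1, 1, w, device=frame.device),
            indexing="ij",
        )
        grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
        return F.grid_sample(frame, grid + offsets, align_corners=False)

    def forward(self, frames, offsets, face_mask):
        # frames:    (B, T, 3, H, W)  -- at least two frame images
        # offsets:   (B, T, H, W, 2)  -- pixel offsets toward a frontal view
        # face_mask: (B, T, 1, H, W)  -- 1 on the face, 0 elsewhere
        b, t = frames.shape[:2]
        feats = []
        for i in range(t):
            front = self.frontalize(frames[:, i], offsets[:, i])
            masked = front * face_mask[:, i]          # suppress non-face regions
            fmap = self.backbone(masked)              # per-frame feature map
            feats.append(fmap.mean(dim=(2, 3)))       # global average pooling
        seq = torch.stack(feats, dim=1)               # (B, T, feat_dim)
        fused, _ = self.attn(seq, seq, seq)           # temporal relationship fusion
        return self.head(fused[:, -1])                # gaze (yaw, pitch) for last frame


if __name__ == "__main__":
    model = GazeEstimator()
    frames = torch.randn(1, 2, 3, 64, 64)
    offsets = torch.zeros(1, 2, 64, 64, 2)
    mask = torch.ones(1, 2, 1, 64, 64)
    print(model(frames, offsets, mask).shape)  # torch.Size([1, 2])

In this sketch the offsets and facial mask are inputs rather than learned internally; how they are actually obtained, and how the target feature map is composed, are defined by the claims and specification rather than by this example.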