US 11,868,523 B2
Eye gaze classification
Ivana Tosic Rodgers, Redwood City, CA (US); Sean Ryan Francesco Fanello, San Francisco, CA (US); Sofien Bouaziz, Los Gatos, CA (US); Rohit Kumar Pandey, Mountain View, CA (US); Eric Aboussouan, Campbell, CA (US); and Adarsh Prakash Murthy Kowdle, San Francisco, CA (US)
Assigned to GOOGLE LLC, Mountain View, CA (US)
Filed by GOOGLE LLC, Mountain View, CA (US)
Filed on Jul. 1, 2021, as Appl. No. 17/305,219.
Prior Publication US 2023/0004216 A1, Jan. 5, 2023
Int. Cl. G06F 3/01 (2006.01); G02B 27/00 (2006.01); G02B 27/01 (2006.01); G06F 18/23 (2023.01); G06V 20/20 (2022.01); G06V 40/18 (2022.01); G06N 3/08 (2023.01)
CPC G06F 3/013 (2013.01) [G02B 27/0093 (2013.01); G02B 27/0101 (2013.01); G02B 27/0172 (2013.01); G06F 18/23 (2023.01); G06N 3/08 (2013.01); G02B 2027/0138 (2013.01); G02B 2027/0178 (2013.01)] 21 Claims
OG exemplary drawing
 
1. A method comprising:
receiving image data representing at least one image of an eye of a user looking at a display at an instant of time, the display including a plurality of regions and being configured to operate in an augmented reality (AR) application, each of the plurality of regions including a plurality of pixels and corresponding to a respective element of a user interface;
identifying, based on the image data, a region of the plurality of regions of the display at which a gaze of the user is directed at the instant of time, the identifying including inputting the at least one image of the eye of the user into a classification engine configured to classify the gaze as being directed to one of the plurality of regions; and
activating an element of the user interface to which the identified region corresponds.
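The claim describes a three-step pipeline: receive an eye image, classify the gaze into one of a fixed set of display regions, then activate the user-interface element tied to that region. The sketch below illustrates one possible shape of that pipeline in Python. All names (`Region`, `GazeClassifier`, `handle_gaze`, `activate`) are hypothetical, not taken from the patent, and the classifier is a trivial stand-in for the trained classification engine the claim recites.

```python
from dataclasses import dataclass
from typing import Callable, Sequence


@dataclass
class Region:
    """A display region corresponding to one user-interface element."""
    name: str  # hypothetical label for the UI element, e.g. "menu_button"
    x: int     # top-left corner, in pixels
    y: int
    w: int     # width and height of the region, in pixels
    h: int


class GazeClassifier:
    """Stand-in for the patent's classification engine.

    A real engine would be a trained model mapping an eye image to one
    of N region classes; here it wraps an arbitrary callable returning
    a class index, so the surrounding pipeline can be exercised.
    """

    def __init__(self, predict: Callable[[bytes], int]):
        self._predict = predict

    def classify(self, eye_image: bytes) -> int:
        return self._predict(eye_image)


def handle_gaze(eye_image: bytes,
                regions: Sequence[Region],
                classifier: GazeClassifier,
                activate: Callable[[Region], None]) -> Region:
    """Receive image data, identify the gazed-at region, activate its element."""
    idx = classifier.classify(eye_image)  # classify gaze into one of the regions
    region = regions[idx]                 # region maps to a UI element
    activate(region)                      # activate that element
    return region


if __name__ == "__main__":
    regions = [Region("back_button", 0, 0, 100, 50),
               Region("menu_button", 100, 0, 100, 50)]
    activated = []
    clf = GazeClassifier(lambda img: 1)   # dummy: always predicts region 1
    r = handle_gaze(b"fake-eye-image", regions, clf,
                    lambda reg: activated.append(reg.name))
    print(r.name, activated)              # menu_button ['menu_button']
```

The interface deliberately separates classification from activation, mirroring how the claim keeps the identifying step (classification engine) distinct from the activating step (user interface).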