CPC G06F 3/013 (2013.01) [G02B 27/0093 (2013.01); G02B 27/0101 (2013.01); G02B 27/0172 (2013.01); G06F 18/23 (2023.01); G06N 3/08 (2013.01); G02B 2027/0138 (2013.01); G02B 2027/0178 (2013.01)] | 21 Claims |
1. A method comprising:
receiving image data representing at least one image of an eye of a user looking at a display at an instant of time, the display including a plurality of regions and being configured to operate in an augmented reality (AR) application, each of the plurality of regions including a plurality of pixels and corresponding to a respective element of a user interface;
identifying, based on the image data, a region of the plurality of regions of the display at which a gaze of the user is directed at the instant of time, the identifying including inputting the at least one image of the eye of the user into a classification engine configured to classify the gaze as being directed to one of the plurality of regions; and
activating an element of the user interface to which the identified region corresponds.
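The claimed method can be read as a three-step pipeline: classify an eye image to one of several display regions, then activate the user-interface element bound to that region. The following Python sketch illustrates that flow under stated assumptions — the names (`Region`, `ClassificationEngine`, `handle_gaze`) and the nearest-centroid classifier are hypothetical stand-ins, not part of the patent; a real classification engine would be a trained model (e.g., a neural network) operating on the image data itself.

```python
# Hypothetical sketch of the claimed gaze-to-region pipeline.
# All names and the nearest-centroid classifier are illustrative
# stand-ins, not drawn from the patent.
from dataclasses import dataclass
from typing import Callable, List, Sequence


@dataclass
class Region:
    """One display region, corresponding to one UI element."""
    name: str
    on_activate: Callable[[], str]  # action taken when gaze selects this region


class ClassificationEngine:
    """Stand-in for the claimed classification engine.

    Maps eye-image features to the index of the region the gaze is
    directed at. Here a nearest-centroid rule substitutes for a
    trained classifier.
    """

    def __init__(self, centroids: Sequence[Sequence[float]]):
        self.centroids = centroids

    def classify(self, eye_features: Sequence[float]) -> int:
        def sq_dist(c: Sequence[float]) -> float:
            return sum((a - b) ** 2 for a, b in zip(eye_features, c))

        return min(range(len(self.centroids)),
                   key=lambda i: sq_dist(self.centroids[i]))


def handle_gaze(eye_features: Sequence[float],
                engine: ClassificationEngine,
                regions: List[Region]) -> str:
    """Classify the gaze to a region, then activate that region's UI element."""
    idx = engine.classify(eye_features)
    return regions[idx].on_activate()


# Two regions of the display, each tied to a UI element.
regions = [
    Region("menu", lambda: "menu opened"),
    Region("canvas", lambda: "canvas focused"),
]
engine = ClassificationEngine(centroids=[[0.0, 0.0], [1.0, 1.0]])

print(handle_gaze([0.9, 1.1], engine, regions))   # gaze near the canvas centroid
print(handle_gaze([0.1, -0.1], engine, regions))  # gaze near the menu centroid
```

The per-region mapping mirrors the claim's structure: each region of the display corresponds to one UI element, so classification output alone determines which element is activated.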