US 11,747,896 B2
Methods and systems of extended reality environment interaction based on eye motions
R Balaji, Bengaluru (IN); Sai Durga Venkat Reddy Pulikunta, Andhra Pradesh (IN); Jeffry Copps Robert Jose, Tamil Nadu (IN); and Arun Kumar T V, Bengaluru (IN)
Assigned to Rovi Guides, Inc., San Jose, CA (US)
Filed by Rovi Guides, Inc., San Jose, CA (US)
Filed on Oct. 20, 2020, as Appl. No. 17/075,227.
Prior Publication US 2022/0121275 A1, Apr. 21, 2022
Int. Cl. G06T 19/00 (2011.01); G06F 3/01 (2006.01); G06T 19/20 (2011.01); G10L 15/22 (2006.01); G06V 20/20 (2022.01)
CPC G06F 3/013 (2013.01) [G06T 19/006 (2013.01); G06T 19/20 (2013.01); G06V 20/20 (2022.01); G10L 15/22 (2013.01); G10L 2015/223 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method for extended reality environment interaction, comprising:
generating for display an extended reality environment comprising an object;
detecting, by using a first sensor, that a gaze has shifted from a first portion of the extended reality environment to a second portion of the extended reality environment, the object being excluded from the first portion of the extended reality environment and included in the second portion of the extended reality environment;
in response to detecting the gaze shift, generating for display within the extended reality environment an indicator of the shift in the gaze;
detecting, by using a second sensor, a voice command while the indicator is in a vicinity of the object, wherein the voice command comprises an indication that the gaze is directed at a different portion of the extended reality environment than the second portion of the extended reality environment associated with the displayed indicator of the shift in the gaze; and
in response to detecting the voice command, executing an action corresponding to the voice command.
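The control flow recited in claim 1 — detect a gaze shift between regions, display an indicator of the shift, then accept a voice command while the indicator is near the object and execute the mapped action — can be illustrated with a minimal sketch. The `Region` class, the `handle_interaction` function, and the stubbed sensor readings below are all hypothetical illustrations, not part of the patent's disclosed implementation:

```python
from dataclasses import dataclass


@dataclass
class Region:
    """A portion of the extended reality environment.

    `contains_object` marks whether the displayed object lies in this
    portion (the claim's "second portion").
    """
    name: str
    contains_object: bool


def handle_interaction(gaze_readings, voice_command, actions):
    """Sketch of the claimed flow.

    gaze_readings: two successive Region readings from the first
        sensor (previous and current gaze targets).
    voice_command: a string detected by the second sensor.
    actions: a mapping from voice commands to callables.

    Returns a log of the steps taken, for inspection.
    """
    log = []
    prev, curr = gaze_readings[0], gaze_readings[1]
    if prev.name != curr.name:
        # Gaze shifted from the first portion to the second portion:
        # generate for display an indicator of the shift.
        log.append(f"indicator displayed at {curr.name}")
        if curr.contains_object and voice_command in actions:
            # Voice command detected while the indicator is in the
            # vicinity of the object: execute the mapped action.
            actions[voice_command]()
            log.append(f"executed action for '{voice_command}'")
    return log
```

A usage example under these assumptions: a gaze shift from a region excluded from the object to the region containing it, followed by a voice command mapped to an action:

```python
executed = []
steps = handle_interaction(
    [Region("first portion", False), Region("second portion", True)],
    "select",
    {"select": lambda: executed.append("selected")},
)
# `steps` now records the indicator display and the executed action.
```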